
Super-resolution reconstruction method based on total variation and convolutional neural network

A super-resolution reconstruction technology based on a convolutional neural network, applied in the fields of instruments, graphics and image conversion, and computation. It addresses problems of existing methods such as high computational complexity and poor reconstruction quality, and achieves a low computational load, simplified computation, and suppression of the sawtooth (jagged edge) effect.

Inactive Publication Date: 2016-12-07
PEKING UNIV

AI Technical Summary

Problems solved by technology

[0008] The existing reconstruction methods have the following defects: interpolation-based methods have a low computational cost but a poor reconstruction effect; reconstruction-based methods cannot reconstruct both edges and textures well at the same time; and learning-based methods are computationally complex and depend strongly on the choice of the training library.



Examples


Detailed Description of the Embodiments

[0044] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0045] As shown in Figure 1, a super-resolution reconstruction method based on total variation and a convolutional neural network includes the following steps:

[0046] In step S1, image decomposition is performed: an image f is decomposed into a structure part u and a texture part v, with f = u + v. The structure part is relatively smooth and contains the sharp edges, while the texture part contains the textures and fine details of the image. The decomposition uses a total-variation-based method. The total variation measures the overall degree of change of a signal; for a two-dimensional image, it is the sum of the magnitudes of the image gradients. The image decomposition problem is solved through the following minimization:

[0047] min u ∫ [ ...
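A commonly used total-variation decomposition of this form is the ROF-style functional min_u ∫|∇u| dx + (λ/2)∫(f − u)² dx, with the texture taken as the residual v = f − u. The sketch below assumes this ROF-style model; the solver (scikit-image's denoise_tv_chambolle) and the weight value are illustrative choices and are not named in the patent.

```python
# A minimal sketch of a total-variation structure/texture decomposition,
# assuming an ROF-style model (quadratic fidelity term) and a grayscale
# image f with values in [0, 1]. Solver and weight are illustrative only.
from skimage import img_as_float
from skimage.restoration import denoise_tv_chambolle


def tv_decompose(f, weight=0.1):
    """Split image f into a smooth structure part u and a texture part v = f - u."""
    f = img_as_float(f)
    u = denoise_tv_chambolle(f, weight=weight)  # edge-preserving, piecewise-smooth part
    v = f - u                                   # oscillatory texture and fine detail
    return u, v
```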



Abstract

The invention provides a super-resolution reconstruction method based on total variation and a convolutional neural network. The method comprises: an image decomposition step of decomposing the original low-resolution image into a structure part and a texture part using a total-variation-based method; a structure-part amplification step of enlarging the structure part by linear interpolation to obtain an initial enlarged image, sharpening its edges with a sharpening filter, and finally correcting the result; a texture-part reconstruction step of enlarging the texture part by linear interpolation, feeding the enlarged image into the convolutional neural network, and computing a reconstructed texture image; and an image combination step of combining the amplified and sharpened structure-part image with the reconstructed texture image to generate the final super-resolution image. In the super-resolution image reconstructed by the invention, the edges and fine structures of the image are both preserved, the computational complexity is reduced, and the requirement of real-time performance is satisfied.
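A minimal sketch of the overall pipeline described above, assuming bilinear interpolation for the "linear interpolation" steps and an unsharp-mask filter for the edge sharpening; texture_cnn is a hypothetical placeholder for the trained convolutional network, tv_decompose refers to the decomposition sketch given earlier, and the patent's result-correction step is not reproduced here.

```python
# A minimal sketch of the reconstruction pipeline from the abstract, assuming
# a grayscale image f in [0, 1]. Interpolation order, sharpening parameters,
# and the texture_cnn callable are illustrative assumptions, not taken from
# the patent.
import numpy as np
from skimage.filters import unsharp_mask
from skimage.transform import resize


def super_resolve(f, scale, texture_cnn, tv_weight=0.1):
    # Step 1: decompose the low-resolution image into structure and texture.
    u, v = tv_decompose(f, weight=tv_weight)
    hr_shape = (int(f.shape[0] * scale), int(f.shape[1] * scale))

    # Step 2: enlarge the structure part by bilinear interpolation and sharpen its edges.
    u_up = resize(u, hr_shape, order=1, anti_aliasing=False)
    u_sharp = unsharp_mask(u_up, radius=1.0, amount=1.0)

    # Step 3: enlarge the texture part and refine it with the convolutional network.
    v_up = resize(v, hr_shape, order=1, anti_aliasing=False)
    v_rec = texture_cnn(v_up)

    # Step 4: combine the sharpened structure and the reconstructed texture.
    return np.clip(u_sharp + v_rec, 0.0, 1.0)
```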

Description

Technical Field

[0001] The invention belongs to the technical field of video and image processing, and in particular relates to a super-resolution reconstruction method based on total variation and a convolutional neural network.

Background Art

[0002] With the popularization of digital products, images, as a main source of information for human beings, have been used more and more widely, and digital image processing technology has developed rapidly at the same time. The acquisition of video images is a key step in a digital image processing system. During digital acquisition, the image resolution and image quality are reduced by the following factors: the sampling frequency (undersampling causes the frequency spectrum of the image to alias and the image to degrade through deformation effects); meanwhile, atmospheric disturbance, defocusing, the size of the sensor, and the relative motion between the image acquisition device and the object being photographed will cause imag...


Application Information

IPC(8): G06T3/40
CPC: G06T3/4007; G06T3/4046; G06T3/4053
Inventor: 贾惠柱, 杨帆, 解晓东, 杨长水, 陈瑞
Owner: PEKING UNIV