
An image super-resolution reconstruction method based on a dense feature fusion network

A super-resolution reconstruction and feature fusion technology, applied to image enhancement, image data processing, and graphics and image conversion. It addresses problems such as loss of low-resolution image detail, increased computational load, and artifacts, and achieves restoration of image detail at high resolution, improved quality and accuracy, and reduced noise.

Pending Publication Date: 2019-06-21
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS

AI Technical Summary

Problems solved by technology

[0004] Although these deep-learning-based super-resolution methods have improved image reconstruction quality and algorithmic efficiency to a certain extent, deficiencies remain. First, to reach the target spatial resolution, existing methods preprocess the input low-resolution image by interpolation and upscaling before feeding it to the network. For very deep neural networks, this approach not only increases the computational load but also causes some details of the original low-resolution image to be lost.
Secondly, the reconstructed image is often too blurry or over-smoothed, details are insufficiently restored, and artifacts may even appear.
Finally, these methods improve performance only by increasing network depth, which yields little gain and makes the network difficult to train.

Method used




Embodiment Construction

[0027] The present invention will be further described below in conjunction with the embodiments and accompanying drawings, but the present invention is not limited thereto.

[0028] As shown in Figure 1, an embodiment of the present invention comprises the following steps:

[0029] 1) Data preprocessing: the original color image is normalized to the range [0, 1], and the original color image is interpolated and rescaled according to different magnification ratios to generate data sets for each ratio for subsequent model training.
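
The paragraph above corresponds to a short preprocessing routine. The sketch below is only illustrative: it assumes bicubic interpolation with Pillow and NumPy, and the scale factors, function name and file handling are placeholders rather than details taken from the patent.

# Hedged preprocessing sketch: normalize color images to [0, 1] and build
# low-resolution counterparts for several magnification ratios (bicubic assumed).
import numpy as np
from PIL import Image

def make_lr_hr_pair(path, scale):
    """Return (lr, hr) float arrays in [0, 1] for one magnification ratio."""
    hr = Image.open(path).convert("RGB")
    w, h = hr.size
    w, h = w - w % scale, h - h % scale                      # crop so size divides evenly
    hr = hr.crop((0, 0, w, h))
    lr = hr.resize((w // scale, h // scale), Image.BICUBIC)  # simulated LR input
    to_unit = lambda im: np.asarray(im, dtype=np.float32) / 255.0  # normalize to [0, 1]
    return to_unit(lr), to_unit(hr)

# Example: training pairs for x2, x3 and x4 magnification (paths are placeholders).
# datasets = {s: [make_lr_hr_pair(p, s) for p in image_paths] for s in (2, 3, 4)}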

[0030] 2) Establish the reconstruction model: as shown in Figure 2, the reconstruction model includes a coarse feature extraction network, a dense feature fusion network and an image reconstruction network. The coarse feature extraction network consists of a single convolutional layer that extracts coarse features F_0 directly from the original low-resolution image; the number of feature maps is 32.

[0031] F_0 = W_coarse × Im...
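
As a hedged illustration of paragraphs [0030]-[0031], the following PyTorch-style sketch implements a single convolutional layer producing the 32 coarse feature maps F_0 from the raw low-resolution input. The 3×3 kernel size, the padding and the class name are assumptions not stated in the excerpt; only the single-layer structure and the 32 feature maps come from the text.

import torch
import torch.nn as nn

class CoarseFeatureExtractor(nn.Module):
    """Single convolution producing the coarse features F_0 from the raw LR image."""
    def __init__(self, in_channels=3, num_features=32, kernel_size=3):
        super().__init__()
        # F_0 = W_coarse applied to the LR image; 32 output feature maps per the text.
        self.conv = nn.Conv2d(in_channels, num_features,
                              kernel_size, padding=kernel_size // 2)

    def forward(self, lr_image):
        return self.conv(lr_image)

# Example: a batch of 3-channel LR patches -> 32 coarse feature maps of the same size.
# f0 = CoarseFeatureExtractor()(torch.rand(1, 3, 48, 48))   # shape (1, 32, 48, 48)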



Abstract

The invention discloses an image super-resolution reconstruction method based on a dense feature fusion network. The method comprises the following steps: 1) preprocessing data; 2) establishing an image super-resolution reconstruction model; and 3) inputting the image to be processed into the model to obtain a high-resolution image. The image super-resolution reconstruction model comprises a coarse feature extraction network, a dense feature fusion network and an image reconstruction network. The coarse feature extraction network extracts coarse image features from a low-resolution color image; the dense feature fusion network extracts high-order image features from the coarse image features; and the image reconstruction network adds and fuses the coarse image features and the high-order image features to obtain dense image features, and then reconstructs the dense image features to obtain a color high-resolution image. The method can effectively reduce the noise introduced by traditional interpolation-based super-resolution algorithms, recover more high-frequency information so that image detail is restored at high resolution, and improve the accuracy of super-resolution reconstruction.
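
To make the data flow described in the abstract concrete, the following is a minimal PyTorch-style skeleton of the three sub-networks and the add-and-fuse step. The internals of the dense feature fusion network, the sub-pixel (PixelShuffle) upscaling layer and all hyperparameters are placeholders assumed for illustration, not the patent's actual design.

import torch
import torch.nn as nn

class DenseFusionSR(nn.Module):
    """Illustrative skeleton only: coarse features and high-order features are
    added ("fused") and the result is reconstructed into a color HR image."""
    def __init__(self, scale=2, num_features=32):
        super().__init__()
        # 1) coarse feature extraction: one convolution on the raw LR image
        self.coarse = nn.Conv2d(3, num_features, 3, padding=1)
        # 2) stand-in for the dense feature fusion network (details not in the excerpt)
        self.dense_fusion = nn.Sequential(*[
            nn.Sequential(nn.Conv2d(num_features, num_features, 3, padding=1),
                          nn.ReLU(inplace=True))
            for _ in range(4)])
        # 3) image reconstruction with assumed sub-pixel (PixelShuffle) upscaling
        self.reconstruct = nn.Sequential(
            nn.Conv2d(num_features, 3 * scale ** 2, 3, padding=1),
            nn.PixelShuffle(scale))

    def forward(self, lr):
        f0 = self.coarse(lr)              # coarse image features
        f_high = self.dense_fusion(f0)    # high-order image features
        dense = f0 + f_high               # add-and-fuse into dense image features
        return self.reconstruct(dense)    # color high-resolution output

# Example: sr = DenseFusionSR(scale=2)(torch.rand(1, 3, 48, 48))  # -> (1, 3, 96, 96)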

Description

Technical field

[0001] The invention is an image super-resolution reconstruction method based on a dense feature fusion network, and belongs to the field of image processing.

Background technique

[0002] The image super-resolution problem, especially the single image super-resolution (SISR, Single Image Super-Resolution) problem, is a classic problem in computer vision. It aims to obtain a visually pleasing high-resolution (HR, High Resolution) image from a single low-resolution (LR, Low Resolution) image produced by a low-cost imaging system under limited environmental conditions. Since super-resolution reconstruction does not require much additional cost, it is widely used in surveillance video, medical imaging, face recognition and other scenarios.

[0003] Methods for the SISR problem can be roughly divided into three categories: interpolation-based methods, reconstruction-based methods, and learning-based methods. Among these methods, the super-resolution reconstruction...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/40, G06T5/00, G06K9/62
Inventor: 徐旺陈仁文黄斌张祥周秦邦刘川
Owner: NANJING UNIV OF AERONAUTICS & ASTRONAUTICS