Image super-resolution reconstruction method based on a feature-fusion generative adversarial network

A super-resolution reconstruction and feature-fusion technology, applied in the field of image reconstruction with generative adversarial networks. It addresses the problems of insufficient edge and detail information in reconstructed images and poor final image quality, achieving clearer image edges and details, a better reconstruction effect, and reduced computational complexity.

Active Publication Date: 2019-03-22
DALIAN MARITIME UNIVERSITY
Cites: 2 · Cited by: 48

AI Technical Summary

Problems solved by technology

The main problems are that the edge details of the reconstructed image are insufficient and the quality of the final image is poor.


Examples


Specific embodiments

[0036] As shown in Figures 1 and 2, an image super-resolution reconstruction method based on a feature-fusion generative adversarial network comprises the following steps:

[0037] A. Preprocess the ImageNet data set to obtain a reconstruction data set of corresponding high- and low-resolution images;

[0038] B. Construct a generative adversarial network model for training, introduce an interpolation reconstruction module into the model, and build a multi-feature-fusion generation network and perception network;

[0039] C. Input the reconstruction data set obtained in step A into the generative adversarial network in turn for model training;

[0040] D. Normalize the image to be processed to obtain a low-resolution image, input it to the trained generation network, and obtain a reconstructed high-resolution image.
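Step D above can be sketched as follows. This is a minimal illustration of the normalization stage only: the [-1, 1] range and the helper names are assumptions, since the patent text does not specify them, and the generator network itself is omitted.

```python
import numpy as np

def normalize_image(img: np.ndarray) -> np.ndarray:
    """Scale an 8-bit image to [-1, 1] before feeding it to the
    generator (the exact range is an assumption; the patent only
    states that the input image is normalized)."""
    return img.astype(np.float32) / 127.5 - 1.0

def denormalize_image(x: np.ndarray) -> np.ndarray:
    """Map generator output in [-1, 1] back to 8-bit pixels."""
    return np.clip((x + 1.0) * 127.5, 0.0, 255.0).astype(np.uint8)

# Dummy low-resolution input standing in for a real image.
lr = np.random.randint(0, 256, size=(32, 32, 3), dtype=np.uint8)
x = normalize_image(lr)
# x would now be passed through the trained generation network.
```

A symmetric denormalization step recovers displayable pixels from the generator output.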

[0041] Further, the method for making the reconstructed data set described in step A is:

[0042] A1. Obtain the ImageNet data set, and r...
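Although paragraph [0042] is truncated, step A presumably continues by degrading high-resolution crops into low-resolution counterparts. A minimal sketch of producing such a pair, using block averaging as a simple stand-in for the (assumed) bicubic downsampling:

```python
import numpy as np

def make_lr_hr_pair(hr: np.ndarray, scale: int = 4):
    """Build a (low-res, high-res) training pair from an HR crop.
    Block averaging is used here for simplicity; real pipelines
    typically use bicubic downsampling (an assumption, since the
    original text is cut off)."""
    h, w, c = hr.shape
    h, w = h - h % scale, w - w % scale   # crop to a multiple of scale
    hr = hr[:h, :w]
    lr = hr.reshape(h // scale, scale, w // scale, scale, c).mean(axis=(1, 3))
    return lr.astype(np.float32), hr

hr_crop = np.random.rand(96, 96, 3).astype(np.float32)
lr, hr = make_lr_hr_pair(hr_crop, scale=4)   # lr: (24, 24, 3), hr: (96, 96, 3)
```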



Abstract

The invention discloses an image super-resolution reconstruction method based on a feature-fusion generative adversarial network, comprising the following steps: pre-process the ImageNet data set to obtain a reconstruction data set of corresponding high- and low-resolution images; construct a generative adversarial network model for training, introducing an interpolation reconstruction module into the model and building a multi-feature-fusion generation network and perceptual network; input the reconstruction data sets into the generative adversarial network in turn for model training; normalize the images to be processed to obtain low-resolution images, and input them to the trained generation network to obtain the reconstructed high-resolution images. A recursive residual network is used to extract multiple features such as edges and texture, highlighting the edge and texture information of the generated image so that the reconstructed image is clearer. The interpolation reconstruction module acts as a mediator to realize residual discrimination, highlighting the difference between the reconstructed and real high-resolution images and reducing the computational complexity of the discrimination network. A VGG network is used to fuse multiple features when computing the loss function, which makes the reconstruction more effective and more consistent with the way the human eye observes images.
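The VGG-based multi-feature loss described above can be sketched as a mean-squared error summed over several feature maps. The arrays below stand in for VGG activations; extracting real activations requires a pretrained network, which is omitted here, and the layer choice and weighting are assumptions:

```python
import numpy as np

def perceptual_loss(fake_feats, real_feats):
    """Sum of mean-squared errors between corresponding feature maps
    of the reconstructed and real HR images (a sketch of the
    multi-feature perceptual loss; layer selection and weights are
    assumptions, not taken from the patent text)."""
    return float(sum(np.mean((f - r) ** 2)
                     for f, r in zip(fake_feats, real_feats)))

# Random arrays standing in for VGG feature maps from two layers.
rng = np.random.default_rng(0)
fake = [rng.normal(size=(64, 32, 32)), rng.normal(size=(128, 16, 16))]
real = [rng.normal(size=(64, 32, 32)), rng.normal(size=(128, 16, 16))]
loss = perceptual_loss(fake, real)   # positive when features differ
```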

Description

technical field

[0001] The present invention relates to the field of image reconstruction methods, in particular to an image reconstruction method based on a feature-fusion generative adversarial network.

Background technique

[0002] Super-resolution reconstruction (SR) is a technique for recovering a corresponding high-resolution image from a low-resolution image. The acquisition and processing of digital images inevitably weakens image resolution, affecting multimedia applications. Therefore, converting low-quality, limited-resolution images into high-quality, high-resolution images has long been a hot research issue in social production and in national defense and military fields.

[0003] At present, super-resolution reconstruction algorithms can be divided into three main categories: interpolation-based methods, reconstruction-based methods, and learning-based methods. The method based on interpolation reconst...
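As a concrete example of the interpolation-based family mentioned in [0003], nearest-neighbour upscaling simply repeats each pixel; bicubic interpolation follows the same idea with a smoother weighting kernel. A minimal sketch:

```python
import numpy as np

def nearest_upscale(img: np.ndarray, scale: int = 2) -> np.ndarray:
    """Nearest-neighbour interpolation: each pixel is repeated
    scale x scale times. The simplest interpolation-based SR method;
    shown for illustration only, not the patent's proposed approach."""
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

lr = np.array([[0, 1],
               [2, 3]])
hr = nearest_upscale(lr, 2)   # shape (4, 4)
```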

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/40; G06N3/04
CPC: G06T3/4076; G06N3/045
Inventor: 王琳, 杨思琦
Owner: DALIAN MARITIME UNIVERSITY