
Image super-resolution method based on multi-scale detail feature fusion neural network

A deep neural network and detail-feature technology, applied in the field of image super-resolution based on a multi-scale detail feature fusion neural network. It addresses the problems of high-frequency information loss and reconstructed high-resolution images failing to meet resolution requirements, and achieves the effect of increased network stability.

Active Publication Date: 2021-05-07
CHONGQING UNIV OF POSTS & TELECOMM
Cites: 5 · Cited by: 14

AI Technical Summary

Problems solved by technology

The image super-resolution task aims to reconstruct the local detail features of an image. However, deep neural network algorithms lose high-frequency information to varying degrees during feature extraction (most local detail features are high-frequency information), so the resolution of the final reconstructed high-resolution image does not meet the requirements.

Method used




Detailed Description of the Embodiments

[0031] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0032] An image super-resolution method based on a multi-scale detail feature fusion neural network, as shown in Figure 4. The method includes: obtaining an image to be processed, inputting the image to be processed into a trained improved deep neural network model, and obtaining a high-quality image.
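The inference step described above reduces to a single forward pass through the trained model. The sketch below assumes a PyTorch implementation; the class name MultiScaleDetailFusionNet, the checkpoint filename, and the 4x scale factor are hypothetical placeholders, not taken from the patent text.

```python
# Minimal inference sketch for the patented pipeline (all names are assumptions).
import torch
from torchvision.io import read_image
from torchvision.utils import save_image

def super_resolve(model, lr_path, sr_path, device="cpu"):
    """Feed one low-resolution image through a trained SR model and save the output."""
    model.eval().to(device)
    lr = read_image(lr_path).float().div(255.0).unsqueeze(0).to(device)  # 1xCxHxW in [0, 1]
    with torch.no_grad():
        sr = model(lr).clamp(0.0, 1.0)
    save_image(sr, sr_path)

# Usage (hypothetical model class and checkpoint):
# model = MultiScaleDetailFusionNet(scale=4)
# model.load_state_dict(torch.load("msdf_sr.pth", map_location="cpu"))
# super_resolve(model, "input_lr.png", "output_sr.png")
```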

[0033] The process of training the improved deep neural network model includes:

[0034] S1: Obtain the original image, preprocess the original image, a...
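The S1 paragraph is truncated here, so the exact preprocessing is not available in this text. A common way to build training data for super-resolution is to crop high-resolution patches and downsample them to obtain paired low-resolution inputs; the patch size, scale factor, and bicubic downsampling in the sketch below are assumptions, not the authors' exact procedure.

```python
# Sketch of one conventional LR/HR pair construction for step S1 (assumed, not from the patent).
import random
import torch
import torch.nn.functional as F

def make_lr_hr_pair(hr_image, patch_size=96, scale=4):
    """Crop a random HR patch and bicubically downsample it to form an (LR, HR) pair.

    hr_image: float tensor of shape (C, H, W) with values in [0, 1].
    """
    _, h, w = hr_image.shape
    top = random.randint(0, h - patch_size)
    left = random.randint(0, w - patch_size)
    hr_patch = hr_image[:, top:top + patch_size, left:left + patch_size]
    lr_patch = F.interpolate(
        hr_patch.unsqueeze(0),          # interpolate expects a batch dimension
        scale_factor=1.0 / scale,
        mode="bicubic",
        align_corners=False,
    ).clamp(0.0, 1.0).squeeze(0)
    return lr_patch, hr_patch
```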



Abstract

The invention belongs to the field of image super-resolution reconstruction, and particularly relates to an image super-resolution method based on a multi-scale detail feature fusion neural network. The method comprises the steps of obtaining a to-be-processed image, inputting the to-be-processed image into a trained improved deep neural network model, and obtaining a high-quality image. In the feature extraction process, high-frequency information from the original picture is increased, and the stability of the network is improved through the fusion mechanism. In addition, within the residual dense blocks, a multi-layer feature fusion mechanism adds semantic information, and channel feature screening is also added, so that the whole network expresses features better. Moreover, at each stage of the feature extraction process, i.e. after each residual dense block, a loss calculation is added so that back-propagation adjusts the feature extraction process, improving the expressive ability of the network and allowing high-resolution pictures to be learned and reconstructed better.
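To make the abstract's building blocks concrete, the sketch below shows a residual dense block with multi-layer feature fusion and channel feature screening (implemented here as squeeze-and-excitation style channel attention), plus a per-stage loss computed after each block. The layer counts, channel widths, L1 criterion, and the `to_image` projection head are assumptions for illustration, not the patent's exact configuration.

```python
# Minimal sketch of the described components (configuration values are assumed).
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel feature screening: re-weight channels so informative maps are emphasised."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(x)

class ResidualDenseBlock(nn.Module):
    """Dense connections fuse features from every preceding layer; a 1x1 convolution
    and channel attention screen the fused features before the residual connection."""
    def __init__(self, channels=64, growth=32, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels + i * growth, growth, 3, padding=1),
                nn.ReLU(inplace=True),
            ))
        self.fuse = nn.Conv2d(channels + num_layers * growth, channels, 1)
        self.attention = ChannelAttention(channels)

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        fused = self.attention(self.fuse(torch.cat(features, dim=1)))
        return x + fused  # local residual connection

def staged_loss(blocks, to_image, x, hr_target, criterion=nn.L1Loss()):
    """Per-stage supervision (assumed form): after each block, project features to an
    image with a hypothetical head `to_image` (e.g. upsampler + conv producing the
    same shape as hr_target) and accumulate a loss term for back-propagation."""
    total = 0.0
    for block in blocks:
        x = block(x)
        total = total + criterion(to_image(x), hr_target)
    return total
```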

Description

Technical Field

[0001] The invention belongs to the field of image super-resolution reconstruction, and in particular relates to an image super-resolution method based on a multi-scale detail feature fusion neural network.

Background Technique

[0002] With the rapid development of computer technology, information processing technology and visual communication technology, human beings have entered a new information age. The amount of knowledge that people can acquire is growing explosively, so information processing technology must be continuously improved and developed in order to provide people with more convenient, fast and diversified services. Digital images and their related processing technologies are an important part of information processing technology and are applied ever more widely in many fields. In some cases, high-resolution digital images are required; for example, medical images require the ability to display subtle lesi...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T3/40G06T5/50
CPCG06T3/4053G06T5/50G06T2207/20081G06T2207/20084G06T2207/20221Y02T10/40
Inventor 戴大伟刘达张彬徐嘉夏书银王国胤
Owner CHONGQING UNIV OF POSTS & TELECOMM