
Infrared light and visible light image fusion method based on residual dense network and gradient loss

An image fusion technique based on a residual dense network, applied in the field of image processing, which addresses problems such as reduced training speed, low-quality fusion results, and blurred edges.

Pending Publication Date: 2021-11-02
DALIAN UNIV

AI Technical Summary

Problems solved by technology

These methods can extract feature information accurately, but they require complex matrix operations, and the problem of edge blur remains. In recent research, deep-learning-based methods such as DeepFuse, SESF-Fuse and AttentionFGAN improve on the shortcomings of traditional methods, but they also have limitations. First, deep-learning networks usually extract feature maps directly from the previous convolutional layer, ignoring global information, which yields low-quality fusion results. Second, some methods do not use an end-to-end model but an encoder-decoder model with a separate fusion strategy; too simple a fusion mechanism can leave image edges unclear. Finally, the design of the loss function also affects training: if an inappropriate loss function is chosen, its value may converge more slowly and the training speed may decrease.


Examples


Embodiment 1

[0047] As shown in Figure 1, this embodiment provides an infrared and visible light image fusion method based on a residual dense network and gradient loss; the specific steps are as follows:

[0048] Step 1: Put the infrared light image and visible light image into the pre-trained VGG-16 network for parameter extraction, then standardize and normalize the extracted feature-map values to obtain weight blocks;

[0049] Specifically, when a source image is input to the network, it is also passed to a dedicated extractor that extracts feature maps and computes feature values, generating the parameters needed during training. Each source image yields 5 feature maps, taken before the 5 max-pooling layers of the VGG-16 network. After extraction, the information contained in each feature map is computed; their gradients are used in the calculation so as to work well with the gradient loss in the ...
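The patent text does not give the exact formula for turning the 5 per-source VGG-16 feature maps into weight blocks. As a minimal sketch, assume each source's weight is the softmax-normalized mean gradient magnitude of its feature maps; the helper names `gradient_energy` and `weight_blocks` are hypothetical, not from the patent:

```python
import numpy as np

def gradient_energy(feature_map):
    """Mean gradient magnitude of one 2-D feature map (finite differences)."""
    gy, gx = np.gradient(feature_map.astype(np.float64))
    return float(np.mean(np.sqrt(gx ** 2 + gy ** 2)))

def weight_blocks(ir_features, vis_features):
    """Convert per-source feature-map gradient energies into two weights.

    ir_features / vis_features: lists of 2-D arrays, one per VGG-16 stage
    (here, the 5 maps taken before each max-pooling layer).
    Returns (w_ir, w_vis), softmax-normalized so they sum to 1.
    """
    e_ir = float(np.mean([gradient_energy(f) for f in ir_features]))
    e_vis = float(np.mean([gradient_energy(f) for f in vis_features]))
    scores = np.array([e_ir, e_vis])
    exps = np.exp(scores - scores.max())   # stable softmax
    w = exps / exps.sum()
    return float(w[0]), float(w[1])
```

With this convention, a source whose feature maps carry more gradient (edge) information receives the larger weight, which is consistent with the stated goal of preserving infrared edge information.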



Abstract

The invention discloses an infrared light and visible light image fusion method based on a residual dense network and gradient loss. The method comprises the steps of: putting an infrared light image and a visible light image into a pre-trained VGG-16 network for parameter extraction, then standardizing and normalizing the extracted feature-map values to obtain weight blocks; putting the infrared light image and the visible light image into a network model for end-to-end unsupervised learning to generate a fused-image model; and training the fusion model, during which a loss function is used for iterative back-propagation to update the learned content, the loss function optimizing the gradient based on the weight blocks. The method can fuse images from existing public infrared and visible light datasets; the fused result effectively extracts the edge information of the infrared image and the texture details of the visible image and combines them into a fused image that better conforms to the visual perception of the human eye.
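The abstract describes a gradient-based loss driven by the weight blocks but gives no formula. A minimal sketch, under the assumption that the loss penalizes the squared distance between the fused image's gradients and a weighted mix of the source gradients (the function names and the exact form are hypothetical):

```python
import numpy as np

def image_gradients(img):
    """Forward-difference x/y gradients of a 2-D image, zero at the border."""
    img = img.astype(np.float64)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    return gx, gy

def weighted_gradient_loss(fused, ir, vis, w_ir, w_vis):
    """Mean squared error between the fused image's gradients and the
    weight-block-weighted combination of the source-image gradients."""
    fx, fy = image_gradients(fused)
    ix, iy = image_gradients(ir)
    vx, vy = image_gradients(vis)
    tx = w_ir * ix + w_vis * vx   # target x-gradient
    ty = w_ir * iy + w_vis * vy   # target y-gradient
    return float(np.mean((fx - tx) ** 2 + (fy - ty) ** 2))
```

Because differentiation is linear, this loss is exactly zero when the fused image is the same weighted combination of the sources, and it grows as the fused result drops edge content from either source; in the actual method the loss would be applied to network outputs and back-propagated.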

Description

technical field

[0001] The invention relates to the technical field of image processing, in particular to an infrared and visible light image fusion method based on a residual dense network and gradient loss.

Background technique

[0002] Image fusion plays a pivotal role in computer vision: it extracts key information from different input images, synthesizes it, and outputs a fused image with better visual effect. Infrared and visible light image fusion is an important branch of image fusion. An infrared sensor can monitor objects around the clock, generating an infrared image from the thermal radiation emitted by an object, and it captures object contours well in various environments. However, infrared images have poor texture detail and low resolution. Visible light images have rich texture information and high spatial resolution, which can reflect the real environment of the detected object and adapt to the human visu...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06K9/46; G06N3/04; G06N3/08
CPC: G06N3/084; G06N3/047; G06N3/045; G06F18/25
Inventors: 周士华, 李嘉伟, 王宾
Owner: DALIAN UNIV