Infrared and visible light image fusion method and system based on generative adversarial network

An infrared and visible light image fusion technology, applied in the field of infrared and visible light image fusion. It addresses the problems that manually designed fusion rules are complex and often not applicable to infrared and visible light images, which limits the development of fusion methods. The method avoids manual and complex rule design, improves fusion performance, and achieves a good visual effect.

Pending Publication Date: 2020-11-27
SHANDONG NORMAL UNIV

AI Technical Summary

Problems solved by technology

These three constraints have become increasingly complex, especially the manual design of fusion rules, which strongly limits the development of fusion methods.
In addition, existing fusion methods usually select the same salient features from the source images, such as edges and lines, to merge into the fused image so that it contains more detailed information.
However, the above methods may not be suitable for fusing infrared and visible light images.

Examples


Embodiment 1

[0048] This embodiment provides an infrared and visible light image fusion method based on a generative adversarial network.

[0049] The infrared and visible light image fusion method based on a generative adversarial network includes:

[0050] S101: Acquire an infrared image and a visible light image to be fused;

[0051] S102: Simultaneously input the infrared image and the visible light image to be fused into the pre-trained generative adversarial network, and output the fused image. In one or more embodiments, the generative adversarial network includes an overall loss function; the overall loss function includes a content loss function, a detail loss function, a target edge enhancement loss function, and an adversarial loss function.
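As an illustration of how such an overall loss could be assembled for the generator, the following is a minimal PyTorch-style sketch. The weighting coefficients, the finite-difference detail term, the infrared-gradient edge term, and the non-saturating adversarial term are assumptions made for illustration; the patent text only names the four components.

```python
# Hypothetical sketch of the overall generator loss named in S102 (weights and term forms are assumed).
import torch
import torch.nn.functional as F

def gradients(img):
    # Finite-difference image gradients, used here for the detail and edge terms (assumption).
    dx = img[:, :, :, 1:] - img[:, :, :, :-1]
    dy = img[:, :, 1:, :] - img[:, :, :-1, :]
    return dx, dy

def overall_loss(fused, ir, vis, d_fake, w_content=1.0, w_detail=10.0, w_edge=5.0, w_adv=1.0):
    # Content loss: keep the fused image close to both source images at the pixel level.
    content = F.mse_loss(fused, ir) + F.mse_loss(fused, vis)

    # Detail loss: preserve the texture (gradients) of the visible light image.
    fdx, fdy = gradients(fused)
    vdx, vdy = gradients(vis)
    detail = F.l1_loss(fdx, vdx) + F.l1_loss(fdy, vdy)

    # Target edge enhancement loss: strengthen the edges of thermal targets from the infrared image.
    irdx, irdy = gradients(ir)
    edge = F.l1_loss(fdx, irdx) + F.l1_loss(fdy, irdy)

    # Adversarial loss: the generator tries to make the discriminator score the fused image as real.
    adv = F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))

    return w_content * content + w_detail * detail + w_edge * edge + w_adv * adv
```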

[0052] In one or more embodiments, the generative adversarial network includes a generator and a discriminator; the generator is built on a ResNet network, and the discriminator is built on a VGG-Net neural network.
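A minimal, hypothetical pairing along these lines is sketched below: a small ResNet-style generator with residual blocks that maps a concatenated infrared/visible pair to a fused image, and a VGG-style convolutional discriminator. The channel widths, depths, and layer choices are illustrative assumptions, not the concrete architecture of the patent.

```python
# Hypothetical ResNet-style generator and VGG-style discriminator (dimensions are illustrative).
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch),
        )

    def forward(self, x):
        return torch.relu(x + self.body(x))  # identity skip connection, as in ResNet

class Generator(nn.Module):
    def __init__(self, in_ch=2, base=64, n_blocks=4):
        super().__init__()
        self.head = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU(inplace=True))
        self.blocks = nn.Sequential(*[ResidualBlock(base) for _ in range(n_blocks)])
        self.tail = nn.Sequential(nn.Conv2d(base, 1, 3, padding=1), nn.Tanh())

    def forward(self, ir, vis):
        x = torch.cat([ir, vis], dim=1)  # concatenate the two source images channel-wise
        return self.tail(self.blocks(self.head(x)))

class Discriminator(nn.Module):
    """VGG-style stack of 3x3 convolutions with stride-2 downsampling."""
    def __init__(self, in_ch=1, base=64):
        super().__init__()
        layers, ch = [], in_ch
        for out_ch in (base, base * 2, base * 4, base * 8):
            layers += [nn.Conv2d(ch, out_ch, 3, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True)]
            ch = out_ch
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Conv2d(ch, 1, 3, padding=1)  # patch-level real/fake logits

    def forward(self, x):
        return self.classifier(self.features(x))
```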

[0053] Further, the content ...

Embodiment 2

[0143] This embodiment provides an infrared and visible light image fusion system based on a generative adversarial network. The infrared and visible light image fusion system based on a generative adversarial network includes:

[0144] An acquisition module configured to: acquire an infrared image and a visible light image to be fused;

[0145] A fusion module, which is configured to: simultaneously input the infrared image and the visible light image to be fused into a pre-trained generative adversarial network and output the fused image. In one or more embodiments, the generative adversarial network includes a total loss function; the total loss function includes a content loss function, a detail loss function, a target edge enhancement loss function and an adversarial loss function.
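One way to read this module structure is the following minimal sketch, which wraps the hypothetical Generator from the Embodiment 1 sketch; the class names, file loading, and grayscale assumption are all illustrative rather than mandated by the patent.

```python
# Hypothetical composition of the acquisition and fusion modules (names and I/O are illustrative).
import torch
from PIL import Image
from torchvision.transforms.functional import to_tensor

class AcquisitionModule:
    """Step S101: load the infrared and visible light images to be fused (assumed grayscale)."""
    def acquire(self, ir_path, vis_path):
        ir = to_tensor(Image.open(ir_path).convert("L")).unsqueeze(0)   # 1 x 1 x H x W
        vis = to_tensor(Image.open(vis_path).convert("L")).unsqueeze(0)
        return ir, vis

class FusionModule:
    """Step S102: feed both images into the pre-trained generator and return the fused image."""
    def __init__(self, generator):
        self.generator = generator.eval()

    @torch.no_grad()
    def fuse(self, ir, vis):
        return self.generator(ir, vis)

# Usage (Generator is the hypothetical network sketched in Embodiment 1; paths are placeholders):
# ir, vis = AcquisitionModule().acquire("ir.png", "vis.png")
# fused = FusionModule(Generator()).fuse(ir, vis)
```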

[0146] It should be noted here that the acquisition module and the fusion module above correspond to steps S101 to S102 in the first embodiment, and the examples and a...

Embodiment 3

[0147] This embodiment also provides an electronic device, including one or more processors, one or more memories, and one or more computer programs; the processor is connected to the memory, the one or more computer programs are stored in the memory, and when the electronic device is running, the processor executes the one or more computer programs stored in the memory, so that the electronic device performs the method described in Embodiment 1 above.

Embodiment 4

This embodiment also provides a computer-readable storage medium for storing computer instructions; when the computer instructions are executed by a processor, the method described in Embodiment 1 is completed.



Abstract

The invention discloses an infrared and visible light image fusion method and system based on a generative adversarial network. The method includes: acquiring an infrared image and a visible light image to be fused; simultaneously inputting the infrared image and the visible light image to be fused into a pre-trained generative adversarial network, and outputting a fused image. In one or more embodiments, the generative adversarial network includes a total loss function, wherein the total loss function comprises a content loss function, a detail loss function, a target edge enhancement loss function and an adversarial loss function.

Description

Technical Field

[0001] The present application relates to the technical field of image processing, in particular to an infrared and visible light image fusion method and system based on a generative adversarial network.

Background Technique

[0002] The statements in this section merely provide background information related to the present application and do not necessarily constitute prior art.

[0003] Infrared images are captured by infrared sensors to record the thermal radiation emitted by different targets, and they are widely used in target detection and surface parameter inversion. Infrared images are little affected by illumination changes and camouflage, and they are easy to capture both day and night. However, infrared images often lack texture and mainly reflect the heat emitted by objects. In contrast, visible light images are captured to record the reflected features of different objects, and they contain discriminative feature information. Visible images ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50; G06T5/00
CPC: G06T5/003; G06T5/50; G06T2207/10048; G06T2207/20081; G06T2207/20084; G06T2207/20192; G06T2207/20221
Inventor: 隋晓丹, 王亚茹, 冯飞燕, 王雪梅, 许源, 丁维康, 赵艳娜
Owner: SHANDONG NORMAL UNIV