
Wavelet transform and joint sparse representation-based infrared and visible light image fusion method

A technology combining sparse representation and the wavelet transform, applied in the field of infrared and visible light image fusion, which addresses problems such as the inability to analyze data at multiple scales and in multiple directions, inaccurate representation of high-frequency information, data loss, etc.

Active Publication Date: 2017-11-10
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

However, the fusion method based on joint sparse representation relies on dictionary learning and sparse representation under a certain threshold, so it cannot accurately describe detailed information such as texture and edges; some high-frequency information is lost, which reduces the clarity of the fusion result.
[0005] As mentioned above, the fusion method based on the wavelet transform can extract source image information at multiple scales and in multiple directions, but the sparsity of the low-frequency sub-bands is poor, and fusing them directly is not conducive to the extraction and preservation of features. The fusion method based on joint sparse representation, on the other hand, can finely fit the data by learning a dictionary, but it cannot accurately represent some high-frequency information, that is, it cannot analyze the data at multiple scales and in multiple directions, so a certain amount of data is lost.

Method used




Embodiment Construction

[0048] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0049] Step 1: Perform wavelet transform on the source image

[0050] First read in the registered infrared source image I_1 and the visible light source image I_2, then choose a wavelet basis function and perform an s-level DWT on I_1 and I_2 respectively, decomposing the infrared source image I_1 into 1 low-frequency sub-band and 3*s high-frequency sub-bands, and the visible light source image I_2 into 1 low-frequency sub-band and 3*s high-frequency sub-bands;
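
A minimal sketch of this step follows, assuming the source images are already registered and loaded as 2-D float arrays; the wavelet basis ("db4") and decomposition level s=3 are illustrative choices rather than values taken from the patent, and PyWavelets stands in for the DWT implementation.

```python
import numpy as np
import pywt  # PyWavelets, assumed here as the DWT implementation

def dwt_decompose(image, wavelet="db4", s=3):
    """s-level 2-D DWT: returns 1 low-frequency sub-band and 3*s high-frequency sub-bands."""
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=s)
    low = coeffs[0]        # approximation (low-frequency) sub-band
    highs = coeffs[1:]     # s tuples of (horizontal, vertical, diagonal) detail sub-bands
    return low, highs

# Placeholders for the registered infrared image I_1 and visible light image I_2.
I1 = np.random.rand(256, 256)
I2 = np.random.rand(256, 256)
C1_low, C1_high = dwt_decompose(I1)
C2_low, C2_high = dwt_decompose(I2)
```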

[0051] Step 2: Fuse the low-frequency sub-band coefficients based on joint sparse representation

[0052] Step 2.1: Extract blocks with a sliding window

[0053] In order from the upper left to the lower right, slide a window of a given size over the low-frequency sub-band coefficients C_{1,l} and C_{2,l} with a given step size; the sliding-window blocks obtained are...
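
A minimal sketch of the block extraction and the low-frequency fusion idea follows. The block size (8x8), step size (4), dictionary size, and sparsity level are illustrative assumptions, since the actual values and formulas are elided in the text above; scikit-learn's dictionary learning and OMP coding stand in for the patent's joint sparse representation model, which is only summarized here.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode

def extract_blocks(subband, block=8, step=4):
    """Slide a block x block window over the sub-band from the upper left to the
    lower right and return each block, vectorized, as one column of a matrix."""
    h, w = subband.shape
    cols = []
    for i in range(0, h - block + 1, step):
        for j in range(0, w - block + 1, step):
            cols.append(subband[i:i + block, j:j + block].reshape(-1))
    return np.stack(cols, axis=1)                  # shape: (block*block, n_blocks)

# Low-frequency sub-bands C_{1,l} and C_{2,l}; placeholder arrays stand in for
# the Step 1 outputs (C1_low, C2_low) so the sketch runs on its own.
C1_low = np.random.rand(64, 64)
C2_low = np.random.rand(64, 64)
V1 = extract_blocks(C1_low)
V2 = extract_blocks(C2_low)

# Learn one dictionary over blocks from both sources, then sparse-code each source.
train = np.concatenate([V1, V2], axis=1).T         # samples as rows for scikit-learn
dico = DictionaryLearning(n_components=64, transform_algorithm="omp",
                          transform_n_nonzero_coefs=5, max_iter=10)
D = dico.fit(train).components_                    # dictionary atoms as rows
A1 = sparse_encode(V1.T, D, algorithm="omp", n_nonzero_coefs=5)
A2 = sparse_encode(V2.T, D, algorithm="omp", n_nonzero_coefs=5)

# Illustrative fusion rule: per block, keep the sparse code with the larger L1 norm.
keep_1 = np.abs(A1).sum(axis=1) >= np.abs(A2).sum(axis=1)
A_fused = np.where(keep_1[:, None], A1, A2)
V_fused = (A_fused @ D).T                          # fused low-frequency blocks as columns
```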


Abstract

The invention provides a wavelet transform and joint sparse representation-based infrared and visible light image fusion method, and relates to the field of image fusion. The method comprises the following steps: firstly, carrying out a DWT on each source image to decompose it into a low-frequency sub-band coefficient and high-frequency sub-band coefficients, decomposing the low-frequency sub-band coefficient into a matrix by using a sliding window strategy, and learning a dictionary for the decomposed low-frequency sub-band matrix; secondly, fusing the low-frequency sub-band coefficients and the high-frequency sub-band coefficients respectively; and finally reconstructing the fused image through the inverse DWT. The method can carry out effective sparse representation of the salient detail features of the source images and multi-scale fusion of the detail information of the images, so that the target information of the infrared image and the background information, such as details and contours, of the visible light image are well retained, the target identification ability is improved, and subsequent processing systems benefit when extracting and using the information. Compared with the traditional wavelet transform-based fusion method and the existing joint sparse representation-based fusion method, the method provided by the invention has advantages.
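
A compact end-to-end sketch of this pipeline follows, under stated assumptions: PyWavelets supplies the DWT and inverse DWT, the low-frequency fusion is shown as a simple averaging placeholder (the joint sparse representation-based step is sketched in the embodiment above), and the high-frequency rule shown is a common absolute-maximum choice rather than a rule quoted from the patent.

```python
import numpy as np
import pywt

def fuse_ir_visible(I1, I2, wavelet="db4", s=3):
    """Decompose both images, fuse the sub-bands, and reconstruct by inverse DWT."""
    c1 = pywt.wavedec2(I1, wavelet, level=s)
    c2 = pywt.wavedec2(I2, wavelet, level=s)
    # Low-frequency sub-band: averaging placeholder (the JSR-based fusion goes here).
    fused = [(c1[0] + c2[0]) / 2.0]
    # High-frequency sub-bands: per coefficient, keep the one with larger magnitude.
    for (h1, v1, d1), (h2, v2, d2) in zip(c1[1:], c2[1:]):
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in ((h1, h2), (v1, v2), (d1, d2))))
    return pywt.waverec2(fused, wavelet)

# Example with placeholder arrays standing in for registered source images.
F = fuse_ir_visible(np.random.rand(256, 256), np.random.rand(256, 256))
```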

Description

Technical field

[0001] The invention relates to the field of image fusion, in particular to a method for the fusion of infrared and visible light images.

Background technique

[0002] Image fusion is a technology that combines the imaging information obtained of the same scene by multiple sensors, or by multiple exposures of a single sensor, to obtain more comprehensive, accurate and reliable information. Infrared and visible light image fusion is an important and active research direction within image fusion: fusing infrared and visible light images makes full use of the complementary information of the two to obtain more comprehensive and accurate images, and it has been widely applied in military and civilian fields such as military reconnaissance and security monitoring.

[0003] In the field of infrared and visible light image fusion, the fusion method based on the wavelet transform is a mainstream method. Among these, the discrete wavelet transform (DWT) has the characteristics of ...

Claims


Application Information

IPC(8): G06T5/50
Inventors: 何贵青, 董丹丹, 夏召强, 冯晓毅, 李会方, 谢红梅, 吴俊, 蒋晓悦
Owner: NORTHWESTERN POLYTECHNICAL UNIV