Image fusion method based on multi-scale dictionary learning

A multi-scale dictionary learning and image fusion technology, applied in the field of image fusion, which solves the problem that existing single-scale dictionary learning cannot analyze the data at multiple scales.

Publication status: Inactive; publication date: 2013-01-02
NORTHWESTERN POLYTECHNICAL UNIV


Problems solved by technology

[0006] As mentioned above, the current image fusion methods based on dictionary learning all perform dictionary learning at a single scale, and therefore cannot analyze the data at multiple scales.



Examples

[0063] Example 1. Fusion example of noise-free images

[0064] In this example, an infrared-visible image pair and a remote sensing image pair are fused separately, and no standard fusion result is available. The implementation steps of the multi-scale dictionary learning and fusion process in this example are as follows:

[0065] Multi-scale dictionary learning proceeds as follows:

[0066] (1) Decompose each training image with a 3-level db4 wavelet transform; the training data are the source images to be fused themselves;

[0067] (2) Initialize all sub-dictionaries D_b ∈ R^(64×256), b = 1, 2, ..., 10 (a 3-level decomposition yields 3×3+1 = 10 subbands);

[0068] (3) For every subband, extract blocks with an 8×8 sliding window of step size 1, in order from upper left to lower right, then straighten each block into a column vector and arrange the vectors in turn to form a matrix, so that each subband is arranged into one matrix;

[0069] (4) Use the K-SVD algorithm to learn a sub-dictionary D_b for each matrix; the K-SVD algorithm allows the error ε_t... (A code sketch of steps (1)-(4) is given below.)
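The following Python sketch illustrates steps (1)-(4). It is a minimal illustration under stated assumptions: pywt and scikit-learn are available, and scikit-learn's MiniBatchDictionaryLearning is used as a stand-in for the K-SVD algorithm named above (K-SVD itself is not shipped with scikit-learn); all function names are illustrative and not part of the patent.

```python
# Sketch of the multi-scale dictionary-learning stage (steps 1-4 above).
import numpy as np
import pywt
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import MiniBatchDictionaryLearning

PATCH = (8, 8)      # 8x8 sliding window with step size 1 (step 3)
N_ATOMS = 256       # each sub-dictionary D_b is 64x256 (step 2)

def subbands_3level(image):
    """3-level db4 decomposition -> list of 10 subbands (1 approximation + 3x3 details)."""
    approx, *details = pywt.wavedec2(image.astype(float), 'db4', level=3)
    bands = [approx]
    for (cH, cV, cD) in details:
        bands.extend([cH, cV, cD])
    return bands                                            # 3*3 + 1 = 10 subbands

def learn_multiscale_dictionary(training_images):
    """Learn one 64x256 sub-dictionary per subband position b = 1..10."""
    # Collect the straightened 8x8 patches of subband b from every training image.
    per_band_patches = [[] for _ in range(10)]
    for img in training_images:
        for b, band in enumerate(subbands_3level(img)):
            patches = extract_patches_2d(band, PATCH)       # stride-1 blocks
            per_band_patches[b].append(patches.reshape(len(patches), -1))
    # Learn a sub-dictionary D_b for each stacked patch matrix
    # (MiniBatchDictionaryLearning here stands in for K-SVD).
    dictionaries = []
    for collected in per_band_patches:
        X = np.vstack(collected)                            # (n_patches, 64)
        learner = MiniBatchDictionaryLearning(n_components=N_ATOMS)
        learner.fit(X)
        dictionaries.append(learner.components_.T)          # D_b: 64 x 256
    return dictionaries
```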

[0086] Example 2. Fusion example of noisy images

[0087] In practical applications, the acquired source images often contain noise, so suppressing the influence of noise on the fusion result during fusion is an important consideration for current fusion methods. In the past, most image fusion methods carried out denoising and fusion as separate steps, but in recent years image fusion methods based on sparse representation have shown excellent performance in denoising while fusing. This example therefore further describes the noise-suppression ability of the fusion method of the present invention.

[0088] The images in this example are multi-focus images with a standard fusion result. Figures 4(a) and (b) are the noise-free near-focus and far-focus images, and (c) is the standard fusion result; all are 256×256 in size. Gaussian white noise with mean value 0 and variance σ (σ = 5, 10, 15, 20, 25) is added to Figur...
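A minimal sketch of this test setup, under stated assumptions: Gaussian white noise of mean 0 and level σ is added to the two source images, and the fused result can then be scored against the standard fusion result, for example by PSNR (an assumed metric; this excerpt does not state which objective measures are used). The fuse() call is a hypothetical placeholder for the fusion method of the invention.

```python
# Sketch of the noisy-image experiment: add noise at several levels and
# compare the fused result with the standard (noise-free) fusion result.
import numpy as np

def add_gaussian_noise(image, sigma, seed=0):
    """Add zero-mean Gaussian white noise of level sigma to an 8-bit image."""
    rng = np.random.default_rng(seed)
    noisy = image.astype(float) + rng.normal(0.0, sigma, image.shape)
    return np.clip(noisy, 0, 255)

def psnr(reference, result):
    """Peak signal-to-noise ratio against the standard fusion result."""
    mse = np.mean((reference.astype(float) - result.astype(float)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)

# Hypothetical usage, with near/far the images of Figure 4(a)/(b) and
# standard the result of Figure 4(c):
# for sigma in (5, 10, 15, 20, 25):
#     fused = fuse(add_gaussian_noise(near, sigma), add_gaussian_noise(far, sigma))
#     print(sigma, psnr(standard, fused))
```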

[0095] Example 3. Example of dictionary generalization

[0096] The previous two examples discussed the fusion results of several methods on noise-free and noisy images. It is worth noting that the training data of the dictionary came from the source images themselves. In fact, since each atom of the dictionary represents a structural prototype of the source image, a dictionary trained on one type of image can be applied directly to another set of images of the same type, so that the dictionary does not need to be retrained during fusion. On this basis, this example discusses the generalization ability of dictionaries in image fusion.

[0097] The multi-scale dictionary learning part of this example does not re-learn a multi-scale dictionary, but directly uses the multi-scale dictionary of remote sensing images trained in Example 1 for the fusion of the other two sets of remote sensing images; the steps and parameter settings of the fusion process...
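A small sketch of this reuse idea, assuming the ten 64×256 sub-dictionaries from Example 1 are simply stored with NumPy and loaded again for the new image pairs; the file name and the fuse_with_dictionaries() call are hypothetical.

```python
# Persist the multi-scale dictionary once, then reuse it without re-training.
import numpy as np

def save_dictionaries(path, dictionaries):
    """Store the ten 64x256 sub-dictionaries learned in Example 1."""
    np.savez(path, *dictionaries)

def load_dictionaries(path):
    """Reload the sub-dictionaries for fusing new images of the same type."""
    data = np.load(path)
    return [data[name] for name in data.files]

# Hypothetical reuse for a new remote-sensing pair (no re-training needed):
# dictionaries = load_dictionaries('remote_sensing_dictionaries.npz')
# fused = fuse_with_dictionaries(source_a, source_b, dictionaries)
```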



Abstract

The invention provides an image fusion method based on multi-scale dictionary learning. The image fusion method comprises the steps of: carrying out multi-scale learning, in which each training image is decomposed into S subbands and a sub-dictionary is learned for each subband; then carrying out wavelet transformation of the source images to obtain the subbands of all source images, solving the sparse representation coefficients of the subbands with the SOMP (Simultaneous Orthogonal Matching Pursuit) algorithm and carrying out fusion; and finally carrying out inverse wavelet transformation to obtain the fused image. With the method, the sparseness and the fitting degree of the image representation coefficients are improved, the detail information of the fused image is enhanced, the fused image has an excellent fusion effect and better noise suppression ability, and the generalization ability of the dictionaries is enhanced.
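As an illustration of the fusion stage summarised above, the sketch below sparse-codes corresponding 8×8 patches of one subband of the two source images over the learned sub-dictionary D_b and fuses their coefficient vectors. Assumptions not stated in this excerpt: scikit-learn's OrthogonalMatchingPursuit is used per patch as a stand-in for the SOMP solver, a max-ℓ1 fusion rule is used (a common choice in sparse-representation fusion), and overlapping patches are averaged back into the subband. The full method would repeat this over all subbands and finish with the inverse wavelet transform.

```python
# Sketch of per-subband sparse-coding fusion over a learned sub-dictionary D_b.
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d
from sklearn.linear_model import OrthogonalMatchingPursuit

def fuse_subband(band_a, band_b, D, patch=(8, 8), n_nonzero=8):
    """Fuse one wavelet subband of two source images using dictionary D (64x256)."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False)
    pa = extract_patches_2d(band_a, patch).reshape(-1, D.shape[0])
    pb = extract_patches_2d(band_b, patch).reshape(-1, D.shape[0])
    fused_patches = np.empty_like(pa)
    for i, (xa, xb) in enumerate(zip(pa, pb)):
        ca = omp.fit(D, xa).coef_                    # sparse coefficients, source A
        cb = omp.fit(D, xb).coef_                    # sparse coefficients, source B
        c = ca if np.abs(ca).sum() >= np.abs(cb).sum() else cb   # assumed max-l1 rule
        fused_patches[i] = D @ c                     # reconstruct the fused patch
    # Average overlapping patches back into a subband of the original size.
    return reconstruct_from_patches_2d(fused_patches.reshape(-1, *patch), band_a.shape)
```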

Description

Technical field

[0001] The invention relates to an image fusion method.

Background technique

[0002] Image fusion refers to the technology of combining, by a specific method, different images of the same object acquired by multiple sensors, or multiple images acquired by a single sensor, so as to obtain a more comprehensive and accurate description; it has broad application prospects in security monitoring and other fields.

[0003] Since image fusion methods based on sparse representation show a better fusion effect than the classic fusion methods based on wavelet, curvelet, non-subsampled contourlet and similar transforms, they have become a very active research direction in the field of image fusion.

[0004] Document 1 "Yang Guang, Xu Xing-zhong, and Man Hong, Optimum image fusion via sparse representation[C], WOCC 2011 - 20th Annual Wireless and Optical Communications Conference, 2011." and Document 2 "Yu Nan-nan, Qiu Tian-shuang, and Bi Feng, Image features extraction and fusion based on joint sparse ...


Application Information

IPC(8): G06T5/50; G06K9/66
Inventor 彭进业王珺何贵青阎昆夏召强冯晓毅蒋晓悦吴俊李会方谢红梅杨雨奇
Owner NORTHWESTERN POLYTECHNICAL UNIV