Multi-strategy image fusion method under compressed sensing framework

An image fusion technology under a compressed sensing framework, applied in the field of image processing. It addresses the problems of traditional fusion methods, namely that the fused image requires large storage space, the fusion process is slow, and the result is unfavourable for image compression and transmission. The method reduces the amount of data involved in fusion, improves the fusion effect, and shortens the fusion time.

Status: Inactive
Publication Date: 2011-06-15
Assignee: XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0005] (1) The amount of data in the fused image is large, which requires substantial storage space and is not conducive to image compression and transmission;
[0006] (2) The computational complexity of image fusion is high, so the fusion process takes a long time.




Detailed Description of Embodiments

[0037] Referring to Figure 1, the specific implementation process of the present invention is as follows:

[0038] Step 1: Input the original images A and B, and divide them into local images X1 and X2 of size C×C, where C×C is 8×8 or 16×16; this example uses 16×16.
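Step 1 amounts to tiling each input image into non-overlapping C×C blocks. A minimal Python sketch of this partitioning is given below; the function name and the assumption that the image dimensions are exact multiples of C are illustrative, not taken from the patent text.

```python
import numpy as np

def split_into_blocks(img, c=16):
    """Split a grayscale image into non-overlapping c x c local images.

    Illustrative sketch of Step 1; assumes img is a 2-D array whose
    height and width are multiples of c (c = 8 or 16 per the patent,
    16 in this embodiment).
    """
    h, w = img.shape
    return [img[i:i + c, j:j + c]
            for i in range(0, h, c)
            for j in range(0, w, c)]
```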

[0039] Step 2: Perform a Fourier transform on the local image X1 to obtain the Fourier coefficient matrix y1, and a Fourier transform on the local image X2 to obtain the Fourier coefficient matrix y2.
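Step 2 applies a 2-D discrete Fourier transform to each pair of corresponding blocks. A hedged sketch using NumPy's FFT is shown below; the patent does not specify the transform implementation, so the use of np.fft.fft2 is an assumption.

```python
import numpy as np

def block_fourier_pair(x1, x2):
    """Step 2 sketch: 2-D Fourier transforms of corresponding local
    images X1 and X2, returning the coefficient matrices y1 and y2."""
    y1 = np.fft.fft2(x1)
    y2 = np.fft.fft2(x2)
    return y1, y2
```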

[0040] Step 3: Using a variable-density observation model with full sampling of the low-frequency Fourier coefficients, observe the Fourier coefficient matrix y1 to obtain the observation vector f1.

[0041] (3a) Define the sampling model as a matrix whose entries take only the values 0 or 1, with the positions equal to 1 serving as sampling points, and construct the matrix B according to the size of the input image A: if the size of the input image A is m×m, then suppose the size of matri...
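The definition of the sampling matrix B is truncated in this excerpt, so the sketch below only illustrates the general idea named in Step 3: a 0/1 mask that fully samples a small low-frequency region and samples higher frequencies with a density that decays with frequency. The full-sampling radius, the decay exponent and the use of fftshift-centred coefficients are assumptions, not the patent's exact construction.

```python
import numpy as np

def variable_density_mask(m, full_radius=4, decay=2.0, seed=0):
    """Illustrative 0/1 sampling matrix for an m x m coefficient block:
    positions with value 1 are the sampling points, the low-frequency
    centre is fully sampled, and the sampling density falls off with
    frequency."""
    rng = np.random.default_rng(seed)
    c = (m - 1) / 2.0
    yy, xx = np.mgrid[0:m, 0:m]
    dist = np.sqrt((yy - c) ** 2 + (xx - c) ** 2)               # distance from the low-frequency centre
    prob = np.clip(1.0 - dist / (m / 2.0), 0.0, 1.0) ** decay   # density decays with frequency
    mask = (rng.random((m, m)) < prob).astype(np.uint8)
    mask[dist <= full_radius] = 1                               # full sampling of the low frequencies
    return mask

def observe(y, mask):
    """Apply the mask to a Fourier coefficient matrix (shifted so the
    low frequencies are central) and return the observation vector."""
    return np.fft.fftshift(y)[mask == 1]
```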


Abstract

The invention discloses a multi-strategy image fusion method under a compressed sensing framework, mainly solving the problems of traditional image fusion methods: a large amount of computation, high time complexity and large storage requirements. The method is implemented as follows: input the original images A and B and divide them into local images X1 and X2 of size C×C; perform a Fourier transform on X1 and X2 respectively to obtain the coefficient matrices y1 and y2; observe y1 and y2 with a variable-density observation model that fully samples the low-frequency Fourier coefficients, obtaining the observation vectors f1 and f2; compute the harmonic coefficients H1 and H2 and the spectrum matching degree S from f1 and f2; select a threshold T and compute the weighting coefficients; compare the weighting coefficients, the threshold and the spectrum matching degree to compute the fused observation vector f; and run twenty iterations of the Split Bregman reconstruction algorithm on the observation vector f to obtain the required fused image. Compared with traditional fusion methods, the proposed multi-strategy image fusion method has low computational complexity and a good fusion effect, and can be used for video tracking, target recognition and computer vision.
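The excerpt does not give the exact formulas for the harmonic coefficients or the spectrum matching degree, so the sketch below only illustrates the kind of multi-strategy rule described in the abstract: an energy-based matching degree S is compared against a threshold T, low-match blocks are fused by selecting the observation with the larger spectral energy, and high-match blocks are fused by a weighted average. All formulas and parameter values here are assumptions for illustration; the fused image would then be obtained by running the Split Bregman reconstruction (twenty iterations per the abstract) on the fused observation vector f, which is not shown.

```python
import numpy as np

def fuse_observations(f1, f2, T=0.7):
    """Hedged sketch of a multi-strategy fusion rule for two observation
    vectors f1 and f2 taken from corresponding blocks.

    The matching degree, weighting formula and threshold value are
    assumed, not the patent's exact definitions.
    """
    e1 = float(np.sum(np.abs(f1) ** 2))          # spectral energy of f1
    e2 = float(np.sum(np.abs(f2) ** 2))          # spectral energy of f2
    # assumed spectrum matching degree S in [0, 1]
    s = 2.0 * np.abs(np.vdot(f1, f2)) / (e1 + e2 + 1e-12)
    if s < T:
        # low similarity: select the observation with the larger spectral energy
        return f1 if e1 >= e2 else f2
    # high similarity: weighted average, larger weight to the stronger block
    w_max = 0.5 + 0.5 * (1.0 - s) / (1.0 - T)
    w_min = 1.0 - w_max
    return w_max * f1 + w_min * f2 if e1 >= e2 else w_min * f1 + w_max * f2
```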

Description

Technical field

[0001] The invention belongs to the technical field of image processing, in particular to image fusion, and can be used for video tracking, target recognition and computer vision.

Background technique

[0002] Image fusion is an information processing technology that processes multiple images to obtain an improved new image. Image fusion technology studies how to process and collaboratively utilize multiple images so that different image information complements each other, yielding a more objective and essential understanding of the same scene or target. Due to the limited focus range of visible-light imaging systems and differences between sensors, in the same scene a well-focused object appears sharp while objects some distance in front of or behind it appear blurred to varying degrees; fusing images focused at different depths yields an image in which all targets are clear. Due to differences in image r...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T5/50
Inventors: 刘芳, 焦李成, 王爽, 刘子僖, 戚玉涛, 侯彪, 马文萍, 尚荣华, 郝红侠, 朱亚萍 (Liu Fang, Jiao Licheng, Wang Shuang, Liu Zixi, Qi Yutao, Hou Biao, Ma Wenping, Shang Ronghua, Hao Hongxia, Zhu Yaping)
Owner: XIDIAN UNIV