
Image fusion method based on local contrast preprocessing

A technology combining local contrast and image fusion, applied in the field of image fusion to achieve the effect of avoiding spatial discontinuity

Pending Publication Date: 2021-11-02
UNIV OF SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0008] The purpose of the present invention is to overcome the preprocessing deficiencies of the existing standard sparse representation image fusion method and to propose an image fusion method based on local contrast preprocessing, in order to improve the detail-extraction ability of the sparse representation image fusion method while preserving spatial correlation and low-frequency information as much as possible without loss, thereby improving the fusion results in terms of both visual effect and objective evaluation indicators, and providing a new idea for preprocessing.



Examples


Embodiment Construction

[0038] In this embodiment, the sparse representation image fusion method based on local contrast preprocessing is applied to the case of two source images, as shown in Figure 2; when there are more than two source images, the method extends by analogy. Specifically, referring to Figure 1, the method includes the following steps:

[0039] Step 1, preprocessing stage:

[0040] Step 1.1. Obtain two registered source images of the same scene, I_A, I_B ∈ R^(M×N), each of size M×N, where R^(M×N) denotes the set of matrices with M rows and N columns. Use a sliding window to divide the two source images I_A and I_B into a series of √m×√m image blocks, thereby obtaining K segmented image blocks from each image, denoted as {P_A^i, i=1,…,K} and {P_B^i, i=1,…,K} respectively, where P_A^i represents the i-th image block segmented from source image I_A, P_B^i represents the i-th image block segmented from source image I_B, and m represents the size of a dictionary atom;
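As a rough illustration of this blocking step, the following Python sketch extracts overlapping √m×√m blocks with a stride-1 sliding window and stacks their vectorized forms as columns. The function name, the 8×8 patch size, and the placeholder images are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def extract_blocks(image, patch_size, stride=1):
    """Slide a patch_size x patch_size window over `image` and return each
    block flattened (length m = patch_size**2) as a column of the output.
    Names and sizes here are illustrative assumptions."""
    M, N = image.shape
    blocks = []
    for r in range(0, M - patch_size + 1, stride):
        for c in range(0, N - patch_size + 1, stride):
            block = image[r:r + patch_size, c:c + patch_size]
            blocks.append(block.reshape(-1))   # vectorize the block
    return np.stack(blocks, axis=1)            # shape (m, K)

# Usage with two registered source images of the same scene (placeholders)
I_A = np.random.rand(64, 64)
I_B = np.random.rand(64, 64)
P_A = extract_blocks(I_A, patch_size=8)        # m = 64 if the dictionary atom is 8x8
P_B = extract_blocks(I_B, patch_size=8)
```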

[0041] Step 1.2, respectively calcula...
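The text of Step 1.2 is truncated here, but the abstract states that each image block is converted into a local contrast in vector form. Below is a minimal sketch of one common local-contrast definition, namely the deviation of each pixel from the block mean normalized by that mean; this particular formula and the names P_A, C_A are assumptions, not confirmed by the visible text.

```python
import numpy as np

def local_contrast_vector(block_vec, eps=1e-8):
    """One common notion of local contrast for a vectorized block: the
    deviation of each pixel from the block mean, normalized by the mean
    intensity.  This exact definition is an assumption; the patent text
    defining Step 1.2 is truncated in the source."""
    mean = block_vec.mean()
    return (block_vec - mean) / (mean + eps)

# Applied column-wise to the block matrix from Step 1.1 (hypothetical names):
# C_A = np.apply_along_axis(local_contrast_vector, 0, P_A)
```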



Abstract

The invention discloses an image fusion method based on local contrast preprocessing. The method comprises the following steps: 1, a preprocessing stage: preprocessing the source images based on local contrast to obtain the corresponding image block groups and, for each image block, its local contrast in vector form; 2, a fusion stage: solving the sparse coefficients of the local contrasts with a matching pursuit algorithm, taking the l1 norm of the sparse coefficients as the information activity level of the corresponding image block, and fusing the source image blocks with a max-selection fusion rule to obtain the fused image block group; 3, a reconstruction stage: applying an inverse sliding window with average weighting of overlapping pixels to the fused image block group to obtain the fused image. The method improves the detail extraction capability of sparse-representation-based image fusion while preserving spatial correlation and low-frequency information as much as possible, so the fusion results improve in both visual effect and objective evaluation indices, and a new idea for preprocessing is provided.
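To make the fusion and reconstruction stages of the abstract concrete, here is a hedged Python sketch that sparse-codes the local-contrast vectors with scikit-learn's orthogonal matching pursuit (a stand-in for the matching pursuit algorithm named in the abstract), uses the l1 norm of the coefficients as the activity level, fuses blocks by max selection, and rebuilds the image by averaging overlapping pixels. The dictionary D, the block matrices, and all helper names are illustrative assumptions rather than the patent's actual implementation.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

def fuse_blocks(C_A, C_B, P_A, P_B, D, n_nonzero=8):
    """Fusion stage (sketch): sparse-code each local-contrast vector over the
    dictionary D (shape m x n_atoms), take the l1 norm of the coefficients as
    the activity level, and keep the source block with the larger activity."""
    alpha_A = orthogonal_mp(D, C_A, n_nonzero_coefs=n_nonzero)   # (n_atoms, K)
    alpha_B = orthogonal_mp(D, C_B, n_nonzero_coefs=n_nonzero)
    act_A = np.abs(alpha_A).sum(axis=0)                          # l1 activity per block
    act_B = np.abs(alpha_B).sum(axis=0)
    return np.where(act_A >= act_B, P_A, P_B)                    # fused blocks, shape (m, K)

def reconstruct(P_F, image_shape, patch_size, stride=1):
    """Reconstruction stage (sketch): place each fused block back at its
    sliding-window position and average the overlapping pixels."""
    M, N = image_shape
    acc = np.zeros((M, N))
    cnt = np.zeros((M, N))
    k = 0
    for r in range(0, M - patch_size + 1, stride):
        for c in range(0, N - patch_size + 1, stride):
            acc[r:r + patch_size, c:c + patch_size] += P_F[:, k].reshape(patch_size, patch_size)
            cnt[r:r + patch_size, c:c + patch_size] += 1
            k += 1
    return acc / cnt
```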

Description

technical field

[0001] The invention belongs to the technical field of image fusion, and in particular relates to a method that extracts the regional local contrast of image blocks as a sparse representation preprocessing step, mainly applied to image fusion algorithms based on sparse representation.

Background technique

[0002] Image fusion technology is an information fusion technology that takes image data of the same scene acquired by different sensors as the algorithm input, and uses specific algorithms to extract and organically combine the feature information and complementary information of these data to obtain a fused image. Thanks to a redundant and learnable image feature dictionary, sparse representation can approximate a signal more comprehensively and effectively than multi-scale transforms, so fusion methods based on sparse representation are an important class of image fusion technology.

[0003] The image fusion method based on sparse r...

Claims


Application Information

IPC(8): G06T5/50, G06T5/00
CPC: G06T5/50, G06T2207/10004, G06T5/70
Inventor: 陈勋, 陈宇航, 刘爱萍, 谢洪涛, 张勇东, 吴枫
Owner UNIV OF SCI & TECH OF CHINA