Distribution transform-based multi-sensor image fusion method

A multi-sensor image fusion technology, applied in image enhancement, image data processing, instruments, etc. It addresses problems such as the difficulty of establishing a unified data distribution model for multi-source data and the lack of a suitable method for analyzing the joint characteristics of different sources, with the effects of avoiding overfitting, reducing instability, and improving classification accuracy.

Status: Inactive; Publication Date: 2011-04-13
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0007] In order to solve the problems that it is difficult to establish a unified data distribution model in the existing multi-source data fusion process and that a suitable method capable of effectively analyzing the joint characteristics of different sources is lacking, the present invention proposes a multi-sensor image fusion method based on distribution transformation.



Examples


Specific Implementation Mode 1

[0031] Specific Implementation Mode 1: This implementation mode is realized by the following steps:

[0032] Step 1: Convert the data format of the SAR image and multispectral image files to be fused, so that the gray values of each image to be fused are represented in corresponding vector form;
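
A minimal sketch of Step 1 in Python, assuming the co-registered SAR and multispectral images have already been loaded as NumPy arrays; the function name and array shapes are illustrative, not taken from the patent:

```python
import numpy as np

def to_feature_vectors(sar_img, ms_img):
    """Flatten co-registered images so each pixel becomes one observation vector.

    sar_img : (H, W) SAR gray-value image
    ms_img  : (H, W, B) multispectral image with B bands
    Returns an (H*W, 1 + B) array with one row (vector) per pixel.
    """
    h, w = sar_img.shape
    sar_vec = sar_img.reshape(h * w, 1).astype(np.float64)
    ms_vec = ms_img.reshape(h * w, -1).astype(np.float64)
    return np.hstack([sar_vec, ms_vec])
```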

[0033] Step 2: Analyze the characteristics of each converted image to be fused, and establish a probability density function (PDF) model of the synthetic aperture radar (SAR) image and a PDF model of the multispectral image, respectively;
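
The text shown here does not name the marginal distributions; a Gamma model for SAR gray values and a Gaussian model for a multispectral band are common choices and are used below only as assumptions:

```python
import numpy as np
from scipy import stats

def fit_sar_pdf(sar_values):
    # Gamma model with location fixed at 0, since SAR gray values are non-negative.
    shape, loc, scale = stats.gamma.fit(sar_values, floc=0.0)
    return stats.gamma(shape, loc=loc, scale=scale)

def fit_ms_pdf(band_values):
    # Gaussian model for one multispectral band.
    return stats.norm(loc=band_values.mean(), scale=band_values.std())
```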

[0034] Step 3: Apply distribution transformation theory to establish a joint probability density function model of the multi-source data according to the correlation among the sources;
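
One standard way to couple arbitrary marginal PDFs through a correlation parameter is the probability-integral (distribution) transform followed by a Gaussian copula; whether this matches the patent's joint model exactly is an assumption:

```python
import numpy as np
from scipy import stats

def joint_pdf(x_sar, x_ms, sar_dist, ms_dist, rho):
    """Joint density of (SAR, multispectral) values with correlation parameter rho."""
    # Distribution transform: map each source to a uniform, then to a standard normal.
    z1 = stats.norm.ppf(sar_dist.cdf(x_sar))
    z2 = stats.norm.ppf(ms_dist.cdf(x_ms))
    det = 1.0 - rho ** 2
    # Bivariate Gaussian copula density times the two marginal densities.
    copula = np.exp(-(rho ** 2 * (z1 ** 2 + z2 ** 2) - 2.0 * rho * z1 * z2)
                    / (2.0 * det)) / np.sqrt(det)
    return copula * sar_dist.pdf(x_sar) * ms_dist.pdf(x_ms)
```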

[0035] Step 4: Estimate the scale parameters and shape parameters of the joint probability density model obtained in Step 3;
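
A sketch of Step 4 under the same assumptions, using method-of-moments estimates for the Gamma shape and scale parameters and the sample correlation for the coupling parameter; the patent does not fix the estimator:

```python
import numpy as np

def gamma_moment_estimates(x):
    """Method-of-moments shape and scale for a Gamma model."""
    m, v = x.mean(), x.var()
    return m ** 2 / v, v / m          # shape, scale

def correlation_estimate(x_sar, x_ms_band):
    """Sample (Pearson) correlation between the two sources."""
    return np.corrcoef(x_sar, x_ms_band)[0, 1]
```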

[0036] Step 5: Substitute the parameters obtained in Step 4 into the joint probability density model of Step...
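
The paragraph is truncated here; according to the abstract, classification is then performed on the fused result through Bayesian criteria. A minimal maximum-a-posteriori sketch, with per-class joint-density callables and priors assumed for illustration:

```python
import numpy as np

def bayes_classify(x_sar, x_ms, class_densities, priors):
    """class_densities: list of callables f_c(x_sar, x_ms) giving the joint density
    under class c; priors: list of prior probabilities P(c).
    Returns the index of the most probable class for each pixel."""
    scores = np.stack([p * f(x_sar, x_ms) for f, p in zip(class_densities, priors)])
    return np.argmax(scores, axis=0)
```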



Abstract

The invention discloses a distribution transform-based multi-sensor image fusion method, and relates to multi-sensor image fusion. The method solves the problems that it is difficult to establish a unified data distribution model in the conventional multi-source data fusion process and that a proper method capable of effectively analyzing the joint properties of different sources is lacking. The method comprises the following steps: 1, converting the data format so that the gray values of the images to be fused are converted into vector form; 2, analyzing the image data properties and establishing probability density function (PDF) models; 3, establishing a joint probability density function model of the multi-source data; 4, estimating the model parameters; 5, performing classification, through Bayesian criteria, based on the distribution-transform data fusion result; and 6, performing compensating calculation, by a compensating algorithm, on individual results with low classification precision. The method establishes a unified data distribution model in the multi-source data fusion process and effectively analyzes and fuses the joint properties of different sources.
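
The compensating algorithm of step 6 is not detailed in the text shown here; as a purely hypothetical stand-in (not the patent's method), low-confidence pixels could be reassigned by a neighborhood majority vote:

```python
import numpy as np

def compensate(labels, confidence, threshold=0.6, window=3):
    """Reassign pixels whose posterior confidence is below `threshold`
    to the majority label of their (window x window) neighborhood."""
    pad = window // 2
    padded = np.pad(labels, pad, mode="edge")
    out = labels.copy()
    for i, j in zip(*np.where(confidence < threshold)):
        patch = padded[i:i + window, j:j + window]
        out[i, j] = np.bincount(patch.ravel()).argmax()
    return out
```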

Description

Technical Field

[0001] The invention relates to a multi-sensor image fusion method.

Background Technique

[0002] With the development of remote sensing technology, remote sensing image data of various kinds (multi-temporal, multi-spectral, multi-sensor, multi-platform and multi-resolution) are increasing, and the remote sensing image data provided by different sensors have their own characteristics. At present, the development of SAR has attracted wide attention. SAR can penetrate clouds and rain, works day and night and in all weather conditions, can penetrate deep into vegetation, and by changing its wavelength can obtain information on the upper layer of vegetation and even underground. The image information obtained by SAR depends on the dielectric and geometric properties of the object, and offers higher resolution and more texture information, but its description of the structural edges of objects is not clear, and there is often more clutter...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T5/50
Inventor 张钧萍孙毓赵宏磊张晔
Owner HARBIN INST OF TECH