Multi-scale remote sensing image fusion method based on convolution neural network

A convolutional-neural-network-based remote sensing image fusion technology, applied in the field of remote sensing image fusion, which addresses problems such as spectral distortion of the fused image, failure to account for local differences between the Pan image and the MS image, and weak universality, and which achieves a good fusion effect with a short computation time.

Active Publication Date: 2019-01-25
JILIN UNIV


Problems solved by technology

[0004] Component-substitution methods such as IHS, PCA, and Brovey can usually preserve the spatial information of the Pan image well in the fused image and are simple to implement, but they do not take the local differences between the Pan image and the MS image into account, so the final fused image shows significant spectral distortion.
In multi-scale analysis, the chosen image decomposition and filters have a great impact on the fusion result; typical multi-scale analysis methods such as the wavelet transform produce obvious spatial-information distortion in their fusion results.
Although hybrid methods combine CS and MRA, the final fused image still suffers from varying degrees of spectral and spatial-structure distortion, and the fusion quality depends closely on the specific fusion method selected.
[0005] The core algorithms of the above fusion methods are essentially based on hand-crafted fusion rules, so the final fusion result changes with the rules, and these rules depend strongly on the images being fused: the same method applied to different remote sensing images yields different fusion results, so its universality is weak.
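As a concrete illustration of such a hand-crafted rule (not part of the claimed invention), the sketch below implements the Brovey transform, a typical component-substitution fusion: Pan detail is injected through a fixed band-ratio rule, which is exactly the kind of rule that ignores local Pan/MS differences and leads to spectral distortion.

```python
# Illustrative only: Brovey-transform fusion, a classic hand-crafted
# component-substitution rule (not the method of this patent).
import numpy as np

def brovey_fusion(ms, pan, eps=1e-6):
    """ms: (H, W, B) multispectral image upsampled to the Pan resolution;
    pan: (H, W) panchromatic image. Returns the fused (H, W, B) image."""
    intensity = ms.sum(axis=2) + eps      # crude intensity component from the MS bands
    gain = pan / intensity                # one fixed injection rule at every pixel
    return ms * gain[..., None]           # rescale each band by the Pan/intensity ratio
```

Because the same fixed ratio rule is applied everywhere, any mismatch between the Pan intensity and the MS band composition shows up directly as spectral distortion in the fused image.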




Embodiment Construction

[0029] The present invention will be described in detail below in conjunction with the accompanying drawings, so that those skilled in the art can better understand it. It should be pointed out that those skilled in the art may make certain improvements to the present invention without departing from its core idea, and such improvements all fall within the protection scope of the present invention.

[0030] As shown in Figure 1, the present invention provides a multi-scale remote sensing image fusion method based on a convolutional neural network, comprising the following steps:

[0031] Step 1: using the properties of the convolutional neural network, build a multi-scale convolutional neural network fusion model suited to the characteristics of remote sensing image fusion, whose input is the images to be fused and whose output is the fused image. The model is constructed as follows:

[0032] First, the Pan and MS images are convolved ...
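A minimal sketch of such a multi-scale convolutional fusion model is given below in PyTorch. The branch layout, kernel sizes, and channel counts are illustrative assumptions made for the example (the construction details are truncated in this excerpt); the only properties taken from the text are that the inputs are the images to be fused and the output is the fused image.

```python
# Hedged sketch of a multi-scale convolutional fusion model (PyTorch).
# Architecture details are assumptions, not the patented network.
import torch
import torch.nn as nn

class MultiScaleFusionNet(nn.Module):
    def __init__(self, ms_bands=4):
        super().__init__()
        in_ch = ms_bands + 1                          # Pan (1 band) + MS bands, stacked
        # parallel branches extract features at several receptive-field scales
        self.scale3 = nn.Conv2d(in_ch, 32, kernel_size=3, padding=1)
        self.scale5 = nn.Conv2d(in_ch, 32, kernel_size=5, padding=2)
        self.scale7 = nn.Conv2d(in_ch, 32, kernel_size=7, padding=3)
        self.fuse = nn.Sequential(                    # merge the multi-scale features
            nn.ReLU(inplace=True),
            nn.Conv2d(96, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, ms_bands, kernel_size=3, padding=1),  # fused image, MS band count
        )

    def forward(self, pan, ms_up):
        # pan: (N, 1, H, W); ms_up: (N, B, H, W) MS image upsampled to the Pan grid
        x = torch.cat([pan, ms_up], dim=1)
        feats = torch.cat([self.scale3(x), self.scale5(x), self.scale7(x)], dim=1)
        return self.fuse(feats)
```

During training (step 2 of the abstract), such a network would be fitted against reference images with a standard regression loss (for example L1 or L2); the learned multi-scale fusion replaces the hand-crafted rules criticized above.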



Abstract

The invention provides a multi-scale remote sensing image fusion method based on a convolutional neural network. The method comprises the following steps: first, a multi-scale convolutional neural network fusion model conforming to the characteristics of remote sensing image fusion is constructed by using the properties of the convolutional neural network, where the input is the images to be fused and the output is the fused image; second, a suitable training dataset is constructed and the fusion model is trained on this dataset; third, the panchromatic (Pan) image is converted into the image to be fused that the model requires; fourth, the converted approximate Pan image and the multispectral (MS) image are input into the trained fusion model to obtain the final fused image. The method learns an adaptive multi-scale fusion function from a large amount of data; the function is not designed by hand but obtained by statistical learning, which is more reasonable. Experimental results show that the multi-scale fusion method based on the convolutional neural network can process remote sensing images from different satellites and different bands.
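To make the four abstract steps concrete, the sketch below strings them together for inference, reusing the hypothetical MultiScaleFusionNet from the embodiment sketch and assuming the model has already been trained (step 2). How the Pan image is converted into the "approximate Pan image" required by the model is not detailed in this excerpt, so convert_pan() is a hypothetical placeholder (simple moment matching of Pan to the MS intensity).

```python
# Hedged end-to-end inference sketch; convert_pan() is an assumed placeholder,
# not the conversion actually claimed by the patent.
import torch
import torch.nn.functional as F

def convert_pan(pan, ms_up, eps=1e-6):
    # hypothetical step 3: match the Pan mean/std to the MS intensity component
    intensity = ms_up.mean(dim=1, keepdim=True)
    pan_norm = (pan - pan.mean()) / (pan.std() + eps)
    return pan_norm * intensity.std() + intensity.mean()

def fuse(model, pan, ms):
    # pan: (N, 1, H, W); ms: (N, B, h, w) with h < H, w < W
    ms_up = F.interpolate(ms, size=pan.shape[-2:], mode='bicubic', align_corners=False)
    approx_pan = convert_pan(pan, ms_up)       # step 3: image to be fused
    with torch.no_grad():
        return model(approx_pan, ms_up)        # step 4: final fused image
```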

Description

Technical field

[0001] The present invention belongs to the field of remote sensing image fusion. Specifically, it relates to a fusion method capable of fusing a panchromatic (Pan) image and a multispectral (MS) image so that the fused image has both high spectral resolution and high spatial resolution.

Background technique

[0002] In recent years, remote sensing images have been widely used in applications such as environmental management and monitoring, geological hazard prevention, precision agriculture, and national defense security. Due to the limitations of satellite sensors, only a multispectral (MS) image with high spectral resolution and a panchromatic (Pan) image with high spatial resolution can be obtained separately. In practical applications, however, both kinds of information are needed at the same time: high spectral resolution is used for accurate classification of ground features, while high spatial resolution is used to describe their shape and texture. T...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
CPC: G06F18/251
Inventors: 张小利, 李雄飞, 叶发杰, 于爽, 王婧, 骆实, 朱芮
Owner: JILIN UNIV