Multispectral and panchromatic image fusion method based on a dense and skip-connection deep convolutional network

A multispectral and panchromatic image fusion technology in the field of remote sensing image processing, addressing the problem that existing fusion methods cannot accurately generate high-spatial-resolution multispectral images.

Pending Publication Date: 2019-08-30
NORTHWESTERN POLYTECHNICAL UNIV

Problems solved by technology

[0007] In order to overcome the problem that existing multispectral and panchromatic image fusion methods cannot accurately generate high-spatial-resolution multispectral images, the present invention provides a fusion method based on a dense and skip-connection deep convolutional network.



Embodiment Construction

[0074] The present invention will now be further described with reference to the embodiments and accompanying drawings:

[0075] The present invention provides a multispectral and panchromatic image fusion method based on a dense and skip-connected deep convolutional network. The method comprises two stages: model training and image fusion. In the model training stage, the original clear multispectral and panchromatic images are first down-sampled to obtain simulated training image pairs; next, the features of the simulated multispectral and panchromatic images are extracted, the features are fused using densely connected networks, and high-spatial-resolution multispectral images are reconstructed using skip connections; finally, the parameters of the model are adjusted using the Adam algorithm. In the image fusion stage, the features of the multispectral and panchromatic images are first extracted, the features are fused using a densely connected network, and, in combination with skip connections, a high-spatial-resolution multispectral image is reconstructed.
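The training-pair simulation described above (degrading the original clear images so that the originals can serve as ground truth, in the spirit of Wald's protocol) can be sketched as follows. The block-averaging filter, the 4x scale factor, and the image sizes are illustrative assumptions, not values prescribed by the patent:

```python
import numpy as np

def downsample(img, factor=4):
    """Simulate a low-resolution image by block averaging (assumed filter)."""
    h, w = img.shape[:2]
    h2, w2 = h // factor, w // factor
    img = img[:h2 * factor, :w2 * factor]  # crop to a multiple of the factor
    if img.ndim == 3:
        return img.reshape(h2, factor, w2, factor, -1).mean(axis=(1, 3))
    return img.reshape(h2, factor, w2, factor).mean(axis=(1, 3))

# Original clear images: a 4-band multispectral image and a single-band
# panchromatic image on a 4x finer grid (sizes are illustrative).
ms = np.random.rand(128, 128, 4)   # reference (label) multispectral image
pan = np.random.rand(512, 512)     # panchromatic image

# Simulated training pair: degraded inputs, with the original MS as target.
ms_lr = downsample(ms)    # (32, 32, 4)   network input 1
pan_lr = downsample(pan)  # (128, 128)    network input 2
```

Training then drives the network to map `(ms_lr, pan_lr)` back to `ms`, so that at fusion time the same network can sharpen the real low-resolution multispectral image.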


Abstract

The invention relates to a multispectral and panchromatic image fusion method based on a dense and skip-connection deep convolutional network. The method comprises two parts, model training and image fusion. In the model training stage, the original clear multispectral and panchromatic images are first down-sampled to obtain a simulated training image pair; next, the features of the simulated multispectral and panchromatic images are extracted, the features are fused using a densely connected network, and a high-spatial-resolution multispectral image is reconstructed using skip connections; finally, the parameters of the model are adjusted using the Adam algorithm. In the image fusion stage, the features of the multispectral and panchromatic images are first extracted, the features are fused using a densely connected network, and a high-spatial-resolution multispectral image is reconstructed in combination with skip connections. Two feature-extraction sub-networks are responsible for extracting the features of the input image pair, three densely connected networks are responsible for fusing the features, and the skip connections together with two transposed convolutions are responsible for reconstructing the high-spatial-resolution multispectral image.
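The dense-connection pattern named in the abstract (each layer receives the concatenation of all earlier feature maps, and a skip connection reuses early features at reconstruction time) can be illustrated with a minimal NumPy sketch. The layer count, channel widths, and the stand-in "convolution" are illustrative assumptions; a real implementation would use learned convolution kernels:

```python
import numpy as np

def conv_stub(x, out_channels):
    """Stand-in for a convolution: a fixed random channel projection.
    Only the channel bookkeeping matters for this illustration."""
    h, w, c = x.shape
    rng = np.random.default_rng(c)  # seeded for repeatability
    weights = rng.standard_normal((c, out_channels))
    return (x.reshape(-1, c) @ weights).reshape(h, w, out_channels)

def dense_block(x, num_layers=3, growth=8):
    """Each layer's input is the concatenation of ALL previous outputs."""
    feats = [x]
    for _ in range(num_layers):
        inp = np.concatenate(feats, axis=-1)  # dense connectivity
        feats.append(conv_stub(inp, growth))
    return np.concatenate(feats, axis=-1)

x = np.random.rand(16, 16, 32)        # fused input features (assumed width)
y = dense_block(x)                    # channels: 32 + 3 * 8 = 56
skip = np.concatenate([x, y], axis=-1)  # skip connection reuses early features
```

The channel count grows additively with each dense layer, which is why dense blocks can fuse multi-source features without discarding earlier representations; the skip concatenation then hands both shallow and deep features to the reconstruction stage.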

Description

technical field

[0001] The invention belongs to the field of remote sensing image processing, and in particular relates to a multispectral and panchromatic image fusion method based on a dense and skip-connection deep convolutional network.

Background technique

[0002] Remote sensing images have two important properties: spectral resolution and spatial resolution. Spectral resolution refers to the minimum wavelength range that the sensor can distinguish when receiving the spectrum radiated by the target. The narrower this wavelength range, the higher the spectral resolution and the stronger the sensor's ability to distinguish and identify each band of the spectrum; the greater the number of bands, the richer the spectral information of the resulting remote sensing image. Spatial resolution refers to the minimum distance between two adjacent features that can be identified in a remote sensing image: the smaller this minimum distance, the higher the spatial resolution.


Application Information

IPC(8): G06T5/50
CPC: G06T5/50; G06T2207/10036; G06T2207/10041; G06T2207/20081; G06T2207/20084; G06T2207/20221
Inventors: 李映, 王栋, 马力, 白宗文
Owner NORTHWESTERN POLYTECHNICAL UNIV