
Remote sensing image fusion method of multi-scale attention deep convolutional network based on 3D convolution

A deep-convolution and attention technique in the field of information technology, addressing the incomplete fusion and poor fusion quality of existing remote sensing image fusion methods.

Active Publication Date: 2021-05-18
NORTHWEST UNIV

Problems solved by technology

[0006] To reduce the image-processing workload and improve fusion accuracy by fully exploiting both the correlation between the pixels and bands of the multispectral image and the high spatial resolution of the panchromatic image, the purpose of the present invention is to provide a remote sensing image fusion method based on a deep-learning 3D multi-scale attention deep convolutional network (MSAC-Net). The method adopts 3D convolution so that the deep learning model preserves the spectral details of the multispectral image, uses an attention mechanism to extract spatial details from the panchromatic image, and learns the final result jointly with multiple intermediate-scale results to obtain the required high-resolution multispectral image, thereby solving the incomplete fusion and poor fusion quality of remote sensing image fusion in the existing technology.
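The key distinction drawn above is that 3D convolution slides the kernel along the spectral axis as well as the two spatial axes, so inter-band correlations contribute to every output feature. The following is a minimal, hypothetical numpy sketch of that idea (the patent's actual kernels, strides, and channel counts are not specified here; the 3×3×3 averaging kernel is an assumption for illustration):

```python
import numpy as np

def conv3d_valid(cube, kernel):
    """Naive 'valid' 3D convolution over a (bands, H, W) multispectral cube.

    Unlike per-band 2D convolution, the kernel also spans the spectral
    axis, so adjacent bands jointly contribute to each output value.
    """
    kb, kh, kw = kernel.shape
    b, h, w = cube.shape
    out = np.zeros((b - kb + 1, h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(cube[i:i + kb, j:j + kh, k:k + kw] * kernel)
    return out

# Toy 4-band 8x8 cube; a 3x3x3 kernel mixes three adjacent bands at once.
cube = np.random.rand(4, 8, 8)
kernel = np.ones((3, 3, 3)) / 27.0   # assumed averaging kernel, for illustration
feat = conv3d_valid(cube, kernel)
print(feat.shape)  # (2, 6, 6)
```

A real implementation would use a framework operator (e.g. a 3D convolution layer) with learned kernels and padding to keep the spectral dimension intact; the loop above only shows where the spectral mixing comes from.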

Method used



Examples


Embodiment

[0107] In this embodiment, remote sensing images from two satellites are used to verify the effectiveness of the proposed fusion algorithm. The spatial resolutions of the panchromatic and multispectral images captured by the IKONOS satellite are 1 m and 4 m respectively; those provided by the QuickBird satellite are 0.7 m and 2.8 m respectively. The multispectral images acquired by both satellites comprise four bands: red, green, blue, and near-infrared. The panchromatic images used in the experiments are 256×256 pixels, and the multispectral images are 64×64 pixels.
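Both sensor pairs above share the same 4:1 spatial-resolution ratio (IKONOS: 4 m vs 1 m; QuickBird: 2.8 m vs 0.7 m), which is why a 64×64 multispectral patch corresponds to a 256×256 panchromatic patch. A small sanity check of that alignment (the nearest-neighbour upsampling here is purely illustrative, not the method's interpolation):

```python
import numpy as np

# 4-band (R, G, B, NIR) multispectral patch and matching panchromatic patch,
# sized as in the experiments described above.
ms = np.random.rand(4, 64, 64)
pan = np.random.rand(256, 256)

scale = pan.shape[0] // ms.shape[1]  # 256 // 64 = 4, matching the sensor ratio
ms_up = ms.repeat(scale, axis=1).repeat(scale, axis=2)  # nearest-neighbour upsample

print(scale)                          # 4
print(ms_up.shape)                    # (4, 256, 256)
print(ms_up.shape[1:] == pan.shape)   # True: MS grid now aligns with PAN grid
```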

[0108] To better evaluate the practicability of the remote sensing image fusion method (MSAC-Net) based on the 3D-convolution multi-scale attention deep convolutional network, this embodiment provides two types of experiments, which are respectively the simulated-image experi...


Abstract

The invention discloses a remote sensing image fusion method based on a 3D-convolution multi-scale attention deep convolutional network. The method fuses the high spectral resolution of a multispectral image with the high spatial resolution of a panchromatic image to obtain a multispectral image with both high spatial and high spectral resolution. A 3D multi-scale attention deep convolutional network model (MSAC-Net) is designed on a U-Net framework from deep learning. To retain the spectral resolution of the multispectral image, the whole model uses 3D convolution and extracts features along the spectral dimension; to capture more spatial detail, an attention mechanism is introduced at the model's skip connections to learn regional details. In the decoding stage, multiple reconstruction layers containing multi-scale spatial information compute intermediate reconstruction results, encouraging the model to learn multi-scale representations at different levels and providing multi-level references for the final fusion result. The fusion result of the final image is thereby effectively improved.
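The attention mechanism at the skip connections can be pictured as a gate that reweights encoder features with a per-pixel mask in (0, 1) before they reach the decoder. The sketch below is a deliberately simplified numpy version under assumed scalar weights (`w_s`, `w_g`, `w_a`); attention gates in practice use learned 1×1 convolutions in place of those scalars, and the patent's exact formulation may differ:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(skip, gate, w_s, w_g, w_a):
    """Additive attention gate at a skip connection (illustrative sketch).

    skip: encoder feature map (C, H, W); gate: decoder gating signal (C, H, W).
    Produces a (1, H, W) mask in (0, 1) that reweights the skip features
    before they are passed on to the decoder.
    """
    q = np.maximum(0.0, w_s * skip + w_g * gate)          # ReLU of combined signal
    mask = sigmoid(w_a * q.sum(axis=0, keepdims=True))    # per-pixel attention mask
    return skip * mask                                    # attended skip features

skip = np.random.rand(8, 16, 16)
gate = np.random.rand(8, 16, 16)
out = attention_gate(skip, gate, 0.5, 0.5, 1.0)
print(out.shape)                   # (8, 16, 16)
print(bool(np.all(out <= skip)))   # True: a mask in (0, 1) only attenuates
```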

Description

technical field

[0001] The invention belongs to the field of information technology and relates to image processing, in particular to a remote sensing image fusion method based on a 3D-convolution multi-scale attention deep convolutional network.

Background technique

[0002] Remote sensing satellites can acquire panchromatic (PAN) images of the same scene while taking multispectral (MS) images. Multispectral images are rich in spectral information but have low spatial resolution and poor clarity, while panchromatic images have high spatial resolution but low spectral resolution; the spatial and spectral resolutions of the two trade off against each other. Combining the advantages of both to obtain a multispectral image with high spatial and high spectral resolution is therefore in great demand.

[0003] At present, deep learning has been widely applied in various research fields, providing new solutions to many problems. Among them, in the field of deep learnin...

Claims


Application Information

IPC (8): G06T5/50, G06T5/00, G06T3/40, G06N3/08, G06N3/04
CPC: G06T5/50, G06T3/4023, G06T3/4053, G06N3/08, G06T2207/10036, G06T2207/20221, G06N3/045, G06T5/70
Inventors: 彭进业, 付毅豪, 张二磊, 王珺, 刘璐, 俞凯, 祝轩, 赵万青, 何林青
Owner: NORTHWEST UNIV