A remote sensing image fusion method based on a dual-channel neural network

A remote sensing image fusion method based on a dual-channel neural network, applied in the field of remote sensing image fusion, solves the problems of low fusion efficiency and low fusion quality in remote sensing images, and achieves improved spatial-information preservation, accurate detail reconstruction, and a comprehensive fusion effect.

Active Publication Date: 2019-06-14
NORTHWEST UNIV
Cites: 8 · Cited by: 16

AI Technical Summary

Problems solved by technology

[0005] In order to make full use of the advantages of deep learning, as well as the structural characteristics and correlations within and between images, and to improve the accuracy of image fusion, the present invention proposes a remote sensing image fusion method based on a dual-channel neural network (DCCNN). The method adopts the ARSIS concept and extracts the spatial detail information from the high-frequency component of the high-pass-filtered panchromatic image.
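The ARSIS-style detail-extraction step described above can be sketched as follows. This is a minimal illustration, not the patent's network: it assumes a Gaussian blur as the low-pass filter (the patent does not fix this choice), and the function name and `sigma` value are hypothetical.

```python
import numpy as np
from scipy import ndimage

def highpass_details(pan, sigma=2.0):
    # ARSIS-style extraction: the high-frequency component is the image
    # minus a low-pass version of itself. The Gaussian filter and sigma
    # here are illustrative stand-ins for the patent's high-pass step.
    lowpass = ndimage.gaussian_filter(pan.astype(np.float64), sigma=sigma)
    return pan - lowpass

pan = np.random.rand(256, 256)   # panchromatic image, size used in the experiments
details = highpass_details(pan)
print(details.shape)             # (256, 256)
```

The resulting detail map has roughly zero mean, since subtracting the low-pass component removes the image's slowly varying brightness and keeps only edges and texture.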



Examples


[0064] Example

[0065] The present invention uses two types of satellite remote sensing images to verify the effectiveness of the proposed fusion algorithm. The IKONOS satellite captures panchromatic and multispectral images with spatial resolutions of 1 m and 4 m, respectively; the QuickBird satellite provides panchromatic and multispectral images with spatial resolutions of 0.7 m and 2.8 m, respectively. The multispectral images from both satellites contain four bands: red, green, blue, and near-infrared. The panchromatic images used in the experiments are 256×256 pixels, and the multispectral images are 64×64 pixels.
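Both satellites have a 4:1 spatial-resolution ratio between the multispectral and panchromatic products (IKONOS 4 m vs. 1 m, QuickBird 2.8 m vs. 0.7 m), so the 64×64 multispectral image is upsampled 4× to match the 256×256 panchromatic image before fusion. A shape-level sketch, with nearest-neighbour repetition as an illustrative stand-in (the patent does not specify the interpolator):

```python
import numpy as np

pan = np.zeros((256, 256))      # panchromatic image
ms = np.zeros((64, 64, 4))      # multispectral bands: red, green, blue, near-infrared

# 4x nearest-neighbour upsampling so the MS grid matches the PAN grid.
ms_up = np.repeat(np.repeat(ms, 4, axis=0), 4, axis=1)
print(ms_up.shape)              # (256, 256, 4)
```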

[0066] In order to better evaluate the practicability of the fusion method, the present invention provides two types of experiments: a simulated image experiment and an actual image experiment. The simulated image experiment reduces the spatial resolution of the panchromatic...
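The simulated experiment described above degrades the source images so that the original multispectral image can serve as a ground-truth reference. A minimal sketch, assuming block-averaging as the degradation filter (one simple low-pass-and-decimate choice; the patent's exact filter is not reproduced here):

```python
import numpy as np

def degrade(img, factor=4):
    # Block-average downsampling: average each factor x factor block.
    # This stands in for the (unspecified) low-pass + decimation step.
    h, w = img.shape[0], img.shape[1]
    assert h % factor == 0 and w % factor == 0
    blocks = img.reshape(h // factor, factor, w // factor, factor, *img.shape[2:])
    return blocks.mean(axis=(1, 3))

pan = np.random.rand(256, 256)
ms = np.random.rand(64, 64, 4)
pan_lr = degrade(pan)   # (64, 64): degraded PAN for the simulated experiment
ms_lr = degrade(ms)     # (16, 16, 4): degraded MS; the original MS is the reference
print(pan_lr.shape, ms_lr.shape)
```

Fusing `pan_lr` with `ms_lr` then yields a 64×64 result that can be compared directly against the original multispectral image.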


Abstract

The invention discloses a remote sensing image fusion method based on a dual-channel neural network. The method first uses the ARSIS concept: through the joint learning of a dual-channel network, it extracts the spatial detail information contained in the high-frequency component obtained by high-pass filtering the panchromatic image. It then injects this spatial detail information into each band image of the multispectral image using a detail injection model, obtaining the required high-resolution multispectral image. The method effectively exploits the advantages of deep learning: the spatial detail information of the panchromatic image that distinguishes it from the multispectral band images is obtained through joint training of the dual-channel network, and the correlations within and between the images are fully utilized, so that detail reconstruction is more accurate. Meanwhile, the detail injection model effectively controls the amount of injected detail, so that the spatial-information preservation of the fusion algorithm is remarkably improved and the spectral characteristics of the original multispectral image are well maintained.
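The detail-injection step in the abstract can be sketched as below. This is an illustrative simplification: the per-band injection gains and the function name are hypothetical, and the patent's detail injection model may compute these gains differently.

```python
import numpy as np

def inject_details(ms_up, details, gains):
    # Detail injection: each upsampled MS band receives a weighted copy of
    # the spatial detail map. The gains control how much detail enters each
    # band (illustrative; not the patent's exact gain model).
    fused = np.empty_like(ms_up)
    for b in range(ms_up.shape[-1]):
        fused[..., b] = ms_up[..., b] + gains[b] * details
    return fused

ms_up = np.random.rand(256, 256, 4)   # upsampled multispectral bands
details = np.random.rand(256, 256)    # network-predicted high-frequency detail
fused = inject_details(ms_up, details, gains=[0.9, 0.9, 0.9, 0.8])
print(fused.shape)                    # (256, 256, 4)
```

Keeping the gains below 1 limits the amount of injected detail, which is how injection-style methods trade spatial sharpness against preservation of the original spectral characteristics.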

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a remote sensing image fusion method based on a dual-channel neural network.

Background

[0002] With the development of remote sensing technology, remote sensing image data obtained by various satellite sensors provide abundant resources for human observation of the earth and have made important contributions to a deeper understanding of the world. For remote sensing images, spatial and spectral resolution contradict each other: panchromatic images have high spatial resolution but little spectral information and cannot display the color of ground features, while multispectral images have rich spectral information but lower spatial resolution. By fusing a panchromatic image with a multispectral image, a multispectral image with both high spatial and high spectral resolution can be obtained to meet practical needs.

[0003] At present, remote sensing...

Claims


Application Information

IPC(8): G06T3/40
Inventors: 彭进业, 刘璐, 王珺, 阎昆, 吴振国, 章勇勤, 张二磊, 罗迒哉, 祝轩, 李展, 艾娜
Owner NORTHWEST UNIV