Unsupervised Hyperspectral Image Blind Fusion Method and System Based on Space-Spectrum Joint Residual Correction Network

A technology relating to hyperspectral and multispectral images, applied in the field of unsupervised blind fusion of hyperspectral images. It addresses problems such as the high ill-posedness of the fusion task, the difficulty of directly obtaining high-spatial-resolution hyperspectral images, and the deviation of fusion results from the true hyperspectral image, and achieves good fusion results with improved accuracy.

Active Publication Date: 2022-04-15
NANJING UNIV OF SCI & TECH


Problems solved by technology

[0004] However, hyperspectral image fusion based on deep learning still has the following problems: (1) The spatial downsampling operator and the spectral response matrix depend on the imaging device, and many super-resolution methods assume these two operators are known by simulating the imaging device's information; the larger the error between the assumed operators and the actual parameters, the more the accuracy of the fusion result deviates from the real hyperspectral image. (2) Imaging devices often reduce the resolution of one dimension to improve the resolution of the other, which means it is difficult to directly obtain high-spatial-resolution hyperspectral images, so building a supervised training model does not meet actual needs. (3) With unknown spatial degradation operators and spectral response matrices, unsupervised fusion is a highly ill-posed problem: unsupervised training requires a loss function that does not use real high-spatial-resolution hyperspectral images and can only evaluate image quality indirectly, for example by comparing the degraded output with the known low-resolution observations. How to use limited low-resolution data to train an unsupervised blind fusion network and design an effective loss function is therefore a difficulty.
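For illustration only, the sketch below (not taken from the patent; the array shapes, function names, and operator forms are assumptions) shows the two degradation operators that problem (1) refers to: a point-spread-function blur plus subsampling for the spatial dimension, and a response matrix R for the spectral dimension. When the assumed kernel or R differs from the real sensor parameters, every observation simulated from them inherits that error.

```python
# Illustrative only: the two degradation operators referred to in problem (1).
# Shapes, names, and operator forms are assumptions, not taken from the patent.
import numpy as np
from scipy.ndimage import convolve

def spatial_degrade(hr_hsi: np.ndarray, kernel: np.ndarray, ratio: int) -> np.ndarray:
    """Blur every band with a point-spread-function kernel, then subsample by `ratio`."""
    blurred = np.stack([convolve(band, kernel, mode="reflect") for band in hr_hsi])
    return blurred[:, ::ratio, ::ratio]            # simulated low-spatial-resolution HSI

def spectral_degrade(hr_hsi: np.ndarray, response: np.ndarray) -> np.ndarray:
    """Project the spectral axis with a response matrix R of shape (msi_bands, hsi_bands)."""
    return np.tensordot(response, hr_hsi, axes=1)  # simulated high-resolution MSI

# If the kernel or R assumed here differs from the real sensor parameters, the
# simulated observations (and any loss built on them) inherit that error.
```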




Embodiment Construction

[0020] In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0021] As shown in Figure 1, the unsupervised hyperspectral image blind fusion method based on a space-spectrum joint residual correction network of the present invention includes the following steps:

[0022] (1) Establish spatial degradation and spectral degradation models based on hyperspectral data:

[0023] Step 1. Obtain...
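The patent text is truncated at this point. As a rough orientation only, the following PyTorch sketch shows one common way such learnable spatial and spectral degradation modules are built in blind fusion work: a strided depthwise convolution standing in for the unknown blur-plus-downsampling, and a 1x1 convolution standing in for the unknown spectral response matrix. The layer choices, class names, and hyperparameters are assumptions, not the patented architecture.

```python
# Minimal sketch of learnable degradation modules (assumptions, not the patented
# architecture): a strided depthwise convolution for the unknown spatial blur and
# downsampling, a 1x1 convolution for the unknown spectral response matrix.
import torch
import torch.nn as nn

class SpatialDegradation(nn.Module):
    def __init__(self, hsi_bands: int, kernel_size: int = 9, ratio: int = 8):
        super().__init__()
        # One learnable blur kernel per band (groups=hsi_bands); stride = downsampling ratio.
        self.blur_down = nn.Conv2d(hsi_bands, hsi_bands, kernel_size,
                                   stride=ratio, padding=kernel_size // 2,
                                   groups=hsi_bands, bias=False)

    def forward(self, hr_hsi: torch.Tensor) -> torch.Tensor:
        return self.blur_down(hr_hsi)              # (N, hsi_bands, H/ratio, W/ratio)

class SpectralDegradation(nn.Module):
    def __init__(self, hsi_bands: int, msi_bands: int):
        super().__init__()
        # A 1x1 convolution plays the role of the spectral response matrix R.
        self.response = nn.Conv2d(hsi_bands, msi_bands, kernel_size=1, bias=False)

    def forward(self, hr_hsi: torch.Tensor) -> torch.Tensor:
        return self.response(hr_hsi)               # (N, msi_bands, H, W)
```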



Abstract

The invention discloses an unsupervised hyperspectral image blind fusion method and system based on a space-spectrum joint residual correction network. The method includes: establishing a degradation network structure for the hyperspectral image to simulate the process of spatial and spectral downsampling; establishing a spatial and spectral residual fusion network model, using the difference between the low-resolution results produced by the degradation model and the training data as the input to the fusion network, that is, fusing the residuals in the spatial and spectral dimensions to obtain a residual map corresponding to the input data; and correcting the initialization data, then feeding the corrected result back into the degradation network and the space-spectrum joint correction network for multiple iterations to improve the accuracy of the fusion result. The present invention uses a space-spectrum joint correction network suitable for blind fusion of unsupervised hyperspectral images, and this network can obtain an error map between an input hyperspectral image and the ground truth.
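Read schematically, the abstract describes an iterative degrade-compare-correct loop. The sketch below is one possible reading of that loop, assuming PyTorch-style callables; the function names, the initialization by upsampling, and the number of iterations are assumptions rather than details taken from the claims.

```python
# Schematic reading of the abstract's iteration (names and iteration count are
# assumptions, not details from the patent claims).
import torch

def blind_fusion(lr_hsi: torch.Tensor, hr_msi: torch.Tensor,
                 degrade_spatial, degrade_spectral,
                 correction_net, upsample, n_iters: int = 5) -> torch.Tensor:
    # Initialize the fused estimate, e.g. by spatially upsampling the LR-HSI.
    estimate = upsample(lr_hsi)                    # (N, hsi_bands, H, W)
    for _ in range(n_iters):
        # Degrade the current estimate back into the two observed domains.
        sim_lr_hsi = degrade_spatial(estimate)     # simulated LR-HSI
        sim_hr_msi = degrade_spectral(estimate)    # simulated HR-MSI
        # Spatial and spectral residuals with respect to the training data.
        res_spatial = lr_hsi - sim_lr_hsi          # low spatial resolution
        res_spectral = hr_msi - sim_hr_msi         # high spatial resolution
        # The correction network maps the two residuals to an error map in the
        # high-resolution hyperspectral domain (any resampling happens inside it).
        error_map = correction_net(res_spatial, res_spectral)
        estimate = estimate + error_map            # correct the estimate
    return estimate
```

The design point the loop illustrates is that only the two observed low-resolution or low-spectral inputs are needed to drive the correction, which is what makes unsupervised training possible.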

Description

Technical Field

[0001] The invention belongs to the technical field of remote sensing image processing, and in particular relates to an unsupervised hyperspectral image blind fusion method and system based on a space-spectrum joint residual correction network.

Background Technique

[0002] Hyperspectral image fusion is an important application direction in the field of hyperspectral remote sensing. Hyperspectral image fusion uses the rich spectral information in low-spatial-resolution hyperspectral images and the rich spatial information in high-spatial-resolution multispectral images to synthesize high-spatial-resolution hyperspectral image data, which can provide high-quality training sets for subsequent, more complex image processing tasks. Due to the limitations of existing sensor hardware, it is difficult to directly acquire images with both high spatial resolution and high spectral resolution. Therefore, the collected data can be post-processed thr...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06V20/13; G06V10/774; G06V10/80; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/04; G06N3/088; G06F18/214; G06F18/253
Inventor: 徐洋, 王婷婷, 吴泽彬, 韦志辉
Owner: NANJING UNIV OF SCI & TECH