
Multi-hyperspectral image fusion method guided by low-rank prior and spatial spectrum information

A hyperspectral and multispectral image fusion technology, applied to neural learning methods, image enhancement, image data processing, etc. It addresses the problems that existing methods easily ignore the prior characteristics of hyperspectral images, lack physical interpretability, and lack spatial-spectral guidance; it avoids mutual interference between the features of the two inputs, reduces spatial-spectral distortion, and improves fusion accuracy.

Pending Publication Date: 2022-08-05
WUHAN UNIV

AI Technical Summary

Problems solved by technology

[0008] In deep-learning-based fusion methods, the hyperspectral and multispectral images are usually stacked together as the network input, which makes it difficult to preserve and extract the distinct characteristics of each image and increases the difficulty of fusion.
At the same time, these methods lack effective spatial-spectral guidance and suffer from serious spatial-spectral distortion.
Furthermore, the entire fusion process lacks physical interpretability, and the inherent prior characteristics of hyperspectral images are easily ignored, so the fused images do not necessarily meet the needs of real applications.
[0009] It can be seen that an ideal method for fusing hyperspectral and multispectral images has not yet appeared.



Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0081] Step 1: Perform Gaussian filtering and downsampling on the given hyperspectral and multispectral images to generate the inputs for network training; the original hyperspectral image is used as the target image for computing the loss function.

[0082] In the present invention, when Gaussian filtering is performed on the original hyperspectral image, the filter kernel size is 5×5 and the standard deviation is 2. After filtering, downsampling is performed with a downsampling rate of 3 using bilinear sampling.

[0083] The size of the multispectral image in the embodiment is 2187×2187, including 4 bands, and the size of the hyperspectral image is 729×729, including 150 bands. In the specific implementation, the multispectral image and the hyperspectral image are subjected to Gaussian filtering with a filter kernel size of 5×5 and a standard deviation of 2. After filtering, bilinear downsampling with a downsampling rate of 3 is performed, and ...
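The degradation described above can be sketched as follows. This is a minimal illustration assuming NumPy/SciPy and the stated parameters (5×5 Gaussian kernel, standard deviation 2, bilinear downsampling at rate 3); the array and function names are hypothetical and this is not the patent's actual implementation.

```python
# Minimal sketch of the degradation step in [0081]-[0083]; names are illustrative.
import numpy as np
from scipy import ndimage

def degrade(image: np.ndarray, sigma: float = 2.0, kernel_radius: int = 2,
            rate: int = 3) -> np.ndarray:
    """Gaussian-blur each band (radius-2 kernel, i.e. 5x5, std 2), then
    downsample the spatial axes by `rate` with order-1 (bilinear) interpolation."""
    # Filter only the spatial axes (H, W); the band axis is left untouched.
    blurred = ndimage.gaussian_filter(
        image, sigma=(sigma, sigma, 0), truncate=kernel_radius / sigma)
    # Order-1 spline zoom approximates bilinear downsampling at rate 3.
    return ndimage.zoom(blurred, zoom=(1 / rate, 1 / rate, 1), order=1)

# Illustrative sizes from the embodiment: MSI 2187x2187x4, HSI 729x729x150.
msi = np.random.rand(2187, 2187, 4).astype(np.float32)
hsi = np.random.rand(729, 729, 150).astype(np.float32)
msi_lr = degrade(msi)   # 729 x 729 x 4   -> network training input
hsi_lr = degrade(hsi)   # 243 x 243 x 150 -> network training input
# The original `hsi` serves as the target image for the loss function.
```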



Abstract

The invention discloses a multi-hyperspectral image fusion method guided by low-rank prior and spatial-spectral information, and provides a brand-new multi-layer multi-branch fusion network, SSLRNet, that combines spatial-spectral guidance with a low-rank prior. The network first constructs a multi-layer multi-branch fusion sub-network (MLMB), which extracts features in multiple branches and performs multi-layer feature fusion to reconstruct a preliminary fusion image. It then constructs a fusion-image spatial-spectral correction sub-network based on spatial-spectral guidance: the preliminary fusion image generated by the MLMB is guided by the band-wise sum image of the multispectral image and the band-wise mean image of the hyperspectral image, thereby reducing spatial-spectral distortion. Finally, a fusion-image low-rank prior constraint sub-network based on a low-rank neural network is constructed and combined with the deep learning network, and low-rank decomposition is performed using the characteristics of the network so that the fusion result better meets real application requirements. The invention improves the fusion accuracy of the network and better meets real application requirements.
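The spatial-spectral guidance described in the abstract can be sketched roughly as below. This is a minimal illustration, assuming PyTorch tensors in NCHW layout; the tensor names and the way the guides are attached to the preliminary fusion image are assumptions for illustration, not the patent's actual correction sub-network.

```python
# Sketch of building the two guidance images: the band-wise sum of the
# multispectral image and the band-wise mean of the hyperspectral image.
import torch
import torch.nn.functional as F

def spatial_spectral_guidance(fusion: torch.Tensor,
                              msi: torch.Tensor,
                              hsi: torch.Tensor) -> torch.Tensor:
    """fusion: N x C x H x W preliminary fusion image from the MLMB;
    msi: N x c x H x W multispectral image; hsi: N x C x h x w hyperspectral image."""
    spatial_guide = msi.sum(dim=1, keepdim=True)       # band superposition sum, N x 1 x H x W
    spectral_guide = hsi.mean(dim=1, keepdim=True)     # band average, N x 1 x h x w
    # Upsample the coarse spectral guide to the fusion resolution.
    spectral_guide = F.interpolate(spectral_guide, size=fusion.shape[-2:],
                                   mode='bilinear', align_corners=False)
    # Concatenate guides with the preliminary fusion image; a correction
    # sub-network would then map this stack back to the fused band count.
    return torch.cat([fusion, spatial_guide, spectral_guide], dim=1)
```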

Description

technical field
[0001] The invention relates to the field of hyperspectral and multispectral image fusion, and in particular to a method that reduces the spatial and spectral distortion generated in the fusion process through spatial-spectral guidance and embeds low-rank priors in a neural network so that the fused image better matches the needs of practical applications. The method completes the mapping from low-resolution images to high-resolution images in a data-driven manner and realizes the effective fusion of hyperspectral and multispectral images.
Background technique
[0002] Hyperspectral images usually have dozens or hundreds of narrow spectral bands, which means they have high spectral resolution. Using this characteristic, very fine spectral curves can be reconstructed and different types of ground objects and features can be identified, so hyperspectral images are widely used in a series of tasks such as image classificat...
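One common way to express a low-rank prior on a fused hyperspectral image is a nuclear-norm penalty on its band-by-pixel unfolding; the patent instead builds a dedicated low-rank sub-network, so the sketch below only illustrates the underlying idea (assumes PyTorch, NCHW tensors; names are hypothetical).

```python
# Illustrative low-rank prior: mean nuclear norm of each image's band x pixel matrix.
import torch

def nuclear_norm_lowrank_penalty(fused: torch.Tensor) -> torch.Tensor:
    """fused: N x C x H x W. Unfold each image to a C x (H*W) matrix and
    return the mean sum of singular values over the batch."""
    n, c, h, w = fused.shape
    mat = fused.reshape(n, c, h * w)
    return torch.linalg.svdvals(mat).sum(dim=-1).mean()
```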


Application Information

IPC(8): G06T5/50; G06V10/58; G06V10/77; G06V10/80; G06V10/82; G06T3/40; G06N3/04; G06N3/08
CPC: G06T5/50; G06T3/4053; G06V10/806; G06V10/7715; G06V10/82; G06V10/58; G06N3/08; G06T2207/10036; G06T2207/20081; G06T2207/20084; G06T2207/20221; G06N3/045; Y02A40/10
Inventor: 张洪艳, 王文高, 曹伟男, 杨光义, 张良培
Owner WUHAN UNIV