
Remote Sensing Image Fusion Method Based on Sparse Tensor Nearest Neighbor Embedding

A remote sensing image fusion method based on sparse representation, applied in the field of remote sensing image fusion. It addresses the problems of existing methods that fail to fully exploit the information shared between spectral bands, ignore high-order statistical characteristics, and produce spectrally distorted fusion results, thereby reducing spectral and color distortion, enhancing spatial resolution, and improving fusion performance.

Active Publication Date: 2019-06-21
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0004] Model-based fusion algorithms still have deficiencies in preserving spectral information: 1) they ignore the correlation between bands, which leads to spectral distortion in the fusion results; 2) they ignore the high-order statistical properties of the data. Because such algorithms fuse the multispectral image band by band, they fail to fully exploit the information shared between bands, resulting in color distortion and spectral distortion in the fusion results.

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more


Examples


Specific Embodiments

[0041] Referring to Figure 1, a specific embodiment of the present invention is as follows:

[0042] Step 1: Input the low-resolution multispectral image M and the high-resolution panchromatic image P.

[0043] (1.1) In this embodiment, the input low-resolution multispectral image M has a size of 64×64×4 and a spatial resolution of 2 m; the high-resolution panchromatic image P has a size of 256×256 and a spatial resolution of 0.5 m;

[0044] (1.2) Upsample the low-resolution multispectral image M to the same spatial size as the high-resolution panchromatic image P to obtain the upsampled multispectral image M1. In this example, the size of M1 is 256×256×4.
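
A minimal sketch of step (1.2), under assumptions not stated in the patent: the interpolation kernel is not specified, so bicubic spline interpolation is used here, and the function name and toy data are illustrative only.

```python
import numpy as np
from scipy.ndimage import zoom

def upsample_multispectral(M, target_hw):
    """Upsample M of shape (h, w, bands) to (target_hw[0], target_hw[1], bands)."""
    h, w, _ = M.shape
    factors = (target_hw[0] / h, target_hw[1] / w, 1)  # leave the band axis unchanged
    return zoom(M, factors, order=3)  # order=3: cubic spline interpolation (assumed)

# Example with the sizes quoted in the embodiment (64x64x4 -> 256x256x4)
M = np.random.rand(64, 64, 4)            # placeholder low-resolution multispectral image
M1 = upsample_multispectral(M, (256, 256))
print(M1.shape)                          # (256, 256, 4)
```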

[0045] Step 2: From the high-resolution panchromatic image P, obtain a downsampled panchromatic image P1 and an upsampled panchromatic image P2.

[0046] (2.1) Downsample the high-resolution panchromatic image P to obtain a downsampled panchromatic image P1 with a size of 64×64;
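
A minimal sketch of step (2.1), assuming a downsampling filter the patent does not specify: simple averaging over non-overlapping 4×4 blocks is used here purely as an illustration, and the function name and toy data are hypothetical.

```python
import numpy as np

def downsample_panchromatic(P, factor=4):
    """Downsample a 2-D image by averaging non-overlapping factor x factor blocks."""
    h, w = P.shape
    return P.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

P = np.random.rand(256, 256)      # placeholder high-resolution panchromatic image
P1 = downsample_panchromatic(P)   # shape (64, 64), matching the embodiment
```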

...


Abstract

The invention discloses a remote sensing image fusion method based on sparse tensor nearest neighbor embedding. The method comprises the following steps: 1) input a low-resolution multispectral image M and a high-resolution panchromatic image P; 2) divide the low-resolution multispectral image M into multispectral image tensor blocks M1; 3) construct a high-resolution multi-mode dictionary HD and a low-resolution multi-mode dictionary LD from the high-resolution panchromatic image P and the multispectral image tensor blocks M1; 4) optimize the atoms of the high-resolution multi-mode dictionary HD and the low-resolution multi-mode dictionary LD; 5) solve the tensor sparse coefficient A of the multispectral image tensor blocks M1 over the low-resolution multi-mode dictionary LD; 6) multiply the tensor sparse coefficient A by the high-resolution multi-mode dictionary HD and apply spatial-domain residual compensation to obtain the high-resolution multispectral image. In this way, the information within the multispectral bands is fully utilized and the color distortion of the fused image is reduced. The method can be used for remote sensing and target identification.
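
A heavily simplified sketch of the coding-and-reconstruction idea summarized in steps 5) and 6): each low-resolution patch is coded over a low-resolution dictionary and the same coefficients are applied to the paired high-resolution dictionary (classic neighbor embedding). The patent's actual method operates on tensor blocks with multi-mode dictionaries and tensor sparse coefficients; dictionary construction, atom optimization, and the spatial-domain residual compensation are not reproduced here, and all names and data below are illustrative.

```python
import numpy as np

def neighbor_embedding_fusion(Y, LD, HD, k=5):
    """Y: (d_low, n) low-res patch vectors; LD: (d_low, m) and HD: (d_high, m) paired atoms."""
    d_high, _ = HD.shape
    X = np.zeros((d_high, Y.shape[1]))
    for i, y in enumerate(Y.T):
        # k nearest low-resolution atoms form the "neighborhood" of the patch
        idx = np.argsort(np.linalg.norm(LD - y[:, None], axis=0))[:k]
        # least-squares weights over the selected atoms (sparse by construction)
        w, *_ = np.linalg.lstsq(LD[:, idx], y, rcond=None)
        # apply the same weights to the paired high-resolution atoms
        X[:, i] = HD[:, idx] @ w
    return X

# Toy usage with random data (real atoms would come from P and the tensor blocks of M)
LD = np.random.rand(16, 200)     # 4x4 low-resolution atoms, vectorized
HD = np.random.rand(256, 200)    # 16x16 high-resolution atoms paired with LD
Y = np.random.rand(16, 10)       # 10 low-resolution patches to reconstruct
X = neighbor_embedding_fusion(Y, LD, HD)
print(X.shape)                   # (256, 10)
```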

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a remote sensing image fusion method, which can be used for remote sensing detection, safe navigation, medical image analysis, anti-terrorism inspection and environmental protection.

Background

[0002] Remote sensing image fusion is the process of combining image data and other information obtained by multiple remote sensors. One of the most widely studied cases is the fusion of multispectral images and panchromatic images. Multispectral images have rich color information but low spatial resolution, and generally consist of four bands. Panchromatic images have high spatial resolution and clear details but lack color information; they generally use a single band from 0.5 μm to 0.75 μm. In target recognition, both color information and spatial resolution play a very important role in distinguishing ground features.

[0003] Remote sensing image fusion is the process ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T5/50
CPC: G06T5/50; G06T2207/10032; G06T2207/20221
Inventors: 杨淑媛, 焦李成, 苏晓萌, 李红, 刘红英, 马晶晶, 刘芳, 侯彪, 马文萍, 张凯, 邢颖慧, 李倩兰
Owner: XIDIAN UNIV