A Hyperspectral Image Classification Method with Local-to-Global Context Information Extraction

An information extraction and image classification technology, applied in the field of remote sensing image processing. It addresses problems such as high computational cost, time-consuming prediction, and the difficulty of determining the spatial block size, and achieves the effect of reducing isolated, misclassified regions in classification results.

Active Publication Date: 2022-06-07
WUHAN UNIV

AI Technical Summary

Problems solved by technology

However, this method still faces the following problems: (1) only local spatial information can be used, which leads to isolated misclassified regions in the classification results; (2) the optimal spatial block size is difficult to determine, since it depends on the spatial resolution and the homogeneity of the ground-feature distribution; (3) the computational cost is high: when predicting the classification map, the method must extract spatial blocks pixel by pixel, which takes a very long time for relatively large images.



Examples


Embodiment 1

[0057] The present invention provides a hyperspectral image classification method that extracts context information from local to global scales, comprising the following steps:

[0058] Step 1: input the image to be classified, WHU-Hi-HongHu, as shown in Figure 1, and mirror-pad its spatial dimensions to multiples of 8.
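The padding in Step 1 can be sketched as follows. This is a minimal numpy sketch, not the patent's own code; the function name `mirror_pad_to_multiple` and the use of `reflect` mode are illustrative assumptions, and the commonly reported WHU-Hi-HongHu size of 940 × 475 pixels with 270 bands is used for the example.

```python
import numpy as np

def mirror_pad_to_multiple(image, multiple=8):
    """Mirror-pad the spatial dimensions (H, W) of an H x W x C cube
    so that both become multiples of `multiple` (here 8)."""
    h, w = image.shape[:2]
    pad_h = (-h) % multiple  # rows to add at the bottom
    pad_w = (-w) % multiple  # columns to add at the right
    # Reflect-pad only the bottom and right edges; bands are untouched.
    return np.pad(image, ((0, pad_h), (0, pad_w), (0, 0)), mode="reflect")

# Example: a 940 x 475 x 270 cube pads to 944 x 480 x 270.
x = np.zeros((940, 475, 270), dtype=np.float32)
print(mirror_pad_to_multiple(x).shape)  # (944, 480, 270)
```

Padding to a multiple of 8 keeps the spatial size divisible through the encoder's downsampling stages so the decoder can restore the original resolution exactly.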

[0059] Step 2: perform channel dimensionality reduction on the padded image, which further includes:

[0060] The padded image X is passed through a "convolutional layer - group normalization layer - nonlinear activation layer" structure that reduces the channel dimension and outputs the feature map F. Group normalization is used here because it accounts for the spectral continuity of hyperspectral imagery.
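The "convolution - group normalization - nonlinear activation" structure of Step 2 can be sketched in plain numpy. The kernel size (1 × 1), output channel count (64), and group count (8) are not given in the text and are illustrative assumptions here; the patent's actual layer configuration may differ.

```python
import numpy as np

def conv1x1_gn_relu(x, weight, num_groups=8, eps=1e-5):
    """Channel reduction: 1x1 convolution -> group normalization -> ReLU.
    x: (H, W, C_in); weight: (C_in, C_out). Group norm splits the output
    channels into `num_groups` groups and normalizes each group over the
    spatial axes, keeping spectrally adjacent channels on a common scale."""
    f = x @ weight                       # a 1x1 conv is a per-pixel linear map
    h, w, c = f.shape
    g = f.reshape(h, w, num_groups, c // num_groups)
    mean = g.mean(axis=(0, 1, 3), keepdims=True)
    var = g.var(axis=(0, 1, 3), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)  # normalize within each channel group
    return np.maximum(g.reshape(h, w, c), 0.0)  # ReLU activation

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 16, 270))       # toy patch with 270 bands
w = rng.normal(size=(270, 64)) * 0.1     # reduce 270 channels to 64
f = conv1x1_gn_relu(x, w)
print(f.shape)  # (16, 16, 64)
```

Grouping the channels (rather than normalizing each independently, as batch norm per channel would) preserves relative magnitudes among neighboring bands, which is the stated motivation for choosing group normalization.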

[0061] Step 3: use the Local Attention Module for local contextual information extraction, as shown in Figure 2. This step further includes:

[0062] Step 3.1: the channel reduction in Step 2 yields the feature map F, and the feature map F is inp...
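The text truncates here, so the exact form of the Local Attention Module is not recoverable from this excerpt. As a generic sketch of local attention over feature map F, the following restricts self-attention to non-overlapping 8 × 8 windows; the window size, the choice of Q = K = V = F (no learned projections), and the function name are all assumptions, not the patent's disclosed design.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def local_window_attention(f, win=8):
    """Self-attention restricted to non-overlapping win x win windows.
    f: (H, W, C) with H, W multiples of `win` (guaranteed by the Step 1
    padding). Each window is treated as an independent token sequence."""
    h, w, c = f.shape
    # Split into windows: (H/win, W/win, win*win, C)
    t = f.reshape(h // win, win, w // win, win, c).transpose(0, 2, 1, 3, 4)
    t = t.reshape(h // win, w // win, win * win, c)
    # Scaled dot-product attention within each window
    attn = softmax(t @ t.transpose(0, 1, 3, 2) / np.sqrt(c))
    out = attn @ t
    # Stitch the windows back into an (H, W, C) map
    out = out.reshape(h // win, w // win, win, win, c).transpose(0, 2, 1, 3, 4)
    return out.reshape(h, w, c)

f = np.random.default_rng(1).normal(size=(16, 16, 64))
print(local_window_attention(f).shape)  # (16, 16, 64)
```

Windowed attention keeps the cost linear in image size, which matches the patent's stated goal of fast whole-image prediction rather than pixel-by-pixel block extraction.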



Abstract

Aiming at hyperspectral images, the present invention discloses a deep learning classification method that extracts context information with a local-global attention mechanism. It mainly includes: a fully convolutional encoder-decoder classification framework that exploits global spatial-spectral information and achieves rapid classification; a local-to-global context-aware network architecture, designed in the encoder module in imitation of the human visual perception mechanism, for high-level semantic feature extraction that accounts for context; and a channel attention module, designed in the decoder module, for adaptive fusion of local and global information. The present invention can be applied to the fine classification of hyperspectral images with massive, high-dimensional, nonlinear data structures, greatly reduces isolated misclassified regions in classification results, and meets the requirements of real-time, rapid, and fine-grained classification and mapping.
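The decoder's adaptive local-global fusion via channel attention can be illustrated with a squeeze-and-excitation-style sketch. The patent does not disclose the module's exact structure in this excerpt, so the concatenate-squeeze-gate design, the hidden width, and all names below are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention_fusion(local_f, global_f, w1, w2):
    """Adaptive local-global fusion with channel attention: concatenate
    the two feature maps, squeeze spatially by global average pooling,
    pass the result through a small two-layer gate, and reweight the
    concatenated channels so the network can emphasize whichever branch
    is more informative per channel."""
    f = np.concatenate([local_f, global_f], axis=-1)    # (H, W, 2C)
    squeeze = f.mean(axis=(0, 1))                       # (2C,) global pooling
    gate = sigmoid(np.maximum(squeeze @ w1, 0.0) @ w2)  # (2C,) weights in (0, 1)
    return f * gate                                     # broadcast over H, W

rng = np.random.default_rng(2)
lf = rng.normal(size=(16, 16, 64))       # local-branch features
gf = rng.normal(size=(16, 16, 64))       # global-branch features
w1 = rng.normal(size=(128, 32)) * 0.1    # squeeze 2C -> 32
w2 = rng.normal(size=(32, 128)) * 0.1    # excite 32 -> 2C
fused = channel_attention_fusion(lf, gf, w1, w2)
print(fused.shape)  # (16, 16, 128)
```

Because the gate is computed from the features themselves, the balance between local and global information is learned per channel rather than fixed by a hand-tuned weight.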

Description

Technical field
[0001] The present invention belongs to the field of remote sensing image processing technology, and particularly relates to a deep learning hyperspectral classification method that extracts contextual information with a local-to-global attention mechanism.
Background
[0002] Classification has always been an important research area in hyperspectral remote sensing image processing and application, since rich spectral information can accurately identify the attribute categories of ground features. At present, with the development of hyperspectral imaging technology, spaceborne, airborne, and UAV-borne hyperspectral observation platforms provide rich data sources for the fine identification of ground features. However, the high correlation between hyperspectral image bands, the high degree of nonlinearity of the data, and the "spectral variation" of homogeneous objects make model-driven classification methods seriously challenged in th...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V20/13, G06V20/40, G06K9/62, G06V10/764, G06V10/80
CPC: G06V20/13, G06V20/41, G06F18/241, G06F18/253, Y02A40/10
Inventor: 钟燕飞 (Zhong Yanfei), 胡鑫 (Hu Xin), 王心宇 (Wang Xinyu)
Owner: WUHAN UNIV