A multi-scale spatial-spectral collaborative classification method for hyperspectral images

A hyperspectral image classification technology, applied in the fields of instruments, computing, and character and pattern recognition, which addresses the problems of target-information loss and low classification accuracy.

Active Publication Date: 2021-06-15
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to solve the problem in the prior art that classifying hyperspectral images using only spectral information or only spatial information easily causes loss of target information and leads to low classification accuracy. To this end, the present invention provides a multi-scale spatial-spectral collaborative classification method for hyperspectral images.

Method used


Examples


Specific Embodiment 1

[0030] Embodiment 1: A multi-scale spatial-spectral collaborative classification method for hyperspectral images according to this embodiment includes the following steps:

[0031] Step 1: Perform feature extraction on the original hyperspectral image H to obtain a spectral information set H_spec composed of a subset of bands; this band subset carries the spectral characteristics of the original hyperspectral image H;

[0032] Step 2: Extract multi-scale spatial information from the spectral information set H_spec to obtain multiple multi-scale spatial information data sets H_spet, each of which has the same dimensions as the spectral information set H_spec;

[0033] Step 3: Fuse the multiple multi-scale spatial information data sets H_spet with the spectral information set H_spec and perform a preliminary classification to obtain the preliminary classification result map Q_init;

[0034] Step 4: Decompose the preliminary classification result map Q_init into k probability result maps P;
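The fusion, preliminary classification, decomposition, and post-processing steps above (steps 3-5, with step 5 taken from the abstract) can be sketched as follows. This is a minimal numpy illustration, not the patent's disclosed implementation: the nearest-centroid classifier and the 3x3 majority-vote post-processing are stand-ins (the CPC codes suggest the patent actually uses an SVM-based classifier), and all names and parameters are hypothetical.

```python
import numpy as np

def fuse_and_classify(H_spec, spatial_sets, centroids):
    """Sketch of steps 3-5: stack spectral + multi-scale spatial features,
    classify each pixel (nearest-centroid stand-in), decompose the label
    map Q_init into k per-class maps P, then post-process with a 3x3
    majority vote to get the final map O_fin."""
    rows, cols, _ = H_spec.shape
    # step 3: feature fusion along the band axis
    fused = np.concatenate([H_spec] + spatial_sets, axis=2)
    X = fused.reshape(rows * cols, -1)
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    Q_init = d.argmin(1).reshape(rows, cols)     # preliminary result map
    k = centroids.shape[0]
    # step 4: decompose into k per-class (0/1 "probability") maps
    P = np.stack([(Q_init == c).astype(float) for c in range(k)])
    # step 5: 3x3 majority (mode) filter as a simple post-processing
    padded = np.pad(Q_init, 1, mode='edge')
    O_fin = np.empty_like(Q_init)
    for i in range(rows):
        for j in range(cols):
            win = padded[i:i + 3, j:j + 3].ravel()
            O_fin[i, j] = np.bincount(win, minlength=k).argmax()
    return Q_init, P, O_fin

# toy usage with random data (k = 3 classes, fused dimension 4 * 3 = 12)
rng = np.random.default_rng(0)
H_spec = rng.random((5, 5, 4))
spatial = [H_spec * 0.9, H_spec * 1.1]   # stand-in multi-scale sets
cents = rng.random((3, 12))
Q, P, O = fuse_and_classify(H_spec, spatial, cents)
```

Keeping the per-class maps P explicit mirrors the patent's pipeline, where post-processing operates on the decomposed maps rather than directly on Q_init.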

Specific Embodiment 2

[0041] Embodiment 2: This embodiment differs from the multi-scale spatial-spectral collaborative classification method of Embodiment 1 in the specific process of step 1, in which feature extraction is performed on the original hyperspectral image H to obtain the spectral information set H_spec composed of band subsets carrying the spectral characteristics of the original hyperspectral image H:

[0042] The steepest ascent method is used to extract features from the original hyperspectral image H, thereby obtaining the spectral information set H_spec composed of band subsets carrying the spectral characteristics of the original hyperspectral image H.

[0043] In this embodiment, feature extraction by the steepest ascent method reduces redundancy between bands and accomplishes dimensionality reduction.
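The excerpt names the steepest ascent (greedy hill-climbing) method but does not disclose its objective function. The sketch below assumes a stand-in low-redundancy criterion, greedily adding the band whose one-step gain is largest (here, lowest mean absolute correlation with already-selected bands); the function name and criterion are illustrative, not the patent's.

```python
import numpy as np

def steepest_ascent_band_selection(H, n_bands, seed_band=0):
    """Greedy (steepest-ascent) band-subset selection.

    H: hyperspectral cube of shape (rows, cols, bands).
    Criterion is a stand-in: at each step, add the band that is least
    correlated with the bands already selected, reducing redundancy.
    """
    rows, cols, bands = H.shape
    X = H.reshape(-1, bands).astype(np.float64)
    X = (X - X.mean(0)) / (X.std(0) + 1e-12)       # z-score each band
    corr = np.abs(np.corrcoef(X, rowvar=False))    # |correlation| matrix
    selected = [seed_band]
    while len(selected) < n_bands:
        best, best_score = None, -np.inf
        for b in range(bands):
            if b in selected:
                continue
            # steepest ascent: take the single move with the largest
            # gain in the (negated redundancy) criterion
            score = -corr[b, selected].mean()
            if score > best_score:
                best, best_score = b, score
        selected.append(best)
    return H[:, :, sorted(selected)]               # spectral set H_spec

# toy usage: keep 5 of 20 bands
rng = np.random.default_rng(0)
cube = rng.random((8, 8, 20))
H_spec = steepest_ascent_band_selection(cube, n_bands=5)
```

The greedy structure (evaluate every candidate move, commit to the best one) is what makes this a steepest-ascent search; only the scoring function would change under the patent's actual objective.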

Specific Embodiment 3

[0044] Embodiment 3: This embodiment differs from the multi-scale spatial-spectral collaborative classification method of Embodiment 1 in the specific process of step 2, in which multi-scale spatial information is extracted from the spectral information set H_spec to obtain multiple multi-scale spatial information data sets H_spet:

[0045] An adaptive bilateral preserving filter is applied to the spectral information set H_spec to extract multi-scale spatial information and obtain multiple multi-scale spatial information data sets H_spet.

[0046] In this embodiment, the adaptive bilateral preserving filter is used to avoid excessive information redundancy: multi-scale spatial information is extracted by adjusting the size of the filter window, which at the same time avoids a large amount of parameter-selection work, and finally realizes feature fusion of the multi-scale spatial information...
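The adaptive bilateral preserving filter is not specified in detail in this excerpt. A minimal sketch, assuming a standard (non-adaptive) brute-force bilateral filter applied band by band at several window sizes, which is one plausible reading of "adjusting the size of the filter window"; all parameter choices here are illustrative.

```python
import numpy as np

def bilateral_filter(band, half, sigma_s, sigma_r):
    """Brute-force bilateral filter on one 2-D band.
    half: half window size; sigma_s / sigma_r: spatial / range scales."""
    rows, cols = band.shape
    pad = np.pad(band, half, mode='reflect')
    out = np.empty((rows, cols), dtype=np.float64)
    # spatial Gaussian weights are fixed for a given window size
    ax = np.arange(-half, half + 1)
    sw = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma_s ** 2))
    for i in range(rows):
        for j in range(cols):
            patch = pad[i:i + 2 * half + 1, j:j + 2 * half + 1]
            # range weights depend on intensity difference (edge-preserving)
            rw = np.exp(-(patch - band[i, j]) ** 2 / (2 * sigma_r ** 2))
            w = sw * rw
            out[i, j] = (w * patch).sum() / w.sum()
    return out

def multiscale_spatial_sets(H_spec, halves=(1, 2, 3), sigma_r=0.1):
    """One filtered copy of H_spec per window size: the sets H_spet,
    each with the same dimensions as H_spec."""
    sets = []
    for half in halves:
        filtered = np.stack(
            [bilateral_filter(H_spec[:, :, b], half,
                              sigma_s=half, sigma_r=sigma_r)
             for b in range(H_spec.shape[2])], axis=2)
        sets.append(filtered)
    return sets

# toy usage: three scales over a small 3-band cube
rng = np.random.default_rng(0)
H_spec = rng.random((6, 6, 3))
H_spet = multiscale_spatial_sets(H_spec)
```

Varying only the window size (with the spatial scale tied to it) is what keeps the parameter-selection burden low, matching the motivation stated in [0046].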



Abstract

A multi-scale spatial-spectral collaborative classification method for hyperspectral images, relating to the field of hyperspectral image information processing. The method solves the problem that classifying hyperspectral images using only spectral information or only spatial information, as in the prior art, easily causes loss of target information and results in low classification accuracy. Step 1: Perform feature extraction on the original hyperspectral image H to obtain a spectral information set H_spec composed of band subsets. Step 2: Extract multi-scale spatial information from the spectral information set H_spec to obtain multiple multi-scale spatial information data sets H_spet. Step 3: Fuse the multiple multi-scale spatial information data sets H_spet with the spectral information set H_spec and perform a preliminary classification to obtain the preliminary classification result map Q_init. Step 4: Decompose the preliminary classification result map Q_init into k probability result maps P. Step 5: Post-process the k probability result maps P to obtain the final classification result map O_fin, completing the spatial-spectral collaborative classification of the hyperspectral image. The present invention is mainly used for spatial-spectral classification of hyperspectral images.

Description

technical field

[0001] The invention relates to the field of hyperspectral image information processing.

Background technique

[0002] Hyperspectral images have the characteristic of "space-spectrum integration" and contain rich spatial and spectral information. At present, many commonly used classifiers use only spectral information for classification and give little consideration to spatial information; however, ground targets in images often exhibit a certain spatial correlation and continuity.

[0003] The spatial information used in most existing research methods or experiments is single-scale, that is, it is usually extracted by a spatial filter with a single fixed window. Since ground targets vary in size, extracting spatial information with only a single-scale filter is likely to cause the loss of ground targets, especially small-scale target information, and these small-scale targets are often quite important. [0...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62
CPC: G06F18/2411; G06F18/253
Inventors: 张钧萍, 吴斯凡
Owner: HARBIN INST OF TECH