Hyperspectral image classification method and device and electronic equipment

A hyperspectral image classification technology, applied in the field of hyperspectral image classification methods, devices and electronic equipment. It addresses the problems that hyperspectral images cover large areas and contain unlabeled (unknown) categories, which reduces the accuracy of classification results, and achieves the effect of improving classification accuracy.

Active Publication Date: 2021-06-11
YUNNAN UNIV

AI Technical Summary

Problems solved by technology

In the related art, classification usually assumes that every pixel belongs to a completely known category, so each pixel of the hyperspectral image can be labeled and classified using the existing label values. However, because a hyperspectral image covers a large area, the existing label values usually cannot cover all the categories in the image, and there are inevitably unlabeled categories (unknown categories).
Existing classification methods usually assign these unknown classes to the label values of currently known classes, which inflates the estimated areas of the known classes and reduces the accuracy of the classification results.



Examples


Embodiment approach

[0073] (1) According to the reconstruction loss value, calculate the probability value that the reconstruction loss value is greater than the preset loss threshold through the pre-established probability model of the reconstruction loss value;

[0074] The above probability model includes:

[0075]

[0076] In the above formula, G_{ξ,u}(v) represents the probability model; v represents the reconstruction loss value; ξ represents the shape parameter; and u represents the scale parameter.
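The formula referenced in paragraph [0075] is not rendered in this excerpt. As a hedged sketch only, a tail model consistent with the symbols defined in [0076] and the conditional-exceedance interpretation in [0077] is the generalized Pareto form

G_{\xi,u}(v) = \left(1 + \xi \, \frac{v - t}{u}\right)^{-1/\xi}, \qquad v > t, \; \xi \neq 0,

where t denotes the preset loss threshold; this form is an assumption chosen to match the surrounding definitions, not necessarily the exact expression used in the patent.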

[0077] The above probability model represents the conditional probability that the reconstruction loss function value is greater than the preset loss threshold; specifically, it can be obtained in the following way:

[0078] Obtain multiple reconstruction loss values produced during training and testing, and create a histogram of these reconstruction loss values, as shown in the distribution histogram of the reconstruction loss values in Figure 6, where (a) in the figure represents the hist...
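As a hedged illustration of how such a tail model might be fitted to the collected reconstruction loss values, the following Python sketch uses scipy.stats.genpareto; the quantile-based threshold, the synthetic data, and all names here are assumptions made for the example, not details taken from the patent.

import numpy as np
from scipy.stats import genpareto

# reconstruction loss values collected during training and testing (synthetic placeholder data)
losses = np.abs(np.random.randn(10000))

# preset loss threshold, chosen here as a high quantile of the observed losses
threshold = np.quantile(losses, 0.95)

# peaks-over-threshold: fit a generalized Pareto distribution to the excesses above the threshold
excesses = losses[losses > threshold] - threshold
shape, _, scale = genpareto.fit(excesses, floc=0.0)   # shape plays the role of xi, scale of u

def tail_probability(v):
    # survival function of the fitted model: probability of a loss at least this far
    # above the preset threshold, given that the threshold is exceeded
    return genpareto.sf(v - threshold, shape, loc=0.0, scale=scale)

# example query: how extreme a loss twice the threshold is under the fitted tail model
print(tail_probability(2.0 * threshold))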



Abstract

The invention provides a hyperspectral image classification method and device and electronic equipment. The method comprises the steps of: inputting a target hyperspectral image into a pre-trained feature extraction network model and outputting the image features of the target hyperspectral image; determining an initial category of each pixel point in the target hyperspectral image based on the image features; inputting the image features into a pre-trained image reconstruction network model and outputting a reconstructed image of the image features; determining, based on the reconstructed image and the target hyperspectral image, a reconstruction loss value of each pixel point in the reconstructed image; and determining a final category of the pixel points according to the reconstruction loss value and the initial category. In this manner, the extracted image features are reconstructed to restore the features of the target hyperspectral image as closely as possible and obtain a reconstructed image; the pixel points of unknown classes can then be determined from the reconstructed image and the target hyperspectral image, which improves the accuracy of the classification result.
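As a minimal sketch of the pipeline described in this abstract, the following Python/PyTorch fragment uses simple 1x1 convolutional networks, a per-pixel mean-squared reconstruction loss, and a fixed loss threshold; the layer sizes, module names, band count, and threshold are illustrative assumptions rather than the architecture actually claimed.

import torch
import torch.nn as nn

BANDS, NUM_KNOWN, UNKNOWN = 103, 9, -1   # hypothetical band count, known classes, unknown label

# feature extraction network model: maps each pixel's spectrum to a feature vector
encoder = nn.Sequential(nn.Conv2d(BANDS, 64, 1), nn.ReLU(), nn.Conv2d(64, 32, 1))
# classifier head: determines the initial category of each pixel from its image features
classifier = nn.Conv2d(32, NUM_KNOWN, 1)
# image reconstruction network model: rebuilds each pixel's spectrum from its features
decoder = nn.Sequential(nn.Conv2d(32, 64, 1), nn.ReLU(), nn.Conv2d(64, BANDS, 1))

def classify(hsi, loss_threshold=0.05):
    # hsi: target hyperspectral image of shape (1, BANDS, H, W)
    features = encoder(hsi)                        # image features
    initial = classifier(features).argmax(dim=1)   # initial category per pixel
    recon = decoder(features)                      # reconstructed image
    loss = ((recon - hsi) ** 2).mean(dim=1)        # reconstruction loss value per pixel
    # pixels whose reconstruction loss exceeds the threshold are treated as an unknown
    # class; the remaining pixels keep their initial category
    return torch.where(loss > loss_threshold,
                       torch.full_like(initial, UNKNOWN), initial)

final_classes = classify(torch.rand(1, BANDS, 64, 64))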

Description

technical field

[0001] The invention relates to the technical field of hyperspectral image classification and recognition, and in particular to a hyperspectral image classification method and device and electronic equipment.

Background technique

[0002] A hyperspectral image is a three-dimensional light cube with rich spectral information and spatial information; it can be regarded as an ordinary two-dimensional image extended by one further dimension of spectral information. Classification of a hyperspectral image refers to extracting spectral information or spatial information to classify each pixel in the image. In the related art, classification usually assumes that every pixel belongs to a completely known category, so each pixel can be labeled and classified using the existing label values. However, because a hyperspectral image covers a large area, the existing label values usually cannot cove...
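For orientation only, the three-dimensional light cube and the per-pixel view used for classification can be pictured as in the short sketch below; the scene size and band count are hypothetical values, not taken from the patent.

import numpy as np

H, W, BANDS = 145, 145, 103            # hypothetical spatial size and number of spectral bands
cube = np.random.rand(H, W, BANDS)     # hyperspectral image: a 2-D scene plus a spectral axis

# per-pixel classification operates on each pixel's spectral vector
spectra = cube.reshape(-1, BANDS)      # (H*W, BANDS): one spectrum per pixel
# a classifier then maps each spectrum (optionally with its spatial neighborhood)
# to a category, producing one class label per pixel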


Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06V20/13; G06V10/40; G06N3/045; G06F18/241
Inventor: 周浩, 黄钢平, 袁国武, 高赟, 普园媛, 余鹏飞, 黎时冲, 肖克豪
Owner: YUNNAN UNIV