
Hyperspectral image classification method based on spectral-spatial cooperation of deep convolutional neural network

The method applies convolutional neural network and hyperspectral image technology to the field of spectral-spatial joint hyperspectral image classification. It addresses problems of existing approaches, namely the heavy computation required by dimensionality reduction, the loss of spectral information, and the resulting impact on accuracy, with the effect of improving classification accuracy and solving the problem of low classification precision.

Active Publication Date: 2016-02-10

AI Technical Summary

Problems solved by technology

[0004] However, existing methods that use deep models to extract spatial-spectral features of hyperspectral images are very complicated. The original hyperspectral image usually must first be reduced in dimensionality along the spectral dimension, and the reduced spatial information is then combined with the spectral information to obtain the spatial-spectral features. This dimensionality reduction is computationally intensive and loses part of the spectral information, which affects classification accuracy.

Method used



Examples


Detailed description of the embodiments

[0021] The present invention is further described below in conjunction with the embodiments and the accompanying drawings:

[0022] Step 1: Input the hyperspectral image data and normalize it according to the formula x'_ijs = (x_ijs - x_min) / (x_max - x_min), where (i, j) denotes the spatial coordinate, s denotes the spectral band (typically 100-240 bands), and x_max and x_min denote the maximum and minimum values in the 3D hyperspectral data, respectively.
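As a minimal sketch, the normalization in Step 1 can be implemented as follows, assuming the min-max form given above is applied over the whole data cube (the function name and array layout are illustrative, not specified by the patent):

```python
import numpy as np

def normalize_cube(x):
    """Min-max normalize a hyperspectral cube of shape (H, W, S),
    where S is the number of spectral bands (typically 100-240)."""
    x = x.astype(np.float64)
    x_min, x_max = x.min(), x.max()        # global extrema of the 3D data
    return (x - x_min) / (x_max - x_min)   # values scaled to [0, 1]
```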

[0023] Step 2: Extract the original spatial-spectral features. In the hyperspectral image, the nine pixel vectors formed by the central pixel and its eight neighboring pixels are extracted as the original spatial-spectral feature of the center pixel at position (i, j).
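A sketch of this neighborhood extraction, assuming reflection padding at the image border (the patent does not specify how border pixels are handled):

```python
import numpy as np

def spatial_spectral_feature(cube, i, j):
    """Stack the spectra of pixel (i, j) and its eight neighboring pixels
    (a 3x3 window) into one original spatial-spectral feature vector.
    `cube` has shape (H, W, S)."""
    padded = np.pad(cube, ((1, 1), (1, 1), (0, 0)), mode="reflect")
    window = padded[i:i + 3, j:j + 3, :]   # 3x3 neighborhood, all S bands
    return window.reshape(-1)              # one-dimensional vector of length 9 * S
```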

[0024] Step 3: Randomly select a small amount of labeled data from the features extracted in Step 2 as the training data for the CNN.
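One possible way to draw such a training subset, assuming per-pixel class labels are available; the per-class sample count is an illustrative value, not taken from the patent:

```python
import numpy as np

def sample_training_set(features, labels, per_class=200, seed=0):
    """Randomly draw a small number of labeled samples per class.
    `features` has shape (N, 9 * S); `labels` has shape (N,)."""
    rng = np.random.default_rng(seed)
    idx = []
    for c in np.unique(labels):
        c_idx = np.flatnonzero(labels == c)
        take = min(per_class, c_idx.size)          # classes may have few samples
        idx.extend(rng.choice(c_idx, size=take, replace=False))
    idx = np.asarray(idx)
    return features[idx], labels[idx]
```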

[0025] Step 4: Construct a convolutional neural network, take the original spatial-spectral feature extracted in Step 2 as its input, and use a one-dimensional vector as the ...
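Since the paragraph is truncated before the architecture details, the following is only a sketch of a one-dimensional CNN over the original spatial-spectral vector; the layer sizes, kernel widths, and the choice of PyTorch are assumptions, not the patent's specification:

```python
import torch.nn as nn

class SpectralSpatialCNN(nn.Module):
    """One-dimensional CNN that takes the original spatial-spectral vector
    (length 9 * S, treated as a single-channel 1D signal) as input."""
    def __init__(self, n_classes, feat_dim=100):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 20, kernel_size=11), nn.ReLU(),
            nn.MaxPool1d(3),
            nn.Conv1d(20, 40, kernel_size=11), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(40, feat_dim), nn.ReLU(),
        )
        self.classifier = nn.Linear(feat_dim, n_classes)  # softmax head, used only for training

    def forward(self, x):            # x: (batch, 1, 9 * S)
        return self.classifier(self.features(x))
```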



Abstract

The present invention relates to a hyperspectral image classification method based on spectral-spatial cooperation of a deep convolutional neural network, which brings the conventional deep convolutional neural network used for two-dimensional images into the three-dimensional hyperspectral image classification problem. Firstly, the convolutional neural network is trained with a small amount of labeled data, and the spectral-spatial features of the hyperspectral image are extracted autonomously by the network without any compression or dimensionality reduction; then, a support vector machine (SVM) classifier is trained with the extracted spectral-spatial features to classify the image; finally, the trained neural network is combined with the trained classifier: the neural network extracts the spectral-spatial feature of a target to be classified and the classifier determines its specific category. This yields a structure (DCNN-SVM) that can autonomously extract the spectral-spatial features of a hyperspectral image and classify them, forming a complete hyperspectral image classification method.
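A sketch of the DCNN-SVM combination described above, assuming a PyTorch feature extractor like the one sketched in the embodiment section and scikit-learn's SVC (both library choices and the `features` sub-module are assumptions):

```python
import torch
from sklearn.svm import SVC

def build_dcnn_svm(cnn, train_x, train_y):
    """Train an SVM on the spectral-spatial features extracted by the CNN.
    `train_x` is a tensor of shape (N, 1, 9 * S), `train_y` a NumPy label array."""
    cnn.eval()
    with torch.no_grad():
        feats = cnn.features(train_x).cpu().numpy()  # autonomously extracted features
    svm = SVC(kernel="rbf")
    svm.fit(feats, train_y)
    return svm

def classify(cnn, svm, x):
    """CNN extracts the spectral-spatial feature, SVM assigns the category."""
    cnn.eval()
    with torch.no_grad():
        feats = cnn.features(x).cpu().numpy()
    return svm.predict(feats)
```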

Description

Technical field

[0001] The invention belongs to the technical field of remote sensing information processing and relates to a hyperspectral image classification method, in particular to a hyperspectral image classification method based on the spectral-spatial combination of a deep convolutional neural network.

Background technique

[0002] Hyperspectral remote sensing images have high spectral resolution, many imaging bands, and a large amount of information, and are widely used in remote sensing applications. Hyperspectral image classification technology plays an important role in these applications. The features used for classification are extracted from the original hyperspectral image, and this step has a great impact on classification accuracy: robust classification features improve the classification accuracy, whereas classification features with poor robustness significantly reduce the classification effect.

[0003] In recent years, deep learning has made...


Application Information

  • Patent Type & Authority: Application (China)
  • IPC(8): G06K9/62
  • CPC: G06F18/2411
  • Inventor: 李映, 张号逵, 刘韬
  • Owner: 陕西令一盾信息技术有限公司