
Hyper-spectral image classification method based on ridgelet and depth convolution network

A hyperspectral image classification technology based on deep convolution, applied in the field of hyperspectral image classification. It addresses the problems that effective classification features are difficult to learn and that low computational complexity is difficult to achieve in the prior art, with the effect of improving classification accuracy and classification speed.

Active Publication Date: 2015-11-18
XIDIAN UNIV
Cites: 4 · Cited by: 29

AI Technical Summary

Problems solved by technology

[0010] The purpose of the present invention is to address the deficiencies of the above-mentioned prior art by proposing a hyperspectral image classification method based on ridgelets and a deep convolutional network, so as to overcome two problems of the prior art: the difficulty of learning effective classification features for hyperspectral images, and the difficulty traditional deep convolutional networks have in achieving low computational complexity, thereby improving the accuracy and speed of hyperspectral image classification.


Examples


Embodiment Construction

[0027] The technical solutions and effects of the present invention will be described in further detail below with reference to the accompanying drawings.

[0028] Referring to figure 1, the implementation steps of the present invention are as follows:

[0029] Step 1, input image.

[0030] Input a hyperspectral image, as shown in figure 2, where figure 2(a) is the input hyperspectral image and figure 2(b) is the class-label image corresponding to figure 2(a); 10% of the pixels in figure 2(a) are selected as training samples.

[0031] Step 2, extract the spectral information of the training samples.

[0032] Assume that the spectral dimension of the hyperspectral image input in step 1 is V. For each training sample, extract the spectral value of each dimension of the sample to form a spectral vector f_j, j = 1, ..., J, where J is the number of training samples and each spectral vector f_j has dimension V.
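The extraction in step 2 can be sketched as follows. This is a minimal illustration, not the patent's code: the image cube, its dimensions, and the random choice of training pixels are all placeholder assumptions.

```python
import numpy as np

# Toy hyperspectral cube of shape (H, W, V), V spectral bands per pixel.
rng = np.random.default_rng(0)
H, W, V = 4, 4, 8                      # small illustrative dimensions
cube = rng.random((H, W, V))           # stand-in for the input image

# Step 1 selected 10% of the pixels as training samples.
J = max(1, int(0.10 * H * W))          # number of training samples
coords = rng.choice(H * W, size=J, replace=False)
rows, cols = np.unravel_index(coords, (H, W))

# Each training sample j yields a V-dimensional spectral vector f_j.
F = cube[rows, cols, :]                # F has shape (J, V)
```

Each row of `F` is one spectral vector f_j, matching the dimensions stated in paragraph [0032].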

[0033] Step 3, reduce the dimension of the hyperspectral image.

[0034] The methods for i...
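The excerpt truncates before naming the dimension-reduction method, so the sketch below uses PCA purely as a common choice for hyperspectral band reduction; it is an illustrative assumption, not necessarily the patent's technique.

```python
import numpy as np

def pca_reduce(X, k):
    """Project row vectors in X (n_samples, V) onto the top-k principal axes."""
    Xc = X - X.mean(axis=0)                 # center each spectral band
    # SVD of the centered data: rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                    # shape (n_samples, k)

# Toy spectra: 20 pixels, V = 8 bands, reduced to 3 components.
X = np.random.default_rng(1).random((20, 8))
Z = pca_reduce(X, 3)
```

Because the data are centered before projection, each reduced component has zero mean across the samples.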


Abstract

The invention discloses a hyper-spectral image classification method based on ridgelets and a deep convolutional network, mainly aiming to solve the problems of low accuracy and high computational complexity in existing hyper-spectral image classification techniques. The method comprises the following steps: (1) selecting training samples in a hyper-spectral image; (2) extracting spectral information and spatial information of the training samples; (3) forming training sample sets from the spectral and spatial information; (4) constructing a five-layer deep convolutional network and designing a ridgelet filter to initialize the network; (5) using the training sample sets to train the constructed network; and (6) using the trained network to classify the remaining samples, thus completing image classification. The method has the advantages of high classification accuracy and high classification speed, and can be used in weather monitoring, environmental monitoring, urban planning, and disaster prevention and mitigation.
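Step (4) of the abstract initializes the convolutional network with a ridgelet filter. One way such a filter bank could be built is sketched below; the Mexican-hat ridge profile, the parameter grid, and the kernel size are illustrative assumptions, not the patent's exact design.

```python
import numpy as np

def mexican_hat(t):
    # Ricker wavelet (second derivative of a Gaussian), a common ridge profile.
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

def ridgelet_kernel(size, theta, a=1.0, b=0.0):
    """size x size kernel sampling psi((x*cos(theta) + y*sin(theta) - b) / a)."""
    half = (size - 1) / 2.0
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    t = (xs * np.cos(theta) + ys * np.sin(theta) - b) / a
    k = mexican_hat(t) / np.sqrt(a)        # ridgelet normalization in scale a
    return k - k.mean()                    # zero mean, like a band-pass filter

# A small bank over 8 orientations, usable as initial conv-layer weights.
thetas = np.linspace(0, np.pi, 8, endpoint=False)
bank = np.stack([ridgelet_kernel(5, th) for th in thetas])   # (8, 5, 5)
```

Unlike random initialization, each kernel responds to edges at a fixed orientation from the start, which is the usual motivation for wavelet-style initialization of convolutional filters.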

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a hyperspectral image classification method, which can be used for meteorological monitoring, environmental monitoring, land utilization, urban planning, and disaster prevention and mitigation.

Background Technique

[0002] Hyperspectral-resolution remote sensing refers to the use of many narrow electromagnetic wave bands to obtain data about objects of interest. Its biggest feature is that while obtaining the two-dimensional spatial scene information of the target image, it also obtains high-resolution one-dimensional spectral information representing the target's physical properties; that is, it has the characteristic of "integration of image and spectrum", making it one of the newest technologies in remote sensing. The main difference between hyperspectral remote sensing and conventional remote sensing data is that hyperspectra...

Claims


Application Information

IPC(8): G06K9/62
CPC: G06F18/2413, G06F18/214
Inventor: 刘芳 (Liu Fang), 石程 (Shi Cheng), 郝红侠 (Hao Hongxia), 焦李成 (Jiao Licheng), 李玲玲 (Li Lingling), 尚荣华 (Shang Ronghua), 马文萍 (Ma Wenping), 杨淑媛 (Yang Shuyuan), 马晶晶 (Ma Jingjing)
Owner XIDIAN UNIV