
Hyperspectral image classification method based on depth feature cross fusion

A hyperspectral image classification method based on depth feature cross fusion, in the field of hyperspectral image classification. It addresses the problems that prior methods do not consider the correlation between depth features and that spatial features are lost, thereby mitigating spatial feature loss and enhancing representational ability.

Active Publication Date: 2020-05-22
XIDIAN UNIV


Problems solved by technology

The disadvantage of this method is that it does not consider the strong correlation between deep features; moreover, as the network deepens, a serial convolutional neural network extracts features from high spatial resolution down to low spatial resolution, resulting in the loss of spatial features.




Detailed Description of the Embodiments

[0033] The present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0034] Referring to Figure 1, the implementation flow chart of the hyperspectral image classification method based on depth feature cross fusion, the implementation steps of the present invention are described in detail as follows:

[0035] Step 1: Input the hyperspectral data and preprocess it.

[0036] First, input the hyperspectral data and read it to obtain the hyperspectral image and its corresponding classification labels. The hyperspectral image is a three-dimensional data cube of size h×w×b, and the corresponding category labels form two-dimensional h×w data, where h is the height of the hyperspectral image, w is its width, and b is the number of its spectral bands.
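This data layout can be illustrated with a small sketch. The dimensions below are hypothetical (the 145×145×200 shape of the commonly used Indian Pines scene is assumed purely for illustration), and random arrays stand in for data that would really be read from a hyperspectral file:

```python
import numpy as np

# Hypothetical dimensions for illustration (an Indian Pines-sized scene);
# real values of h, w, b come from the input hyperspectral data file.
h, w, b = 145, 145, 200

# The hyperspectral image: an h x w x b three-dimensional data cube.
cube = np.random.rand(h, w, b).astype(np.float32)

# The category labels: a two-dimensional h x w integer map
# (0 as the "unlabeled" class is an assumed convention).
labels = np.random.randint(0, 17, size=(h, w))

print(cube.shape)    # (145, 145, 200)
print(labels.shape)  # (145, 145)
```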

[0037] Each spectral dimension of the hyperspectral data is then normalized:

[0038] ...
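The exact normalization formula is elided in the source. A common choice for per-band preprocessing of hyperspectral data is min-max scaling of each spectral band to [0, 1]; the sketch below assumes that convention and is not necessarily the formula used in the patent:

```python
import numpy as np

def normalize_bands(cube):
    """Min-max normalize each spectral band of an (h, w, b) cube to [0, 1].

    Per-band min-max scaling is an assumption; the patent elides its
    exact normalization formula. A small epsilon guards against
    division by zero for constant bands.
    """
    mins = cube.min(axis=(0, 1), keepdims=True)   # per-band minima, shape (1, 1, b)
    maxs = cube.max(axis=(0, 1), keepdims=True)   # per-band maxima, shape (1, 1, b)
    return (cube - mins) / (maxs - mins + 1e-12)

cube = np.random.rand(10, 10, 5).astype(np.float64) * 100.0
norm = normalize_bands(cube)
print(norm.shape)  # (10, 10, 5)
```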



Abstract

The invention puts forward a hyperspectral image classification method based on depth feature cross fusion, and mainly solves the problem of spatial feature loss in traditional convolutional neural networks during hyperspectral data classification. According to the technical scheme, the method comprises the following steps: 1, reading hyperspectral data and preprocessing each spectral band; 2, constructing data samples from the preprocessed hyperspectral data and generating training-set and test-set data; 3, constructing a hyperspectral image classification network based on depth feature cross fusion; 4, training the network with the training-set data; and 5, performing classification prediction on the test-set data with the trained network. The method fuses depth features of different branch stages and different scales for multi-channel original data, continuously exchanging information among multi-scale representations, thereby improving the depth feature expression capability of the model; the multi-scale spatial information of depth features at different layers of the hyperspectral data is effectively utilized, and classification precision is improved.
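The information exchange among multi-scale branches described above can be sketched in a framework-agnostic way. In the sketch below (NumPy, with plain average pooling and nearest-neighbour upsampling standing in for the learned convolutions of the actual network), a high-resolution and a low-resolution branch each receive the other's features resampled to their own scale; this is an illustrative simplification, not the patent's exact architecture:

```python
import numpy as np

def downsample(x):
    """2x average pooling over the spatial dims of an (h, w, c) feature map."""
    h, w, c = x.shape
    return (x[: h // 2 * 2, : w // 2 * 2]
            .reshape(h // 2, 2, w // 2, 2, c)
            .mean(axis=(1, 3)))

def upsample(x):
    """2x nearest-neighbour upsampling over the spatial dims."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def cross_fuse(hi, lo):
    """Cross-fusion step: each branch keeps its own scale but is enriched
    with the other branch's features resampled to that scale."""
    hi_out = hi + upsample(lo)     # high-res branch absorbs low-res context
    lo_out = lo + downsample(hi)   # low-res branch absorbs high-res detail
    return hi_out, lo_out

hi = np.random.rand(8, 8, 4)   # high-resolution branch features
lo = np.random.rand(4, 4, 4)   # low-resolution branch features
hi2, lo2 = cross_fuse(hi, lo)
print(hi2.shape, lo2.shape)  # (8, 8, 4) (4, 4, 4)
```

In the real network such an exchange is repeated across several stages, so that every branch's representation mixes information from all spatial scales before classification.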

Description

Technical field

[0001] The invention belongs to the technical field of remote sensing information processing, and further relates to a hyperspectral image classification method, specifically a hyperspectral image classification method based on depth feature cross fusion, which can realize ground object recognition and can be used in fields such as environmental monitoring and geological exploration.

Background technique

[0002] With the development of spectral imaging technology, the spatial resolution of hyperspectral images has continuously improved and the number of spectral bands has grown, making the information in hyperspectral images increasingly abundant. Rich spectral and spatial features make hyperspectral image classification more promising, while at the same time imposing stricter requirements on classification accuracy.

[0003] Hyperspectral image classification technology mainly comprises data feature engineering and classification. Feature engine...


Application Information

IPC(8): G06K9/62
CPC: G06F18/253, G06F18/24, G06F18/214
Inventor 焦李成李玲玲王科樊龙飞刘旭冯志玺朱浩唐旭郭雨薇陈璞花
Owner XIDIAN UNIV