Multi-source remote sensing image classification method based on two-way attention fusion neural network

A remote sensing image classification technology in the field of image processing, which addresses the problems of feature information that cannot be exchanged and fused, feature redundancy, and high feature dimensionality, and achieves accurate fusion classification results, reduced redundancy and dimensionality, and richer information.

Active Publication Date: 2019-07-09
XIDIAN UNIV
Cites 6 · Cited by 37

AI Technical Summary

Problems solved by technology

This type of method can exploit the strengths of convolutional neural networks to extract effective features from hyperspectral images and LiDAR images and apply these features to classification problems. However, such models still have certain shortcomings when performing fusion classification: first, the hyperspectral image network and the LiDAR data network are separate from each other, so feature information cannot flow between them and be fused well; second, the features extracted from the two images by the convolutional neural networks are fed directly into the fully connected layer for classification without being filtered or fused, which leads to excessively high feature dimensionality and feature redundancy.
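For context, the late-fusion baseline criticized above can be summarized in a short PyTorch-style sketch: two independent CNN branches extract hyperspectral and LiDAR features that only meet at the final fully connected classifier, so no intermediate information exchange or feature screening takes place. This is an illustrative reading of the paragraph, not the patented network; the layer sizes, patch setup, and class count are assumptions.

```python
import torch
import torch.nn as nn

class LateFusionBaseline(nn.Module):
    """Two independent CNN branches whose features only meet at the final
    fully connected classifier: no intermediate exchange, no screening."""

    def __init__(self, hsi_bands=144, num_classes=15):   # sizes are assumed
        super().__init__()
        self.hsi_branch = nn.Sequential(                  # hyperspectral patch branch
            nn.Conv2d(hsi_bands, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.lidar_branch = nn.Sequential(                # single-channel LiDAR patch branch
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Plain concatenation: high-dimensional, redundant, unfiltered features
        self.classifier = nn.Linear(128 + 64, num_classes)

    def forward(self, hsi_patch, lidar_patch):
        f_hsi = self.hsi_branch(hsi_patch)                # branches never exchange information
        f_lidar = self.lidar_branch(lidar_patch)
        return self.classifier(torch.cat([f_hsi, f_lidar], dim=1))
```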

Method used



Examples


Embodiment Construction

[0029] The specific examples and effects of the present invention will be described in further detail below in conjunction with the accompanying drawings.

[0030] Referring to Figure 1, the implementation steps of this example are as follows:

[0031] Step 1. Input a registered hyperspectral image and a LiDAR image to construct a training sample set and a test sample set.

[0032] 1a) Input a hyperspectral remote sensing image containing M labeled pixels and N unlabeled pixels, where each pixel of the image is a sample. The M labeled samples form the hyperspectral training sample set H, and the N unlabeled samples form the hyperspectral test sample set He. In this example, the number of labeled samples M is 66485 and the number of unlabeled samples N is 598360;

[0033] 1b) Input the LiDAR image, in which the number of labeled pixels is M and their coordinates are the same as those of the labeled pixels in the hyperspectral image, the number of unlabeled pixels is N, and the coordinates ...
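As a rough illustration of steps 1a) and 1b), the sketch below splits co-registered hyperspectral and LiDAR pixels into a labeled (training) set and an unlabeled (test) set. The patch extraction, array shapes, and function names are assumptions for illustration and are not taken from the patent text.

```python
import numpy as np

def build_sample_sets(hsi, lidar, labels, patch=11):
    """Split co-registered HSI/LiDAR pixels into labeled and unlabeled samples.

    hsi:    (H, W, B) hyperspectral cube
    lidar:  (H, W)    LiDAR elevation image, registered to the HSI
    labels: (H, W)    class map, 0 = unlabeled pixel
    """
    r = patch // 2
    # Pad so that every pixel has a full neighborhood patch
    hsi_p = np.pad(hsi, ((r, r), (r, r), (0, 0)), mode="reflect")
    lid_p = np.pad(lidar, ((r, r), (r, r)), mode="reflect")

    train, test = [], []
    for i in range(hsi.shape[0]):
        for j in range(hsi.shape[1]):
            sample = (hsi_p[i:i + patch, j:j + patch, :],  # HSI patch around pixel (i, j)
                      lid_p[i:i + patch, j:j + patch],     # LiDAR patch at the same coordinates
                      labels[i, j])
            # Labeled pixels -> training set H / L; unlabeled pixels -> test set He / Le
            (train if labels[i, j] > 0 else test).append(sample)
    return train, test
```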



Abstract

The invention discloses a multi-source remote sensing image classification method based on a two-way attention fusion neural network, which mainly solves the problem of low classification accuracy of multi-source remote sensing images in the prior art. The implementation scheme comprises the following steps: 1) preprocess and divide the hyperspectral data and LiDAR data to obtain training samples and test samples; 2) design an attention fusion layer based on an attention mechanism to perform weighted screening and fusion of the spectral data and LiDAR data, and establish a two-way interconnected convolutional neural network; 3) train the interconnected convolutional neural network with multi-class cross entropy as the loss function to obtain a trained network model; and 4) predict the test samples with the trained model to obtain the final classification result. The method can extract the features of multi-source remote sensing data and effectively fuse and classify them, alleviates the problem of excessively high dimensionality during fusion, improves the average classification accuracy, and can be used to fuse remote sensing images obtained by two different sensors.
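To make the fusion idea in the abstract concrete, here is a minimal PyTorch-style sketch of an attention fusion layer that weights and screens the two feature streams before merging them, as a stand-in for step 2). It is an illustrative reading of the abstract, not the patented architecture; the gating design, reduction ratio, and feature sizes are assumptions.

```python
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    """Channel-attention gate that weights and screens two feature streams
    before merging them, so the fused vector keeps the dimensionality of a
    single stream instead of a plain concatenation."""

    def __init__(self, channels, reduction=4):            # reduction ratio is assumed
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(2 * channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, 2 * channels), nn.Sigmoid(),
        )

    def forward(self, f_hsi, f_lidar):
        joint = torch.cat([f_hsi, f_lidar], dim=1)          # (batch, 2C) joint descriptor
        w_hsi, w_lidar = self.gate(joint).chunk(2, dim=1)   # per-channel weights in (0, 1)
        return w_hsi * f_hsi + w_lidar * f_lidar            # weighted screening + fusion, (batch, C)

# Usage between two CNN branches producing f_hsi, f_lidar of size C:
#   fused  = AttentionFusion(C)(f_hsi, f_lidar)
#   logits = nn.Linear(C, num_classes)(fused)
#   loss   = nn.CrossEntropyLoss()(logits, labels)  # multi-class cross entropy, as in step 3)
```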

Description

Technical field
[0001] The invention belongs to the technical field of image processing, and in particular relates to a remote sensing image classification method which can be used to fuse and classify remote sensing images obtained by two different sensors.
Background technique
[0002] In recent years, the number of remote sensing sensors has grown rapidly, and people can simultaneously obtain multi-source datasets of the same scene, which makes it possible to integrate the different information captured by different sensors. For example, a multispectral image (MSI) or hyperspectral image (HSI) usually consists of multiple spectral channels of the same scene, containing detailed spectral and spatial information and providing the ability to accurately distinguish materials of interest. On the other hand, LiDAR data can represent the elevation and target height information of the scene, which helps to distinguish objects made of similar materials but with different heights. There...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/62
CPCG06F18/24133G06F18/253
Inventor 张向荣焦李成梁婷唐旭李阳阳古晶侯彪马文萍
Owner XIDIAN UNIV