Double-path convolutional neural network remote sensing classification method taking spatial neighborhood relationship into account

A convolutional neural network classification technology, applied in the field of remote sensing image classification, that addresses the problem of low classification accuracy and achieves the effects of strengthening sample information, reducing interference, and improving classification accuracy.

Active Publication Date: 2018-04-27
WUHAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0008] The technical problem to be solved by the present invention is to provide a two-way convolutional neural network remote sensing classification method that takes the spatial neighborhood relationship into account, addressing the defect of low classification accuracy in the prior art.



Examples


Embodiment Construction

[0036] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0037] As shown in Figure 1, the two-way convolutional neural network remote sensing classification method considering the spatial neighborhood relationship in this embodiment takes the subject data and the first-order neighborhood data (which includes the subject area) as input image data, and performs two-way convolutional neural network remote sensing classification taking the spatial neighborhood relationship into account; an illustrative network sketch is given after the step listing below. The method specifically includes the following steps:

[0038] Data acquisition: Obtain high-resolution remote sensing images of the research area.

[00...
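The two-path structure described in [0037] can be sketched as follows. This is a minimal illustration in PyTorch, assuming a 4-band image, 32x32 patches, two identical convolutional branches, and feature fusion by concatenation; none of these hyperparameters, nor names such as `TwoPathCNN`, are fixed by the patent and are used here only for illustration.

```python
# Minimal sketch of a two-path CNN: one branch for the subject patch,
# one branch for the first-order-neighborhood patch, fused before the classifier.
import torch
import torch.nn as nn

class Branch(nn.Module):
    """One convolutional path (same structure for subject and neighborhood inputs)."""
    def __init__(self, in_bands: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 16x16 -> 8x8
        )

    def forward(self, x):
        return torch.flatten(self.features(x), start_dim=1)

class TwoPathCNN(nn.Module):
    """Subject and first-order-neighborhood patches pass through separate
    branches; their feature vectors are concatenated for classification."""
    def __init__(self, in_bands: int = 4, n_classes: int = 6, patch: int = 32):
        super().__init__()
        self.subject_branch = Branch(in_bands)
        self.neighbor_branch = Branch(in_bands)
        feat_dim = 64 * (patch // 4) * (patch // 4)           # per-branch feature length
        self.classifier = nn.Sequential(
            nn.Linear(2 * feat_dim, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, n_classes),
        )

    def forward(self, subject_patch, neighbor_patch):
        fused = torch.cat([self.subject_branch(subject_patch),
                           self.neighbor_branch(neighbor_patch)], dim=1)
        return self.classifier(fused)

# Example forward pass with a batch of random 4-band 32x32 patches.
model = TwoPathCNN()
subject = torch.randn(8, 4, 32, 32)
neighborhood = torch.randn(8, 4, 32, 32)
logits = model(subject, neighborhood)                         # shape: (8, 6)
```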



Abstract

The invention discloses a double-path convolutional neural network remote sensing classification method taking the spatial neighborhood relationship into account. The method comprises the steps of: data acquisition; data preprocessing; thematic map production; multi-scale segmentation; visual interpretation; first-order neighborhood acquisition; experimental sample set selection; patch size selection; training data generation; mat format conversion; convolutional neural network structure construction; subject single-path neural network training; first-order neighborhood single-path neural network training; complementarity analysis; and double-path convolutional neural network training taking the spatial neighborhood relationship into account. The invention treats the subject and its first-order neighborhood as input image data on the same scale, enhances the sample information by adding neighborhood information to each sample, and reduces the interference caused by samples of other classes, thereby improving classification accuracy and providing a reference for classification decision-making.
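As an illustration of the first-order neighborhood acquisition, patch size selection, and training data generation steps listed above, the following is a minimal sketch assuming the image has already been segmented into an integer label map (e.g. by the multi-scale segmentation step). The helper names (`extract_pair`, `first_order_neighbors`, `crop_resize`) and the nearest-neighbour resizing are illustrative assumptions, not the patent's prescribed procedure.

```python
# Build, for each subject segment, two same-size patches: the subject alone and
# the subject together with its first-order (adjacent) neighbors.
import numpy as np
from scipy import ndimage

def first_order_neighbors(segments: np.ndarray, seg_id: int) -> set:
    """Segments sharing a boundary with `seg_id` (4-connected dilation)."""
    mask = segments == seg_id
    grown = ndimage.binary_dilation(mask)
    return set(np.unique(segments[grown & ~mask]))

def crop_resize(image: np.ndarray, mask: np.ndarray, size: int) -> np.ndarray:
    """Crop `image` (bands, H, W) to the bounding box of `mask`, zero out pixels
    outside the mask, and resize to (bands, size, size) by nearest neighbour."""
    rows, cols = np.where(mask)
    r0, r1, c0, c1 = rows.min(), rows.max() + 1, cols.min(), cols.max() + 1
    patch = image[:, r0:r1, c0:c1] * mask[r0:r1, c0:c1]
    zoom = (1, size / patch.shape[1], size / patch.shape[2])
    return ndimage.zoom(patch, zoom, order=0)

def extract_pair(image: np.ndarray, segments: np.ndarray, seg_id: int, size: int = 32):
    """Return (subject_patch, neighborhood_patch) on the same scale."""
    subject_mask = segments == seg_id
    neighbor_ids = first_order_neighbors(segments, seg_id) | {seg_id}
    neighborhood_mask = np.isin(segments, list(neighbor_ids))
    return crop_resize(image, subject_mask, size), crop_resize(image, neighborhood_mask, size)

# Toy example: a 4-band image with a 3-segment label map.
image = np.random.rand(4, 64, 64).astype(np.float32)
segments = np.zeros((64, 64), dtype=np.int32)
segments[:, 32:] = 1
segments[40:, :32] = 2
subj, neigh = extract_pair(image, segments, seg_id=2)
print(subj.shape, neigh.shape)   # (4, 32, 32) (4, 32, 32)
```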

Description

[0001] Technical Field [0002] The invention relates to the field of remote sensing image classification, in particular to a two-way convolutional neural network remote sensing classification method that takes spatial neighborhood relations into account. Background Technique [0003] Remote sensing image classification is currently a hot research topic in remote sensing technology, and how to classify remote sensing images accurately and efficiently has long been an important research question in the field. In recent years, with the development of artificial intelligence technology, neural networks have gradually become an effective classification and processing method for remote sensing images. Compared with traditional statistical classification methods, a neural network has learning ability and fault tolerance and does not need to assume a probability model, so it is suitable for dealing with various pro...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06F17/30
CPC: G06F16/29, G06V20/13, G06N3/045, G06F18/24
Inventor: 崔巍, 黄智新, 王飞, 周琪, 郑振东
Owner: WUHAN UNIV OF TECH