
Multi-temporal remote sensing image land cover classification method based on convolutional neural network

A convolutional neural network method applied in the field of multi-temporal remote sensing image land cover classification. It addresses the problem of low land cover classification accuracy and achieves the effects of meeting the needs of automated production and improving prediction and classification accuracy.

Active Publication Date: 2021-10-19
广西壮族自治区自然资源信息中心 (Guangxi Zhuang Autonomous Region Natural Resources Information Center)


Problems solved by technology

[0006] The purpose of the present invention is to solve the problem in the prior art that the accuracy of land cover classification is not high, and to provide a multi-temporal remote sensing image land cover classification method based on a convolutional neural network that can meet higher land cover classification accuracy requirements.


Image

Smart Image Click on the blue labels to locate them in the text.
Viewing Examples
Smart Image
  • Multi-tense remote sensing image land cover classification method based on convolutional neural network
  • Multi-tense remote sensing image land cover classification method based on convolutional neural network
  • Multi-tense remote sensing image land cover classification method based on convolutional neural network

Examples


Embodiment 1

[0044] As shown in Figure 2, the multi-temporal remote sensing image land cover classification method based on a convolutional neural network includes the following steps:

[0045] Step 1: Input old-phase and new-phase high-resolution remote sensing image data and perform the corresponding preprocessing, including the production of training samples and the augmentation of the training sample data. The production of the training samples includes three stages: internal refinement, field verification, and sample cutting. The training sample data augmentation includes image data augmentation and geometric augmentation; the image data augmentation includes pixel-coordinate geometric-transformation data augmentation and pixel-value-transformation data augmentation.
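The two augmentation families named in Step 1 can be sketched as follows. This is a minimal illustration, not the patent's actual pipeline: the function name `augment_pair`, the specific transforms (90-degree rotations, flips, brightness/contrast jitter), and their parameter ranges are all assumptions; the patent only names the two categories (pixel-coordinate geometric transforms and pixel-value transforms). The key constraint shown is that geometric transforms must be applied identically to the old-phase image, the new-phase image, and the label so they stay co-registered, while pixel-value jitter touches only the images.

```python
import numpy as np

def augment_pair(old_img, new_img, label, rng):
    """Sketch of paired augmentation for a two-phase sample (assumed helper).

    Geometric augmentation: the same random rotation/flip is applied to
    both temporal images and the label, preserving alignment.
    Pixel-value augmentation: independent brightness/contrast jitter is
    applied to the images only, since the label holds class codes.
    """
    # Pixel-coordinate (geometric) augmentation, shared across the triple.
    k = rng.integers(0, 4)  # random multiple of 90 degrees
    old_img, new_img, label = (np.rot90(a, k, axes=(0, 1))
                               for a in (old_img, new_img, label))
    if rng.random() < 0.5:  # random horizontal flip
        old_img, new_img, label = (np.flip(a, axis=1)
                                   for a in (old_img, new_img, label))

    # Pixel-value augmentation, images only (assumed jitter ranges).
    def jitter(img):
        gain = rng.uniform(0.9, 1.1)    # contrast
        bias = rng.uniform(-10.0, 10.0)  # brightness
        return np.clip(img.astype(np.float32) * gain + bias, 0, 255)

    return jitter(old_img), jitter(new_img), label
```

Applying the geometric transform through one shared random draw, rather than calling an augmenter per image, is what keeps the old/new/label triple spatially consistent.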

[0046] Step 2: Construct a convolutional neural network, including performing feature extraction on the two temporal data sets input in Step 1 and upsampling the high-dimensional feature maps; upsampling includes dil...
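The shape of Step 2 — run feature extraction over both temporal inputs, then upsample the resulting feature maps — can be illustrated with a deliberately tiny numpy sketch. Everything here is an assumption for illustration: a single shared convolution kernel stands in for the full network, and nearest-neighbour upsampling stands in for whatever upsampling scheme the (truncated) source text goes on to describe. It is not the patent's architecture.

```python
import numpy as np

def conv2d(x, w):
    """Naive valid-mode 2-D convolution for a single-channel map."""
    k = w.shape[0]
    H, W = x.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

def upsample_nearest(f, factor):
    """Nearest-neighbour upsampling of a feature map toward input size."""
    return np.kron(f, np.ones((factor, factor)))

def two_phase_features(old_img, new_img, w, factor=2):
    """Apply the same (shared-weight) filter to both temporal inputs,
    take a ReLU, and upsample each feature map (illustrative sketch)."""
    feats = []
    for img in (old_img, new_img):
        f = np.maximum(conv2d(img, w), 0.0)  # ReLU non-linearity
        feats.append(upsample_nearest(f, factor))
    return feats
```

Sharing the kernel across the two phases mirrors the common two-branch design for bi-temporal imagery, where both dates are embedded in the same feature space before comparison.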



Abstract

The invention discloses a multi-temporal remote sensing image land cover classification method based on a convolutional neural network. The method comprises the following steps: Step 1, input old-phase and new-phase high-resolution remote sensing image data and carry out the corresponding preprocessing; Step 2, construct a convolutional neural network, including performing feature extraction on the two temporal data sets input in Step 1 and upsampling the high-dimensional feature maps; Step 3, introduce a temporal correction algorithm, performing temporal correction by comparing the classification result of the old phase with a reference ground truth so as to obtain a correction amount for correcting classification errors; Step 4, post-process the land cover classification result, eliminating the many fragment patches that temporal correction leaves in the new-phase output because the new-phase and old-phase prediction patches do not overlap completely, thereby improving land cover classification accuracy. The method can further improve land cover classification accuracy.
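The Step 4 clean-up of "fragment patches" can be sketched as a small-connected-component filter: patches below a size threshold are merged into the dominant class along their border. This is a hedged illustration of the idea, not the patent's algorithm — the function name, the 4-connectivity choice, and the `min_size` threshold are all assumptions.

```python
import numpy as np
from collections import deque, Counter

def remove_small_patches(class_map, min_size):
    """Merge 4-connected patches smaller than min_size pixels into the
    most frequent class along their outer border (assumed post-processing
    sketch for fragment-patch removal)."""
    out = class_map.copy()
    H, W = out.shape
    seen = np.zeros((H, W), dtype=bool)
    for si in range(H):
        for sj in range(W):
            if seen[si, sj]:
                continue
            cls = out[si, sj]
            comp, border = [], Counter()
            q = deque([(si, sj)])
            seen[si, sj] = True
            # Breadth-first flood fill of the patch containing (si, sj),
            # counting the classes of pixels just outside it.
            while q:
                i, j = q.popleft()
                comp.append((i, j))
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        if out[ni, nj] == cls:
                            if not seen[ni, nj]:
                                seen[ni, nj] = True
                                q.append((ni, nj))
                        else:
                            border[out[ni, nj]] += 1
            # Relabel the patch if it is too small to keep.
            if len(comp) < min_size and border:
                fill = border.most_common(1)[0][0]
                for i, j in comp:
                    out[i, j] = fill
    return out
```

In practice a library routine such as `scipy.ndimage.label` would replace the hand-rolled flood fill, but the logic is the same: small patches are absorbed by their surroundings while large, genuine patches are left untouched.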

Description

Technical field

[0001] The invention relates to the technical field of remote sensing image processing and application, in particular to a multi-temporal remote sensing image land cover classification method based on a convolutional neural network.

Background technique

[0002] Land cover is an important geographic information resource and an indispensable basis for natural resource monitoring, land and space planning, geographic and national conditions surveys, and macro-control analysis. In recent years, with the rapid development of 3S technology, the ability to obtain high-resolution satellite image data has been continuously enhanced, and the spectral, spatial, and temporal resolution of satellite imagery has continuously improved, providing a solid data basis for land cover classification. However, in practical applications, a large amount of image data still relies on manual interpretation and processing, whi...

Claims


Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F18/2415
Inventor: 朱明, 姚茂华, 何永宁, 吴博, 贺雨晴, 秦绍峰, 姜代炜, 何敬源, 谭太恒, 吴勇
Owner: 广西壮族自治区自然资源信息中心 (Guangxi Zhuang Autonomous Region Natural Resources Information Center)