A fully automatic extraction method of high spatial resolution cultivated land plots based on deep learning

A deep-learning-based technology for extracting farmland plots from high-spatial-resolution remote sensing images; it addresses the high cost of existing methods and achieves cost-saving, efficient, and fine-grained extraction.

Active Publication Date: 2022-05-06
SUZHOU ZHONGKE IMAGE SKY REMOTE SENSING TECH CO LTD +3

AI Technical Summary

Problems solved by technology

[0005] To address the high cost of existing plot extraction methods and to fully exploit the edge information contained in the imagery, the present invention proposes a fully automatic method, based on deep learning, for extracting cultivated land plots from high-spatial-resolution remote sensing images. The HED deep learning model is improved through training on a large amount of edge label data and is then used to extract edges from the target image; the extracted edges are refined with the convolution results of the Canny edge operator, finally realizing the recognition of cultivated land plots.
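As an illustration of the Canny guidance described above, the following is a minimal sketch (not part of the patent text) of computing a Canny edge map for a study-area image with OpenCV; the blur kernel size and hysteresis thresholds are illustrative assumptions.

    # Minimal sketch of the guiding Canny step; kernel size and thresholds are assumptions.
    import cv2
    import numpy as np

    def canny_edges(image_path: str, low: int = 50, high: int = 150) -> np.ndarray:
        """Return a binary (0/255) edge map for one remote-sensing image tile."""
        img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        img = cv2.GaussianBlur(img, (5, 5), 0)   # suppress sensor noise before edge detection
        return cv2.Canny(img, low, high)         # low/high are the hysteresis thresholds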



Embodiment Construction

[0022] Figure 1 illustrates the main implementation idea of the present invention. The key technical parts include the automatic retrieval of edge label samples, the training of the deep learning model (HED), and edge post-processing assisted by the boundaries extracted with the Canny edge operator; the post-processing also includes accuracy verification of the extracted edges.
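The patent text does not spell out how the edge label samples are produced; a plausible sketch, assuming the training plots are available as digitized boundary polygons in pixel coordinates, is to rasterize them into binary edge masks, for example:

    # Hypothetical sketch of building an edge-label mask from digitized plot boundaries.
    # `plot_polygons` (pixel-coordinate vertex arrays) and the line thickness are
    # assumptions; the patent only states that an edge sample database is built.
    import numpy as np
    import cv2

    def edge_label_mask(plot_polygons, height, width, thickness=1):
        """Rasterize plot boundary polylines into a binary edge label (1 = edge pixel)."""
        mask = np.zeros((height, width), dtype=np.uint8)
        for poly in plot_polygons:                      # poly: (N, 2) array of x, y vertices
            pts = np.asarray(poly, dtype=np.int32).reshape(-1, 1, 2)
            cv2.polylines(mask, [pts], isClosed=True, color=1, thickness=thickness)
        return mask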

[0023] The specific steps are as follows:

[0024] 1) Collect and organize the images of the study area, establish an image database, and use the sample labels prepared in the earlier stage to establish a farmland edge sample database;

[0025] 2) Use the Canny edge operator to extract boundaries from the remote sensing images of the study area;

[0026] 3) Using the selected edge label sample data and the corresponding image data as constraints, improve the HED model, including the number of network layers and the pooling size (a schematic model sketch follows after these steps);

[0027] 4) Use the improved HED model to extract the boundary ...
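The steps above state that the number of network layers and the pooling size of HED are modified, but the exact configuration is not disclosed here. The sketch below (referenced in step 3) therefore follows the standard HED layout, VGG-style convolutional stages with a 1x1 side-output convolution per stage and a 1x1 fusion convolution over the upsampled side outputs, purely as an assumed starting point for such modifications.

    # A minimal HED-style edge network sketch in PyTorch; stage widths, depths and the
    # 2x2 pooling follow the original HED paper, not the (undisclosed) modified values.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def _stage(in_ch, out_ch, n_convs):
        layers = []
        for i in range(n_convs):
            layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch, 3, padding=1),
                       nn.ReLU(inplace=True)]
        return nn.Sequential(*layers)

    class HEDSketch(nn.Module):
        def __init__(self, in_channels=3):
            super().__init__()
            widths = [64, 128, 256, 512, 512]
            depths = [2, 2, 3, 3, 3]
            self.stages = nn.ModuleList()
            prev = in_channels
            for w, d in zip(widths, depths):
                self.stages.append(_stage(prev, w, d))
                prev = w
            self.side = nn.ModuleList([nn.Conv2d(w, 1, 1) for w in widths])
            self.fuse = nn.Conv2d(len(widths), 1, 1)
            self.pool = nn.MaxPool2d(2, 2)

        def forward(self, x):
            h, w = x.shape[2:]
            sides = []
            for i, stage in enumerate(self.stages):
                if i > 0:
                    x = self.pool(x)                 # downsample between stages
                x = stage(x)
                s = F.interpolate(self.side[i](x), size=(h, w),
                                  mode="bilinear", align_corners=False)
                sides.append(s)                      # side output upsampled to input size
            fused = self.fuse(torch.cat(sides, dim=1))
            # per-pixel edge probabilities from each side output and the fused map
            return [torch.sigmoid(s) for s in sides] + [torch.sigmoid(fused)]

Training such a network would typically minimize a class-balanced binary cross-entropy between every output map and the edge label masks, as in the original HED formulation.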


Abstract

The present invention discloses a fully automatic technology, based on deep learning, for extracting cultivated land plots from high-spatial-resolution imagery. Guided by the edges extracted with the traditional Canny edge operator and grounded in deep learning theory, the HED deep learning model is trained with a large amount of edge sample label data, and parameters of the model such as the number of network layers and the pooling size are improved. The new network model is then used to extract cultivated land plots from the images of the study area, and finally the boundaries extracted by the model are thinned and culled against the boundaries extracted by the Canny edge operator, realizing extraction of the skeleton edges of the cultivated land plots. Compared with traditional manual plot extraction, the present invention can effectively improve the production efficiency of plot extraction while ensuring uniform edge precision of the plots.
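A minimal sketch of the thinning and culling step described above, assuming the HED output is a per-pixel edge probability map and the Canny result is a 0/255 mask; the binarization threshold and culling tolerance are illustrative assumptions rather than values from the patent.

    # Thin the HED edges to a one-pixel skeleton, then cull skeleton pixels that lie
    # too far from any Canny edge; threshold/tolerance values are assumptions.
    import numpy as np
    import cv2
    from skimage.morphology import skeletonize

    def refine_edges(hed_prob, canny_mask, prob_thresh=0.5, tol_px=3):
        """hed_prob: float map in [0, 1]; canny_mask: 0/255 Canny output; returns 0/1 skeleton."""
        hed_bin = hed_prob >= prob_thresh                # binarize HED edge probabilities
        skeleton = skeletonize(hed_bin)                  # thin to one-pixel-wide edges
        # distance of every pixel to the nearest Canny edge pixel
        dist = cv2.distanceTransform((canny_mask == 0).astype(np.uint8), cv2.DIST_L2, 3)
        keep = dist <= tol_px                            # cull pixels far from the Canny edges
        return (skeleton & keep).astype(np.uint8)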

Description

Technical Field

[0001] The invention proposes a method for the fully automatic extraction of farmland plots from high-spatial-resolution remote sensing images. It is mainly applied to extracting cultivated land plots from such images and can improve on the efficiency of manual plot extraction.

Background Technique

[0002] One of the important applications of remote sensing technology is the extraction of thematic information. An important link in thematic information extraction is the segmentation of remote sensing images, and image segmentation in turn involves the extraction of object edge information. Therefore, extracting object edge information quickly and accurately is a key step in remote sensing information processing.

[0003] Traditional remote sensing image segmentation methods with good performance mainly adopt a bottom-up aggregation approach. This strategy focuses on the extraction and use of features s...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06V20/13; G06V10/44; G06V10/82; G06N3/04
CPC: G06V20/13; G06V10/44; G06N3/045
Inventors: 夏列钢, 胡晓东, 周楠, 张明杰, 骆剑承, 郜丽静, 陈金律, 刘浩, 姚飞
Owner: SUZHOU ZHONGKE IMAGE SKY REMOTE SENSING TECH CO LTD