Remote sensing image fishpond extraction method based on row-column self-attention full convolutional neural network

A convolutional neural network and remote sensing image technology, applied in the field of fish pond extraction from remote sensing images based on a row-column self-attention fully convolutional neural network, which can solve problems such as incomplete extraction range, inability to automatically obtain regular geometric shapes, and blurred fish pond edges.

Active Publication Date: 2020-10-30
CHINA UNIV OF PETROLEUM (EAST CHINA)
Cites: 7 · Cited by: 9

AI Technical Summary

Problems solved by technology

Fish pond boundaries are in fact fairly regular in shape, but owing to factors such as narrow edges and variable outlines, the fish pond edges extracted by traditional classification algorithms are blurred and incomplete, and a regular geometric shape cannot be obtained automatically.



Embodiment Construction

[0032] In order to help those of ordinary skill in the art understand and implement the present invention, the invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the embodiments described here are only used to illustrate and explain the present invention and are not intended to limit it.

[0033] The structure of the row-column self-attention fully convolutional neural network provided by the present invention is shown in Figure 1, where each rectangular box represents a neural network layer. Among them, conv1, conv2_x, conv3_x, conv4_x and conv5_x are the five groups of convolutional layers of ResNet101; RCSA denotes the row-column bidirectional GRU self-attention module; ASPP-RC denotes the atrous spatial pyramid pooling (ASPP) module combined with the row-column bidirectional GRU self-attention module; Upsampl...
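The excerpt does not include implementation code for the RCSA block, so the following is a minimal, hedged sketch of one plausible reading of a row-column bidirectional GRU self-attention module, written in PyTorch for illustration: a bidirectional GRU scans the feature map along each row and along each column, and the collected context is turned into a per-pixel gate that re-weights the input features. The sigmoid gating, hidden size, and fusion by 1×1 convolution are assumptions made for this sketch, not the patented formulation.

```python
# Illustrative sketch of a row-column bidirectional GRU self-attention (RCSA)
# block. The gating scheme below is an assumption; the patent excerpt only
# names the module, it does not specify the attention formula.
import torch
import torch.nn as nn

class RCSA(nn.Module):
    def __init__(self, channels, hidden=None):
        super().__init__()
        hidden = hidden or channels // 2
        # Bidirectional GRUs scanning the feature map along rows and columns.
        self.row_gru = nn.GRU(channels, hidden, batch_first=True, bidirectional=True)
        self.col_gru = nn.GRU(channels, hidden, batch_first=True, bidirectional=True)
        # Project the concatenated row/column context into a per-pixel gate.
        self.attn = nn.Conv2d(4 * hidden, channels, kernel_size=1)

    def forward(self, x):                                   # x: (B, C, H, W)
        b, c, h, w = x.shape
        # Row direction: each row is a sequence of W feature vectors.
        rows = x.permute(0, 2, 3, 1).reshape(b * h, w, c)
        row_ctx, _ = self.row_gru(rows)                      # (B*H, W, 2*hidden)
        row_ctx = row_ctx.reshape(b, h, w, -1).permute(0, 3, 1, 2)
        # Column direction: each column is a sequence of H feature vectors.
        cols = x.permute(0, 3, 2, 1).reshape(b * w, h, c)
        col_ctx, _ = self.col_gru(cols)                      # (B*W, H, 2*hidden)
        col_ctx = col_ctx.reshape(b, w, h, -1).permute(0, 3, 2, 1)
        # Fuse row/column context into an attention gate and re-weight x.
        gate = torch.sigmoid(self.attn(torch.cat([row_ctx, col_ctx], dim=1)))
        return x * gate
```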


Abstract

The invention discloses a remote sensing image fishpond extraction method based on a row-column self-attention fully convolutional neural network. The method comprises the following basic steps: calculating the NDWI from a remote sensing image and performing a preliminary classification with it, then producing a remote sensing image sample set and a calibration sample set; constructing the self-attention fully convolutional neural network and training the model, then predicting on the remote sensing image with the trained model to obtain the probability distribution of each pixel's category; and finally fusing the per-pixel category probability distribution with the NDWI classification result to obtain the final classification result. On the basis of a fully convolutional neural network, a row-column bidirectional GRU self-attention model is designed. The method extracts fish ponds from remote sensing images with high accuracy and complete edges, and can be applied to automatic fish pond extraction, change detection and the like.
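As a concrete illustration of the pre-classification and fusion steps named in the abstract, here is a minimal sketch in Python/NumPy. The NDWI formula (Green − NIR) / (Green + NIR) is the standard water index; the thresholds and the exact rule for fusing the NDWI result with the network's per-pixel probabilities are not given in this excerpt, so the intersection rule below is an assumption made for illustration only.

```python
# Illustrative sketch of the NDWI pre-classification and fusion steps.
# Threshold values and the fusion rule are assumptions, not the patented method.
import numpy as np

def ndwi(green: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Water Index from the green and near-infrared bands."""
    return (green - nir) / (green + nir + 1e-8)

def fuse(prob_fishpond: np.ndarray, ndwi_map: np.ndarray,
         prob_thresh: float = 0.5, ndwi_thresh: float = 0.0) -> np.ndarray:
    """Combine the network's fish-pond probability with the NDWI water mask.

    Assumed rule: a pixel is labeled fish pond only where the network is
    confident AND the NDWI indicates water.
    """
    water = ndwi_map > ndwi_thresh
    return (prob_fishpond > prob_thresh) & water
```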

Description

Technical Field [0001] The invention belongs to the field of remote sensing target recognition, and in particular relates to a method for extracting fish ponds from remote sensing images based on a row-column self-attention fully convolutional neural network. Background Technology [0002] In the interpretation of remote sensing images, the key task is the identification of image information. Human perception of the external visual world is a unified whole, including complete perception of the shape, size, color, distance and other properties of each object in the scene within a precise temporal and spatial frame. Remote sensing information mainly consists of the spectral information of ground objects; however, different objects are also clearly distinguishable in remote sensing images because of their different structural shapes and spatial positions. When people make visual judgments, in addition to perceiving differences in hue, they also obtain a great deal of information throug...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G06N3/04
CPC: G06V20/13, G06N3/045, G06F18/214, G06F18/241
Inventors: 曾喆, 游嘉程, 王迪, 黄建华, 刘善伟
Owner: CHINA UNIV OF PETROLEUM (EAST CHINA)