Remote sensing object interpretation method based on focusing weight matrix and variable-scale semantic segmentation neural network

A semantic segmentation and weight matrix technology, applied in the field of image processing, which addresses problems such as inaccurate recognition and achieves the effect of improved accuracy.

Active Publication Date: 2019-11-22
WUHAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0008] In view of this, the present invention provides a remote sensing object interpretation method based on a focusing weight matrix and a variable-scale semantic segmentation neural network, to solve, or at least partially solve, the technical problem of inaccurate recognition in prior-art methods.




Embodiment Construction

[0033] The purpose of the present invention is to solve the technical problem that prior-art methods recognize the spatial relationships of remote sensing objects inaccurately. To this end, the method constructs connections between the semantic description obtained by the LSTM and the object mask maps, and transfers the spatial relationships expressed in the semantic description onto the object mask maps, thereby realizing semantic segmentation of remote sensing objects and end-to-end recognition of their spatial relationships.

[0034] In order to achieve the above object, the main idea of the present invention is as follows:

[0035] By designing a variable-scale semantic interpretation model for remote sensing images based on FCN, U-Net and LSTM networks, the method can generate remote sensing image descriptions at multiple spatial scales while segmenting the objects in the image and identifying their spatial relationships end-to-end. In this method, t...
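The patent text here is truncated and gives no implementation details, but the core idea — transferring a spatial relation stated in the caption onto the segmentation masks — can be illustrated with a toy sketch. All function names and the centroid heuristic below are illustrative assumptions, not the patented algorithm:

```python
import numpy as np

def mask_centroid(mask):
    """Mean (row, col) position of a boolean object mask."""
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

def spatial_relation(mask_a, mask_b):
    """Derive a left/right relation between two object masks by comparing
    centroid columns -- a toy stand-in for transferring the spatial
    relation expressed in the LSTM caption onto the object mask maps."""
    (_, xa), (_, xb) = mask_centroid(mask_a), mask_centroid(mask_b)
    return "left of" if xa < xb else "right of"

# Two tiny synthetic masks: a "road" strip on the left, a "lake" blob on the right.
road = np.zeros((8, 8), dtype=bool); road[:, 1] = True
lake = np.zeros((8, 8), dtype=bool); lake[3:6, 5:8] = True
print("road is", spatial_relation(road, lake), "lake")  # road is left of lake
```

In the actual model, the link between a noun in the description and a mask would come from the LSTM's attention, not from hand-labeled masks as here; the sketch only shows how a relation becomes checkable once masks are associated with nouns.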



Abstract

The invention discloses a remote sensing object interpretation method based on a focusing weight matrix and a variable-scale semantic segmentation neural network. The method comprises the following steps: obtaining and preprocessing data; making a thematic map; cutting the samples; designing a multi-spatial-scale remote sensing image labeling strategy; making labels for the sample set; constructing a multi-scale remote sensing image semantic interpretation model; selecting a training set and a validation set; setting training parameters; training the model; designing a remote sensing object recognition algorithm based on a focusing weight matrix; and verifying and analyzing the effect of the variable-scale remote sensing image semantic interpretation model. By constructing an LSTM, the relation between the nouns in the obtained semantic description and the object mask maps obtained by semantic segmentation is established, and the spatial relationships in the semantic description are transferred between the object mask maps, so that variable-scale semantic segmentation of remote sensing objects and end-to-end identification of spatial relationships are realized, guiding image classification and recognition work in the remote sensing application field toward a higher level.
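As a rough illustration of the "cutting the samples" step at multiple spatial scales, here is a minimal sketch; the tile sizes and the non-overlapping stride are assumptions for illustration, since the patent abstract does not specify them:

```python
import numpy as np

def cut_samples(image, tile_sizes=(64, 128, 256)):
    """Cut one large remote sensing image into square, non-overlapping
    tiles at several spatial scales (sizes are illustrative guesses at
    the multi-spatial-scale sample-cutting strategy)."""
    samples = {}
    h, w = image.shape[:2]
    for s in tile_sizes:
        samples[s] = [image[r:r + s, c:c + s]
                      for r in range(0, h - s + 1, s)
                      for c in range(0, w - s + 1, s)]
    return samples

img = np.zeros((512, 512, 3), dtype=np.uint8)   # stand-in for a scene
sets = cut_samples(img)
print({s: len(t) for s, t in sets.items()})     # {64: 64, 128: 16, 256: 4}
```

Each scale yields its own sample set, which matches the abstract's idea of labeling and interpreting the imagery at more than one spatial scale.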

Description

Technical Field

[0001] The invention relates to the technical field of image processing, and in particular to a remote sensing object interpretation method based on a focusing weight matrix and a variable-scale semantic segmentation neural network.

Background

[0002] Remote sensing image classification and remote sensing object recognition are current research hotspots in remote sensing technology. With the development of artificial intelligence, deep neural networks have been widely applied to high-resolution remote sensing image analysis and have become an effective processing method.

[0003] At present, the traditional LSTM model based on the attention mechanism is mainly applied to the semantic description of ordinary digital images. In the process of implementing the present invention, the inventor of the present application found that the prior-art method has at least the following technical problem:

[0004] Spatial location uncertainty: At d...
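The "LSTM model based on the attention mechanism" mentioned above computes, at each decoding step, a weight over spatial feature positions; the focusing weight matrix in the title plays a similar role. A minimal sketch assuming additive (Bahdanau-style) attention — all parameter names are hypothetical, not taken from the patent:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def focusing_weights(F, h, W_f, W_h, v):
    """Additive attention over k spatial positions.

    F:   (k, d) flattened CNN feature map
    h:   (m,)   LSTM hidden state at the current decoding step
    W_f: (a, d), W_h: (a, m), v: (a,)  -- hypothetical learned parameters
    Returns the flattened focusing weight matrix and the context vector.
    """
    scores = np.tanh(F @ W_f.T + W_h @ h) @ v   # (k,) alignment scores
    alpha = softmax(scores)                     # weights sum to 1
    context = alpha @ F                         # (d,) attended feature summary
    return alpha, context

rng = np.random.default_rng(0)
F = rng.normal(size=(16, 8))                    # a 4x4 feature grid, 8 channels
h = rng.normal(size=4)
alpha, ctx = focusing_weights(F, h,
                              rng.normal(size=(6, 8)),
                              rng.normal(size=(6, 4)),
                              rng.normal(size=6))
print(alpha.reshape(4, 4).round(3))             # the peak marks where the model "focuses"
```

Reshaped to the feature-map grid, `alpha` is a spatial weight matrix; the prior art's limitation, per the passage above, is that such weights describe an image without precisely localizing each named object.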


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/34; G06K9/62
CPC: G06V20/13; G06V10/267; G06F18/214
Inventors: 崔巍, 何新, 姚勐, 王梓溦, 郝元洁, 穆力玮, 马力, 陈先锋, 史燕娟, 胡颖, 申雪皎
Owner: WUHAN UNIV OF TECH