A Recognition Method of Remote Sensing Artificial Objects Based on Object Semantic Tree Model

A technology relating to artificial ground objects and semantic trees, applied in character and pattern recognition, image data processing, instruments, etc.

Active Publication Date: 2011-12-14
济钢防务技术有限公司

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a remote sensing man-made object recognition method based on the object semantic tree model, so as to solve the problem of how to comprehensively utilize image spatial structure and target category semantic information for the automatic recognition of man-made objects in high-resolution remote sensing images.




Detailed Description of the Embodiments

[0067] The present invention is further described below in conjunction with the embodiments and the accompanying drawings.

[0068] Figure 1 is a schematic flow chart of the man-made object recognition method based on the object semantic tree model of the present invention; the specific steps are as follows:

[0069] The first step is to establish a representative image set of high-resolution remote sensing ground objects:

[0070] The images in the remote sensing man-made object dataset are collected from the Internet, with a ground resolution of around 1 meter. The dataset contains eight target categories: aircraft, oil tanks, ships, stadiums, aircraft carriers, buildings, roads, and vegetation, with 200 images per category. Image sizes range from approximately 300×300 to 300×450 pixels, as shown in Figure 3.

[0071] When making a dataset image, it is necessary to mark the actual ground object category (Gro...
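As a rough, non-authoritative illustration of this first step, the representative image set and its ground-truth category labels might be organized as follows; the directory layout, file extension, and the SampleImage/build_representative_set names are assumptions of this sketch rather than anything specified in the patent.

```python
from dataclasses import dataclass
from pathlib import Path

# Eight target categories named in paragraph [0070]; the directory names are assumed.
CATEGORIES = ["aircraft", "oil_tank", "ship", "stadium",
              "aircraft_carrier", "building", "road", "vegetation"]

@dataclass
class SampleImage:
    path: Path      # image file, roughly 300x300 to 300x450 pixels, ~1 m resolution
    category: str   # ground-truth category label assigned when the dataset is made

def build_representative_set(root: Path) -> list[SampleImage]:
    """Collect the labeled representative images, assuming one folder per category."""
    samples = []
    for category in CATEGORIES:
        for img_path in sorted((root / category).glob("*.jpg")):
            samples.append(SampleImage(path=img_path, category=category))
    return samples
```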



Abstract

The invention discloses a remote-sensing artificial ground object recognition method based on an object semantic tree model. The method comprises the steps of: establishing a representative image set of remote-sensing ground objects; segmenting the images in the representative image set with a multi-scale method to obtain an object tree for each image; modeling each node of each object tree with an LDA (latent Dirichlet allocation) method and computing the latent semantic features contained in the tree-node objects; taking the object tree set of all images in the representative set, matching and learning each pair of object trees, and extracting their common maximum sub-trees; merging all the common maximum sub-trees by a step-by-step addition method to form the object semantic tree of the described ground object category; and recognizing artificial ground objects according to the object semantic tree and obtaining the regions where the ground objects are located. The method can effectively process artificial ground objects in high-resolution remote-sensing images; the recognition results are accurate, the robustness is good, the applicability is strong, and manual work is reduced.
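The abstract outlines a learning-then-recognition pipeline built on tree structures. The toy Python sketch below illustrates only the tree-level skeleton of that pipeline (pairwise common maximum sub-trees, step-by-step merging, containment-style matching); the plain string labels, the greedy matching, and every function name here are simplifications assumed for illustration, not the patent's actual segmentation, LDA modeling, or matching procedures.

```python
from dataclasses import dataclass, field
from itertools import combinations

# Toy stand-in for an object-tree node: in the patent each node is a segmented image
# object carrying a latent semantic feature; here a node carries only a label.
@dataclass
class Node:
    label: str
    children: list["Node"] = field(default_factory=list)

def common_max_subtree(a: Node, b: Node) -> Node | None:
    """Greedy toy version of extracting the common maximum sub-tree of two object trees:
    keep a node only when both trees agree on its label, then pair up matching children."""
    if a.label != b.label:
        return None
    kept, used = [], set()
    for ca in a.children:
        for i, cb in enumerate(b.children):
            if i in used:
                continue
            sub = common_max_subtree(ca, cb)
            if sub is not None:
                kept.append(sub)
                used.add(i)
                break
    return Node(a.label, kept)

def merge_two(a: Node, b: Node) -> Node:
    """Union of two trees by label, used for the step-by-step addition of sub-trees."""
    children = {c.label: c for c in a.children}
    for cb in b.children:
        children[cb.label] = merge_two(children[cb.label], cb) if cb.label in children else cb
    return Node(a.label, list(children.values()))

def build_semantic_tree(object_trees: list[Node]) -> Node:
    """Pairwise common maximum sub-trees, then merge them into one object semantic tree."""
    subs = [s for x, y in combinations(object_trees, 2) if (s := common_max_subtree(x, y))]
    tree = subs[0]
    for s in subs[1:]:
        tree = merge_two(tree, s)
    return tree

def contains(tree: Node, pattern: Node) -> bool:
    """Toy recognition test: is the learned semantic tree embedded in an image's object tree?"""
    def rooted(t: Node, p: Node) -> bool:
        return t.label == p.label and all(
            any(rooted(tc, pc) for tc in t.children) for pc in p.children)
    return rooted(tree, pattern) or any(contains(c, pattern) for c in tree.children)

# Tiny usage example with hand-made trees standing in for segmented aircraft scenes.
if __name__ == "__main__":
    t1 = Node("scene", [Node("apron", [Node("fuselage"), Node("wing")]), Node("grass")])
    t2 = Node("scene", [Node("apron", [Node("fuselage"), Node("wing"), Node("tail")])])
    semantic = build_semantic_tree([t1, t2])
    print(contains(t2, semantic))   # True: the test image matches the learned tree
```

In the patented method, each node would carry a latent semantic feature rather than a plain label, and recognition would locate the matching region in the image rather than return a boolean.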

Description

Technical Field

[0001] The invention relates to a target recognition method in the field of remote sensing image information processing, and in particular to a method for recognizing man-made objects in high-resolution remote sensing images by constructing an object semantic tree model of the target, i.e. a man-made object recognition method for high-resolution remote sensing images that comprehensively utilizes image spatial structure and target category semantic information.

Background Technique

[0002] With the rapid development of remote sensing image processing technology, the ground resolution of some satellite images has reached the meter or even centimeter level, which can provide a large amount of multi-temporal and multi-band observation data in a timely and accurate manner. Relying solely on human vision to interpret remote sensing images and to extract and identify the man-made objects in them takes a great deal of time, has a long cycle, and has poor accu...


Application Information

IPC(8): G06K 9/66; G06T 7/00
Inventors: 孙显 (Sun Xian), 付琨 (Fu Kun), 王宏琦 (Wang Hongqi)
Owner: 济钢防务技术有限公司