An Automatic Building Extraction Method Fused with Geometry Perception and Image Understanding

An image-understanding and automatic-extraction technology, applied in biological neural network models, instruments, computing, etc., which can solve the problems of classification results lacking detailed information, increased network computation and parameter counts, and inadequate mining of the spatial geometric information of LiDAR data, thereby improving extraction quality and accuracy and achieving efficient automatic extraction.

Active Publication Date: 2022-06-07
WUHAN UNIV

AI Technical Summary

Problems solved by technology

However, current network learning strategies that use the digital surface model derived from LiDAR data as a network aid or additional image feature for building extraction not only fail to sufficiently mine the spatial geometric information of the LiDAR data, but also greatly increase the computational cost and parameter count of the network.
In addition, the output classification results lack detailed information, which is the main problem faced by fully convolutional neural network structures.



Examples


Detailed Description of the Embodiments

[0037] In order to facilitate the understanding and implementation of the present invention by those of ordinary skill in the art, the present invention is further described below with reference to the accompanying drawings and specific embodiments. It should be understood that the embodiments described herein are intended only to illustrate and explain the present invention, not to limit the invention.

[0038] Referring to Figure 1, the automatic building extraction method fusing geometric perception and image understanding provided by the present invention comprises the following steps:

[0039] Step 1: Select and preprocess remote sensing data to obtain an experimental data set consisting of a first data set, a second data set, a normalized digital surface model, and ground-truth building labels. The first data set comprises multi-band remote sensing images with red, green, and blue bands, and the second data set comprises multi-band remote sensing images with red, green, and near-infrared bands; part of the first data set is used as the training and validation data sets...
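Since the excerpt of Step 1 is truncated here, the following NumPy sketch only illustrates the kind of preprocessing the step describes: deriving a normalized digital surface model as above-ground height, stacking it with the image bands as input channels, and holding out part of the first data set for validation. The function names, the normalization scheme, and the split ratio are assumptions chosen for illustration and are not taken from the patent.

```python
# Hypothetical preprocessing sketch (NumPy only); reading the GeoTIFF rasters
# is assumed to be handled elsewhere (e.g. with rasterio or GDAL).
import numpy as np

def normalize_ndsm(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Normalized digital surface model: above-ground height, scaled to [0, 1]."""
    ndsm = np.clip(dsm - dtm, 0.0, None)          # remove terrain, keep object heights
    return ndsm / max(ndsm.max(), 1e-6)           # simple max-normalization (assumed)

def stack_inputs(bands_rgb: np.ndarray, ndsm: np.ndarray) -> np.ndarray:
    """Stack the multi-band image (3, H, W) with the nDSM (H, W) into a 4-channel input."""
    bands = bands_rgb.astype(np.float32) / 255.0  # assumes 8-bit imagery
    return np.concatenate([bands, ndsm[None]], axis=0)

def split_tiles(tiles: list, val_fraction: float = 0.2, seed: int = 0):
    """Randomly hold out part of the first data set for validation (ratio assumed)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(tiles))
    n_val = int(len(tiles) * val_fraction)
    val = [tiles[i] for i in idx[:n_val]]
    train = [tiles[i] for i in idx[n_val:]]
    return train, val
```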



Abstract

The invention discloses an automatic building extraction method that integrates geometric perception and image understanding. First, remote sensing data are selected and preprocessed, and the resulting multi-band remote sensing images and normalized digital surface model are used as input feature maps for the encoder side of an improved deep residual network, which performs feature learning. The high-level semantic features are then fed into the network's multi-scale efficient perception module, and the low-level semantic features, combined with the high-level semantic features output by that module, are fed into the network decoder to obtain a binary building classification map. Finally, the binary classification map output by the decoder is passed to the network's loss function layer, which "drives" the network to learn the weight parameters that best fit the target task; the network output layer then produces the final building classification map, completing end-to-end automatic building extraction. The present invention achieves the best results in terms of both building extraction accuracy and extraction efficiency.
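As a rough illustration of the pipeline summarized above, the PyTorch sketch below wires together a generic convolutional encoder, a multi-scale module built from parallel dilated convolutions, a decoder that fuses low-level and high-level semantic features, and a cross-entropy loss that "drives" the weight updates. It is not the patented network: the improved deep residual encoder and the "multi-scale efficient perception module" are not disclosed in this excerpt, so the layer choices, channel sizes, and dilation rates here are assumptions.

```python
# Illustrative sketch only, not the patented architecture.
import torch
import torch.nn as nn

def conv_bn_relu(cin, cout, k=3, s=1, d=1):
    p = d * (k // 2)
    return nn.Sequential(nn.Conv2d(cin, cout, k, s, p, dilation=d, bias=False),
                         nn.BatchNorm2d(cout), nn.ReLU(inplace=True))

class MultiScaleModule(nn.Module):
    """Parallel dilated convolutions over high-level features (rates assumed)."""
    def __init__(self, cin, cout, rates=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList([conv_bn_relu(cin, cout, d=r) for r in rates])
        self.project = conv_bn_relu(cout * len(rates), cout, k=1)

    def forward(self, x):
        return self.project(torch.cat([b(x) for b in self.branches], dim=1))

class BuildingExtractionNet(nn.Module):
    """4-channel input (R, G, B + nDSM) -> per-pixel building / background logits."""
    def __init__(self, in_channels=4, num_classes=2):
        super().__init__()
        self.enc1 = conv_bn_relu(in_channels, 64)                           # low-level features
        self.enc2 = nn.Sequential(nn.MaxPool2d(2), conv_bn_relu(64, 128))
        self.enc3 = nn.Sequential(nn.MaxPool2d(2), conv_bn_relu(128, 256))  # high-level features
        self.msm = MultiScaleModule(256, 256)
        self.up = nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False)
        self.dec = conv_bn_relu(256 + 64, 64)                               # fuse low + high level
        self.head = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        low = self.enc1(x)
        high = self.msm(self.enc3(self.enc2(low)))
        fused = self.dec(torch.cat([self.up(high), low], dim=1))
        return self.head(fused)

# Usage sketch: a cross-entropy loss over the binary classification map.
if __name__ == "__main__":
    net = BuildingExtractionNet()
    x = torch.randn(2, 4, 256, 256)                 # batch of 4-channel tiles
    labels = torch.randint(0, 2, (2, 256, 256))     # ground-truth building masks
    loss = nn.CrossEntropyLoss()(net(x), labels)
    loss.backward()
```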

Description

Technical Field

[0001] The invention belongs to the application of deep learning technology in the field of intelligent interpretation of remote sensing images. It relates to an automatic building extraction method integrating geometric perception and image understanding, and in particular to a method for automatically extracting buildings from multi-source remote sensing data (aerial remote sensing images and airborne laser radar data).

Background Technique

[0002] Remote sensing is a modern applied technological science that developed in the 1960s with the advancement of modern science and technology. It acquires data on ground targets with various sensors (cameras, scanners, lidars, etc.) and processes and analyzes the data to obtain useful information about the detected targets, thereby understanding and describing them. According to platform height, remote sensing is generally divided into aerospac...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06V10/774; G06V20/17; G06K9/62; G06N3/04
CPC: G06V20/176; G06V20/194; G06N3/045; G06F18/214
Inventor: 张展, 郑先伟, 龚健雅, 陈晓玲, 徐旭
Owner: WUHAN UNIV