
Boundary extraction method of missing area based on airborne LiDAR point cloud data

A technology for point cloud data processing and boundary extraction, applied in image data processing, image analysis, image enhancement, etc., which addresses problems such as missing areas in acquired data

Active Publication Date: 2022-06-03
TIANJIN UNIV
13 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0004] However, during the acquisition of airborne LiDAR point cloud data, occlusion by surface objects and absorption by water bodies produce missing areas in the acquired data, so the data cannot provide a complete source for subsequent processing and application. The missing areas therefore need to be repaired during subsequent data processing and use.



Examples


Embodiment 1

[0031] A method for boundary extraction of missing areas based on airborne LiDAR point cloud data (see Figure 1) includes the following steps:

[0032] 101: Perform gridding on the original airborne LiDAR point cloud data to obtain a gridded matrix containing missing areas, and define this matrix as the grid; traverse the grid with the seed method to obtain missing regions that are distinguishable from each other;
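The patent describes step 101 in prose only; the following is a minimal Python sketch of the gridding and of a seed (flood-fill) traversal that labels connected empty cells as distinct missing regions. The grid dimensions, array layout, and function names are illustrative assumptions, not details taken from the patent.

```python
from collections import deque

import numpy as np


def grid_point_cloud(points, rows, cols):
    """Project the points onto the XY plane and count how many fall in each grid cell."""
    x, y = points[:, 0], points[:, 1]
    xmin, ymin = x.min(), y.min()
    col_idx = np.minimum(((x - xmin) / (x.max() - xmin) * cols).astype(int), cols - 1)
    row_idx = np.minimum(((y - ymin) / (y.max() - ymin) * rows).astype(int), rows - 1)
    counts = np.zeros((rows, cols), dtype=int)
    np.add.at(counts, (row_idx, col_idx), 1)          # per-cell point counts
    return counts, row_idx, col_idx


def label_missing_regions(counts):
    """Seed (flood-fill) traversal: group 4-connected empty cells into distinct missing regions."""
    rows, cols = counts.shape
    labels = np.zeros_like(counts)
    n_regions = 0
    for r in range(rows):
        for c in range(cols):
            if counts[r, c] == 0 and labels[r, c] == 0:
                n_regions += 1                        # a new seed starts a new missing region
                labels[r, c] = n_regions
                queue = deque([(r, c)])
                while queue:
                    i, j = queue.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and counts[ni, nj] == 0 and labels[ni, nj] == 0):
                            labels[ni, nj] = n_regions
                            queue.append((ni, nj))
    return labels, n_regions
```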

[0033] 102: Expand each missing area to obtain the set of grid cells surrounding it, collect the point cloud data falling in those cells, and take them as the initial boundary feature point set of the missing area;
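Continuing the sketch above, a hedged illustration of step 102: the labeled missing region is expanded by one ring of grid cells (the one-cell expansion width is an assumption; the patent does not state the width in this excerpt), and the points falling in that ring form the initial boundary feature point set.

```python
import numpy as np
from scipy.ndimage import binary_dilation


def initial_boundary_points(points, labels, row_idx, col_idx, region_id):
    """Collect the points lying in the one-cell ring that surrounds a labeled missing region."""
    region = labels == region_id                      # cells belonging to this missing region
    ring = binary_dilation(region) & ~region          # expanded cells minus the region itself
    in_ring = ring[row_idx, col_idx]                  # True for points whose cell lies on the ring
    return points[in_ring]
```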

[0034] 103: Use the Kd-Tree method to establish the topological relationship among the airborne LiDAR point cloud data and obtain the k-neighborhood points of each point to be judged;
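Step 103 can be illustrated with SciPy's Kd-Tree; the choice of library and the value k = 8 are assumptions made for this sketch, not values from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree


def k_neighbourhoods(cloud, candidates, k=8):
    """Return, for each candidate point, the indices of its k nearest neighbours in the cloud."""
    tree = cKDTree(cloud[:, :2])                      # planar Kd-Tree over the whole cloud
    _, idx = tree.query(candidates[:, :2], k=k + 1)   # k + 1 because each point finds itself first
    return idx[:, 1:]                                 # drop the self-match
```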

[0035] 104: Determine whether a point is a boundary feature point from the distribution uniformity of each point in the initial boundary feature point set with respect to its k-neighborhood points; use the nearest-point search method to connect the scattered boundary feature points, obtaining the ordered boundary feature points of the missing areas in the airborne LiDAR point cloud data.
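The patent's exact distribution-uniformity formula is not shown in this excerpt, so the following is only a hedged stand-in: a point is flagged as a boundary feature point when its k nearest neighbours leave a large empty angular sector around it, and the accepted points are then chained by repeated nearest-point search. The 90° threshold and the helper names are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree


def is_boundary_point(p, neighbours, max_gap_deg=90.0):
    """Flag a point whose neighbours leave a large empty angular sector (non-uniform distribution)."""
    angles = np.sort(np.arctan2(neighbours[:, 1] - p[1], neighbours[:, 0] - p[0]))
    gaps = np.diff(np.concatenate([angles, angles[:1] + 2.0 * np.pi]))  # include wrap-around gap
    return np.degrees(gaps.max()) > max_gap_deg


def order_by_nearest_point(points):
    """Connect scattered boundary feature points into an ordered chain by nearest-point search."""
    remaining = list(range(len(points)))
    order = [remaining.pop(0)]                        # start the chain from an arbitrary point
    while remaining:
        tree = cKDTree(points[remaining, :2])
        _, j = tree.query(points[order[-1], :2])      # nearest still-unused point
        order.append(remaining.pop(int(j)))
    return points[order]
```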

Embodiment 2

[0046] This embodiment further describes the scheme of Embodiment 1 with specific calculation formulas and with reference to Figures 2 and 3; see the following description for details:

[0047] 201: Airborne LiDAR point cloud data gridding;

[0048] The main purpose of gridding the point cloud data is to quickly locate the missing areas and obtain the point cloud data surrounding them. The gridding process can be divided into three steps:

[0049] 1) First, project the point cloud data to the plane;

[0050] 2) Determine the grid division scale (the number of rows R and the number of columns C) according to certain conditions;

[0051] 3) Assign each point, according to its position in the plane projection, to the corresponding cell of the grid M. A schematic diagram of point cloud data gridding is shown in Figure 2.

[0052] The most important step in the gridding process is determining the division scale of the grid. When determining the grid division scale, the si...
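Paragraph [0052] is cut off above, so the patent's actual rule for fixing the division scale is not recoverable here. As a hedged illustration only, one common choice is to derive the cell size from the average planar point spacing so that an ordinary cell holds roughly one point; R and C then follow from the bounding box of the projected points.

```python
import numpy as np


def division_scale(points):
    """Derive rows R and columns C from the average planar point spacing (about one point per cell)."""
    x, y = points[:, 0], points[:, 1]
    extent_x, extent_y = x.max() - x.min(), y.max() - y.min()
    spacing = np.sqrt(extent_x * extent_y / len(points))    # mean planar point spacing
    rows = max(1, int(np.ceil(extent_y / spacing)))          # R
    cols = max(1, int(np.ceil(extent_x / spacing)))          # C
    return rows, cols
```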



Abstract

The invention discloses a method for extracting the boundary of a missing area based on airborne LiDAR point cloud data, comprising the following steps: gridding the original airborne LiDAR point cloud data to obtain a gridded matrix with missing areas and defining the matrix as the grid; traversing the grid with the seed method to obtain missing regions that are distinguishable from each other; expanding each missing region to obtain the set of grid cells surrounding it; using the Kd-Tree method to establish the topological relationship among the airborne LiDAR point cloud data and obtain the k-neighborhood points of the points to be judged; judging whether a point is a boundary feature point through the distribution uniformity of the point to be judged and its k-neighborhood points; and using the nearest-point search method to connect the scattered boundary feature points, obtaining the ordered boundary feature points of the missing areas in the airborne LiDAR point cloud data. The invention can obtain boundaries of missing areas in airborne LiDAR point cloud data that are complete in detail and independent of each other.

Description

Technical field

[0001] The invention relates to the field of airborne LiDAR three-dimensional point cloud data processing, in particular to a method for boundary extraction of missing areas based on airborne LiDAR point cloud data.

Background technique

[0002] With the advancement of science and technology and the development of society, the concept of the "digital earth" has penetrated into all walks of life, and people have gradually realized that spatial geographic information is essential to realizing an information society. It can be said that spatial geographic information is the basis of all geographic information applications. At this stage, spatial geographic information is mainly obtained through traditional photogrammetry, but traditional photogrammetry has a slow generation cycle and poor data quality, and can no longer meet growing social needs. Real-time acquisition of spatial geographic information has become a deve...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06T7/13G06T7/181
CPCG06T7/13G06T7/181G06T2207/10028Y02A90/30
Inventor 黄帅张广运黄景金张荣庭周祥周国清
Owner TIANJIN UNIV