Precise registration method of ground laser-point clouds and unmanned aerial vehicle image reconstruction point clouds

A laser point cloud and UAV technology, applied to 3D modeling, image analysis, and image data processing. It addresses the difficulty of registering reconstructed point clouds when ground-based and aerial images are shot from very different angles, which makes it hard to extract feature points of the same name; the method enables multi-angle observation, reduces complexity, and improves registration accuracy and efficiency.

Inactive Publication Date: 2013-12-04
吴立新 +1

AI Technical Summary

Problems solved by technology

The difficulty lies in the large difference between the shooting angles of ground-based and space-based images: conventional image feature point extraction algorithms such as SIFT, PCA-SIFT, and SURF struggle to extract matching feature points of the same name.
Although the Affine Scale Invariant Feature Transform (ASIFT) algorithm has eased the dilemma of automatically matching images with large angle differences, the ground-object content covered by single-scene space-based and ground-based images differs greatly: ground-based images capture information about local entities, while UAV images contain richer ground-object content. Registering ground laser point clouds with UAV image reconstruction point clouds therefore remains very difficult.

Method used



Examples


Embodiment Construction

[0029] The following examples are used to illustrate the present invention, but are not intended to limit the scope of the present invention.

[0030] 1. Definition

[0031] 1.1. Coordinate system definition

[0032] PXCS: image coordinate system (pixel coordinate system), a two-dimensional Cartesian coordinate system in units of image pixels;

[0033] RTCS: imaging plane coordinate system (retinal coordinate system), a two-dimensional coordinate system on the imaging plane, with the principal point of the image as the origin, measured in the camera's physical units;

[0034] CMCS: camera coordinate system, a space rectangular coordinate system with the camera optical center as the origin, the camera optical axis as the Z direction, and the X-Y plane parallel to the imaging plane;

[0035] SOCS: the laser scanner's own coordinate system (scanner's own coordinate system); a space rectangular coordinate system with the laser as the origin, the rotation plane as ...
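The chain PXCS → RTCS → CMCS defined above is the standard pinhole back-projection. As a minimal sketch (the focal lengths and principal point below are hypothetical illustration values, not taken from the patent), a pixel can be mapped through the retinal plane to a viewing ray in the camera coordinate system like this:

```python
import numpy as np

# Hypothetical intrinsics for illustration only.
fx, fy = 2400.0, 2400.0   # focal lengths in pixels
cx, cy = 1296.0, 972.0    # principal point in pixels (PXCS origin offset)

def pixel_to_retinal(u, v):
    """PXCS -> RTCS: shift the origin to the principal point and
    scale by the focal lengths to get normalized imaging-plane coordinates."""
    return ((u - cx) / fx, (v - cy) / fy)

def retinal_to_camera_ray(x, y):
    """RTCS -> CMCS: a point (x, y) on the normalized imaging plane
    corresponds to the unit viewing ray through (x, y, 1) in camera space."""
    ray = np.array([x, y, 1.0])
    return ray / np.linalg.norm(ray)
```

For example, the principal point itself maps to (0, 0) in RTCS and to the ray along the optical axis (the Z direction of CMCS).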



Abstract

The invention relates to a precise registration method of ground laser-point clouds (ground base) and unmanned aerial vehicle image reconstruction point clouds (aerial base). The method comprises generating overlapping areas of the ground laser-point clouds and the unmanned aerial vehicle image reconstruction point clouds on the basis of image three-dimensional reconstruction and point cloud rough registration; then traversing ground base images in the overlapping areas, extracting ground base image feature points through a feature point extraction algorithm, searching for aerial base point clouds in the neighborhood range of the ground base point clouds corresponding to the feature points, and obtaining the aerial base image feature points matched with the aerial base point clouds to establish same-name feature point sets; according to the extracted same-name feature point sets of the ground base images and the aerial base images and a transformation relation between coordinate systems, estimating out a coordinate transformation matrix of the two point clouds to achieve precise registration. According to the precise registration method of the ground laser-point clouds and the unmanned aerial vehicle image reconstruction point clouds, by extracting the same-name feature points of the images corresponding to the ground laser-point clouds and the images corresponding to the unmanned aerial vehicle images, the transformation parameters of the two point cloud data can be obtained indirectly to accordingly improve the precision and the reliability of point cloud registration.

Description

Technical field

[0001] The invention relates to the technical field of geospatial information collaborative observation, in particular to a fine registration method for air and ground three-dimensional point cloud data.

Background technique

[0002] Multi-view 3D reconstruction and air-ground joint monitoring can provide decision support for urbanization management, resource investigation, disaster reduction, emergency response, etc. The 3D model acquisition methods for real objects are mainly divided into active methods and passive methods. The active method is represented by Light Detection And Ranging (LiDAR); the passive method refers to three-dimensional reconstruction based on two-dimensional stereo images. Usually, a ground LiDAR system (terrestrial laser scanning system) can directly acquire ground 3D point cloud data, along with image texture data and station GPS position information. The image-based 3D reconstruction technology has t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T17/00
Inventors: 吴立新, 沈永林
Owner: 吴立新