Three-dimensional point cloud map fusion method based on vision correction

A three-dimensional point cloud map fusion technology in the field of computer vision. It addresses problems such as excessive algorithmic complexity, weight-coefficient overflow, and feature-matching errors, with the effects of improving registration success rate and accuracy, overcoming weight-coefficient errors, and reducing computational cost.

Active Publication Date: 2018-03-20
西安电子科技大学昆山创新研究院

AI Technical Summary

Problems solved by technology

However, the normal vector of a point cloud carries relatively little information, and a change of viewpoint in the point cloud map alters the computed normal vectors, which ultimately degrades the fusion result.
Gelfand provides a method for extracting a point-cloud volume descriptor, but the method does not adapt to viewpoint changes of the point cloud map.
Zou computes the initial value by extracting the Gaussian curvature of the point cloud, but as the viewpoint of the map changes, the Gaussian curvature of the same object changes, which leads to wrong feature matches and hence to fusion failure.
Rusu et al. proposed a method for extracting FPFH features of point clouds, but this method suffers from weight-coefficient overflow and feature-matching errors.
The RANSAC algorithm is another fusion approach. Originally proposed by Fischler et al., it uses random sample consensus to fit a mathematical model. The algorithm is relatively stable, but its worst-case complexity reaches O(n³); when the number of points is huge, the running time of the algorithm becomes unacceptable.
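As an illustration of the random-sample-consensus idea discussed above, the following minimal numpy sketch (not the patent's implementation; all names and thresholds are illustrative) estimates a rigid transform from candidate point correspondences by repeatedly sampling three pairs, solving the Kabsch/SVD alignment, and keeping the hypothesis with the most inliers:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point rows of P onto Q."""
    cp, cq = P.mean(0), Q.mean(0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

def ransac_rigid(src, dst, iters=200, thresh=0.05, seed=0):
    """RANSAC over candidate correspondences src[i] <-> dst[i]."""
    rng = np.random.default_rng(seed)
    best_inliers, best = -1, None
    n = len(src)
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)   # minimal sample
        R, t = kabsch(src[idx], dst[idx])
        err = np.linalg.norm((src @ R.T + t) - dst, axis=1)
        inliers = int((err < thresh).sum())
        if inliers > best_inliers:
            best_inliers, best = inliers, (R, t)
    return best, best_inliers
```

Because only three correspondences are needed per hypothesis, the cost per iteration is constant; the overall cost is governed by the iteration count rather than by the O(n³) worst case mentioned above.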

Method used




Embodiment Construction

[0053] The technical solutions of the present invention will be further described in detail below in conjunction with specific embodiments.

[0054] Referring to figure 1, a 3D point cloud map fusion method based on vision correction comprises the following steps:

[0055] 1) Preprocess the 3D point cloud maps: first input two 3D point cloud map files in pcd format to obtain the two point cloud maps to be fused, namely the target point cloud map point_tgt and the point cloud map to be fused point_src; then preprocess both maps, where preprocessing includes denoising and sampling;
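Denoising and sampling of the kind named in step 1) are commonly realized with a statistical outlier filter and a voxel grid; below is a minimal numpy sketch under those assumptions (function names, the neighbour count k, and the voxel size are illustrative, not taken from the patent):

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is
    more than std_ratio standard deviations above the global mean."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    knn = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)   # skip self (col 0)
    keep = knn < knn.mean() + std_ratio * knn.std()
    return points[keep]

def voxel_downsample(points, voxel_size=0.05):
    """Replace all points falling in one voxel by their centroid."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    np.add.at(sums, inverse, points)                    # sum per voxel
    counts = np.bincount(inverse, minlength=n_voxels)[:, None]
    return sums / counts
```

The brute-force distance matrix keeps the sketch short; a real pipeline would use a k-d tree for the neighbour search.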

[0056] 2) For the two point cloud maps from step 1), extract the 3D-SIFT key points of the target point cloud map point_tgt and of the point cloud map to be fused point_src;

[0057] The specific steps of 3D-SIFT key point extraction are:

[0058] I. Detect extreme points in the point cloud scale space, perform different degrees of downsampling on the point cloud map to c...
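A heavily simplified illustration of scale-space extremum detection (not the patent's exact 3D-SIFT procedure): score each point by the difference of Gaussian-weighted local densities at two scales, and keep points whose score is an extremum among their k nearest neighbours. All parameter values are illustrative:

```python
import numpy as np

def dog_keypoints(points, sigma1=0.05, sigma2=0.1, k=8):
    """Toy 3D difference-of-Gaussians detector: per-point DoG response
    from two density scales; keep local extrema over k neighbours."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    dens1 = np.exp(-d2 / (2 * sigma1 ** 2)).sum(1)   # fine-scale density
    dens2 = np.exp(-d2 / (2 * sigma2 ** 2)).sum(1)   # coarse-scale density
    dog = dens2 - dens1
    nn = np.argsort(d2, axis=1)[:, 1:k + 1]          # k nearest neighbours
    is_max = dog > dog[nn].max(axis=1)
    is_min = dog < dog[nn].min(axis=1)
    return np.flatnonzero(is_max | is_min)
```

Full 3D-SIFT additionally builds the scale pyramid by downsampling, as the step above describes, and refines extremum locations; this sketch only conveys the extremum test.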



Abstract

The invention discloses a three-dimensional point cloud map fusion method based on vision correction, which comprises the steps of: 1) preprocessing the two point cloud maps to be fused; 2) extracting 3D-SIFT key points of the two three-dimensional point cloud maps; 3) extracting an IPFH feature at the 3D-SIFT key points; 4) searching for feature matching points by calculating the Euclidean distance between the feature points; 5) calculating a transformation matrix and rotating the point cloud maps; 6) fusing the two point cloud maps together with an ICP algorithm. According to the invention, the SIFT feature is extended to three-dimensional point clouds and 3D-SIFT key points are extracted, ensuring the robustness of the feature to view-angle rotation and translation; the problem that the weight coefficient of the original FPFH feature is incorrect is overcome by extracting the IPFH feature, and this feature integrates the geometric characteristics of neighborhood points to represent the features of a three-dimensional point, so the stability of the algorithm is greatly improved. With this processing, two three-dimensional point cloud maps with greatly different view angles can be fused together.
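The ICP refinement in step 6) can be sketched as alternating a nearest-neighbour correspondence search with a least-squares rigid alignment; below is a minimal point-to-point version in numpy (illustrative, not the patent's implementation):

```python
import numpy as np

def icp(src, tgt, iters=20):
    """Minimal point-to-point ICP: returns src aligned onto tgt."""
    cur = src.copy()
    for _ in range(iters):
        # 1) nearest neighbour in tgt for every current source point
        d = np.linalg.norm(cur[:, None, :] - tgt[None, :, :], axis=2)
        q = tgt[d.argmin(axis=1)]
        # 2) best rigid transform cur -> q (Kabsch / SVD)
        cc, cq = cur.mean(0), q.mean(0)
        H = (cur - cc).T @ (q - cq)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        cur = (cur - cc) @ R.T + cq   # apply the incremental transform
    return cur
```

ICP only converges from a good initial guess, which is why the method first rotates the maps with the transformation matrix from the feature matches before invoking it.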

Description

technical field [0001] The invention relates to the field of computer vision, in particular to the fusion of three-dimensional point cloud maps, and specifically provides a method based on vision correction that can solve the problem of point cloud map fusion failing because the viewing angles differ too much. Background technique [0002] With the rapid development of stereo cameras, 3D point cloud data has been widely used in computer vision, for example in robot map building, navigation, object recognition, and tracking. When a robot operates in a vast external environment or a complex indoor environment, SLAM map construction can in principle support autonomous navigation, but the complexity and extent of the environment cause various problems, such as map construction taking so long that navigation fails. In such cases, multiple robots need to jointly buil...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/20; G06K9/46
CPC: G06T19/20; G06T2207/10028; G06T2207/20016; G06T2207/20221; G06V10/462
Inventors: 朱光明, 张亮, 沈沛意, 宋娟, 张笑
Owner: 西安电子科技大学昆山创新研究院