
Three-dimensional point cloud matching method and system

A 3D point cloud matching technology applied in the field of image processing. It addresses the problems of existing methods: large feature-point matching errors, complicated registration of color images with depth data, and the inability to guarantee high-precision matching of the final 3D point cloud. The method preserves accuracy without loss, achieves high-precision matching, and improves matching accuracy.

Active Publication Date: 2018-09-07
SHENZHEN LAUNCH DIGITAL TECH

Problems solved by technology

[0003] The embodiments of the present invention provide a 3D point cloud matching method, which aims to solve the following problems of existing methods: large feature-point matching errors during 3D point cloud matching, complicated registration of color images with depth data, and the inability to guarantee high-precision matching of the final 3D point cloud.



Examples


Embodiment 1

[0024] Figure 1 shows a flow chart of the 3D point cloud matching method provided in the first embodiment of the present invention, which is described in detail as follows:

[0025] Step S11: detect feature points in the two captured frames of color images, and pair the feature points to obtain matched color-image feature points;

[0026] Matching the 3D point clouds of color-image feature points is a necessary step for continuous-frame robot motion estimation and loop-closure detection, and its correctness directly affects the accuracy of RGB-D visual SLAM. Before the 3D point clouds of the color-image feature points are matched, feature points must first be detected on two consecutive frames of color images; the feature points on the two frames are then paired, wrongly paired feature points are filtered out, and the correctly matched color-image feature points are retained.
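The patent does not disclose its exact mismatch-filtering step here, but a common way to discard wrongly paired feature points is Lowe's ratio test, which keeps a candidate pair only when its best descriptor distance is clearly smaller than the second-best one. A minimal sketch under that assumption (the candidate data and the 0.75 threshold are illustrative, not from the patent):

```python
def ratio_test_filter(candidates, ratio=0.75):
    """Lowe's ratio test: keep a candidate pair only when its best
    descriptor distance is clearly smaller than the second-best one.
    Each candidate is (idx_in_frame1, idx_in_frame2, best_dist, second_dist)."""
    return [c for c in candidates if c[2] < ratio * c[3]]

# Hypothetical candidate matches between two consecutive frames:
candidates = [
    (0, 3, 0.20, 0.80),  # distinctive -> kept
    (1, 7, 0.55, 0.60),  # ambiguous   -> filtered out
    (2, 1, 0.10, 0.90),  # distinctive -> kept
]
good = ratio_test_filter(candidates)  # 2 matches survive
```

In practice this filter is often followed by a geometric check (e.g. RANSAC on the fundamental matrix) to remove pairs that pass the ratio test but violate epipolar geometry.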

[0027] Preferably, said detecting feature points of th...

Embodiment 2

[0073] Figure 5 shows a structural diagram of the 3D point cloud matching system provided by the second embodiment of the present invention. The 3D point cloud matching system can be applied to various terminals. The 3D point cloud matching system includes: a feature point matching unit 51, a calibration parameter determining unit 52, a correction mapping relationship determining unit 53, and a 3D point cloud matching unit 54, wherein:
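The four units above form a pipeline; the skeleton below is an illustrative sketch of that structure (class and method names are assumptions for exposition, not the patent's implementation):

```python
class PointCloudMatchingSystem:
    """Skeleton mirroring the four units of the second embodiment:
    51 feature point matching, 52 calibration parameter determination,
    53 correction mapping relationship determination, 54 3D point cloud matching."""

    def match_feature_points(self, frame1, frame2):
        # Unit 51: detect feature points in two color frames and pair them.
        raise NotImplementedError

    def determine_calibration_parameters(self):
        # Unit 52: stereo-calibrate the color camera and the depth camera.
        raise NotImplementedError

    def determine_correction_mapping(self, calibration):
        # Unit 53: register color images with the grayscale images
        # corresponding to the depth data to obtain a correction mapping.
        raise NotImplementedError

    def match_point_clouds(self, feature_pairs, mapping):
        # Unit 54: look up the true depth of each matched feature point
        # and compute matched 3D point cloud coordinates.
        raise NotImplementedError

system = PointCloudMatchingSystem()
```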

[0074] The feature point matching unit 51 is used to detect feature points in the two captured frames of color images and pair them to obtain matched color-image feature points;

[0075] Preferably, the feature point matching unit 51 specifically includes:



Abstract

The invention is applicable to the technical field of image processing, and provides a three-dimensional point cloud matching method and system. The three-dimensional point cloud matching method comprises the steps of: detecting feature points of two color images and pairing the feature points; performing stereo calibration on a color camera and a depth camera to obtain calibration parameters; registering the color images with grayscale images corresponding to the depth data according to the calibration parameters to obtain a correction mapping relation; and determining the true depth data of the matched color image feature points according to the correction mapping relation, then calculating and matching the three-dimensional point cloud coordinates of the color image feature points. According to the method, a new mismatch filtering process is added when the feature points of the two color images are matched, so as to improve matching accuracy; a novel stereo-calibration-based scheme is designed for obtaining the true floating-point depth data corresponding to the feature points, which simplifies the mapping process, guarantees that precision is not reduced, and realizes high-precision matching of the three-dimensional point cloud.
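The last step of the pipeline, turning a matched color-image feature point and its registered depth value into 3D point cloud coordinates, is the standard pinhole back-projection, followed by the rigid transform between camera frames given by the stereo-calibration extrinsics. A minimal numpy sketch (the intrinsics fx, fy, cx, cy and the extrinsics R, t below are illustrative values, not from the patent):

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) with metric depth into 3D."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def to_color_frame(p_depth, R, t):
    """Map a 3D point from the depth-camera frame into the color-camera
    frame using stereo-calibration extrinsics (rotation R, translation t)."""
    return R @ p_depth + t

# Illustrative intrinsics and identity extrinsics:
fx = fy = 525.0
cx, cy = 320.0, 240.0
p = backproject(845.0, 240.0, 1.0, fx, fy, cx, cy)   # -> [1.0, 0.0, 1.0]
p_color = to_color_frame(p, np.eye(3), np.zeros(3))
```

Once both frames' matched feature points are back-projected this way, the resulting 3D point pairs form the matched point clouds used for motion estimation.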

Description

technical field

[0001] Embodiments of the present invention belong to the technical field of image processing, and in particular relate to a three-dimensional point cloud matching method and system.

Background technique

[0002] With the rapid development of robot applications and the urgent needs of human production and life, SLAM (Simultaneous Localization and Mapping) has become a research hotspot in the field of machine vision. SLAM refers to building a map of an unknown environment with a mobile robot while simultaneously using that map to determine the robot's pose or trajectory. Visual SLAM, which uses cameras as sensors, has the advantages of low price, rich information, and intuitive mapping, and has become an important branch of SLAM research. Since depth sensors were applied to the field of SLAM, more and more developers have shifted their goals to the field of RGB-D visual SLAM, and the correctness of 3...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/33, G06T7/80
CPC: G06T2207/10028
Inventors: 谷湘煜, 李星明, 王利, 李国胜
Owner: SHENZHEN LAUNCH DIGITAL TECH