
3D point cloud FPFH characteristic-based real-time three dimensional space positioning method

A positioning method for three-dimensional space, applied in the fields of image data processing, instruments, and computing. It addresses problems such as inaccurate positioning results, registration errors, and high space-time complexity, and achieves the effect of reducing space-time complexity.

Active Publication Date: 2017-01-04
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0007] Under poor lighting or in complete darkness, a robot cannot reliably acquire color information, which leads to large registration errors and inaccurate positioning results. To overcome this, the present invention provides a real-time three-dimensional space positioning method based on 3D point cloud FPFH features that is unaffected by lighting conditions, achieves relatively high accuracy, and has low computational time and space complexity. The method registers consecutive frames using the three-dimensional point cloud and three-dimensional local features, directly obtains the three-dimensional position of the mobile robot, and can therefore provide the robot's 3D spatial positioning information in real time. The method can be used for, but is not limited to, vision-based 3D space positioning of mobile robots.
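To make the frame-to-frame idea above concrete, the short sketch below (not taken from the patent; the helper name accumulate_pose and the use of NumPy are assumptions) shows how the 4x4 rigid transform estimated between each pair of consecutive frames can be chained into the robot's pose relative to the initial frame.

import numpy as np

def accumulate_pose(pairwise_transforms):
    # Each T maps the current frame's coordinates into the previous frame's.
    # Composing them in order yields a transform from the current frame into
    # the initial frame, whose translation part is the robot's 3D position.
    pose = np.eye(4)
    for T in pairwise_transforms:
        pose = pose @ T
    return pose  # robot position relative to the start: pose[:3, 3]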


Embodiment Construction

[0018] The present invention will be further described below.

[0019] A real-time three-dimensional spatial positioning method based on 3D point cloud FPFH features, comprising the steps of:

[0020] 1) Obtain 3D point cloud data from the depth camera;

[0021] 2) Point cloud key frame selection: the first frame is taken as a key frame; subsequent key frames are selected by thresholding the number of corresponding points matched after the point clouds have been precisely registered;

[0022] 3) Point cloud preprocessing: first, segment the point cloud so that all candidate planes in it can be identified accurately and in real time; then down-sample and filter the planes with a grid down-sampling method; finally, apply region filtering to eliminate regions containing few key points;

[0023] 4) Feature description: use the ISS algorithm to obtain the key points of the point cloud, and compute the FPFH features of the key points (a hedged code sketch of steps 1) to 4) is given below).
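The following sketch illustrates steps 1) through 4) with the open-source Open3D library (a recent release is assumed). The patent does not name any library, so Open3D, the file path, and all parameter values are assumptions for illustration; key frame selection (step 2) is omitted, and FPFH descriptors are computed over the whole down-sampled cloud here for simplicity, whereas the patent computes them at the ISS key points.

import open3d as o3d

def preprocess_and_describe(path="frame.pcd", voxel=0.01):
    # Step 1): one point cloud frame (read from disk here instead of a live
    # depth camera, purely for illustration).
    pcd = o3d.io.read_point_cloud(path)

    # Step 3): detect the dominant plane with RANSAC, then voxel-grid
    # down-sample. The patent down-samples the detected planes and removes
    # regions with few key points; down-sampling the whole cloud is a
    # simplification.
    plane_model, plane_idx = pcd.segment_plane(
        distance_threshold=0.02, ransac_n=3, num_iterations=100)
    pcd = pcd.voxel_down_sample(voxel)
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=30))

    # Step 4): ISS key points, then 33-dimensional FPFH descriptors.
    keypoints = o3d.geometry.keypoint.compute_iss_keypoints(pcd)
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=10 * voxel, max_nn=100))
    return pcd, keypoints, fpfh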


Abstract

The present invention relates to a 3D point cloud FPFH characteristic-based real-time three-dimensional space positioning method. The method comprises: step 1) obtaining 3D point cloud data from a depth camera; step 2) selecting point cloud key frames; step 3) preprocessing the point cloud; step 4) feature description, using an ISS algorithm to obtain the point cloud key points and computing the FPFH characteristics of the key points; step 5) point cloud registration, first using a sampling consistency initial registration algorithm to carry out FPFH characteristic-based initial registration of two point clouds, and then using an ICP algorithm to carry out secondary registration on the initial registration result; step 6) coordinate transformation, obtaining the transformation matrix of the mobile robot's three-dimensional space coordinates and transforming the coordinates of the current point cloud into the initial position via the transformation matrix; and step 7) repeating steps 1) to 6), calculating the coordinates of the robot relative to the initial position as the robot moves. The method of the present invention achieves good accuracy for real-time positioning of a mobile robot under poor illumination or in complete darkness.
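A hedged sketch of the registration and coordinate-transformation steps 5) and 6) follows, again assuming Open3D: the patent's sampling consistency initial registration (SAC-IA) is approximated here by Open3D's RANSAC feature-matching registration, and all thresholds are illustrative. Step 7) then simply repeats acquisition, description, and registration for every new frame, composing each returned 4x4 matrix onto the accumulated pose exactly as in the earlier composition sketch.

import open3d as o3d

def register_pair(src, tgt, src_fpfh, tgt_fpfh, voxel=0.01):
    # Step 5a): coarse, FPFH-feature-based initial registration (RANSAC here,
    # standing in for the sample consensus initial alignment, SAC-IA).
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, src_fpfh, tgt_fpfh, True, 3 * voxel,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        3, [],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # Step 5b): refine the coarse alignment with point-to-point ICP.
    fine = o3d.pipelines.registration.registration_icp(
        src, tgt, 1.5 * voxel, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())

    # Step 6): the 4x4 matrix maps the current frame (src) into the previous
    # frame (tgt); chaining these per pair gives the pose in the initial frame.
    return fine.transformation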

Description

Technical Field

[0001] The invention relates to the field of mobile robot positioning. It largely compensates for the failure of positioning caused by unstable feature extraction under unstable illumination or insufficient light, and its positioning results are also more real-time and accurate.

Background Technique

[0002] Positioning is the process of determining the position of the robot in its operating environment. More specifically, it processes and transforms information such as the prior environment map, the current robot pose estimate, and sensor observations to yield a more accurate estimate of the robot's current position. Using information sensed by sensors to obtain reliable localization is the most basic and important capability of autonomous mobile robots, and it is also an important and challenging research topic that has attracted much attention in mobile robot research. ...


Application Information

IPC(8): G06T7/00
CPC: G06T2207/10028
Inventors: 张剑华, 邱叶强, 杨焕清, 刘盛, 陈胜勇
Owner: ZHEJIANG UNIV OF TECH