Mobile robot V-SLAM method based on three-stage point cloud registration

A mobile robot point cloud registration technology, applied in instruments, image data processing, 3D modeling, etc., that addresses problems such as large matching errors and the tendency of point cloud registration algorithms to fall into local optima.

Pending Publication Date: 2019-02-05
CHONGQING UNIV OF POSTS & TELECOMM

Problems solved by technology

[0004] The present invention aims to solve the problems of point cloud registration easily falling into local optima and producing large matching errors.

Embodiment Construction

[0045] The technical solutions in the embodiments of the present invention will be described clearly and in detail below with reference to the drawings in the embodiments of the present invention. The described embodiments are only some of the embodiments of the invention.

[0046] The technical solution by which the present invention solves the above problems is as follows:

[0047] As shown in Figure 1, the present invention provides a mobile robot V-SLAM method based on three-stage point cloud registration, which includes the following steps:

[0048] S1. Use the Kinect camera to obtain the RGB color information and Depth information of the surrounding environment;

[0049] S2. Obtain the intrinsic parameter matrix of the Kinect to perform the rigid-body transformation between the RGB camera and the depth camera, and generate three-dimensional point cloud data by combining each pixel's coordinates with its Depth value. Using the pixel coordinates of the...
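The back-projection in S2 follows the standard pinhole camera model once the depth image has been registered to the RGB frame. Below is a minimal sketch; the intrinsic values fx, fy, cx, cy and the millimetre depth scale are placeholder assumptions for illustration, not values taken from the patent.

```python
import numpy as np

# Hypothetical intrinsics; in practice these come from the Kinect's
# intrinsic parameter matrix mentioned in step S2.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5
DEPTH_SCALE = 1000.0  # raw depth assumed to be in millimetres

def depth_to_point_cloud(depth):
    """Back-project a registered depth image into a 3D point cloud using
    the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float32) / DEPTH_SCALE
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # discard pixels with no valid depth reading
```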

Abstract

The invention provides a mobile robot V-SLAM method based on a three-stage point cloud registration method. The V-SLAM method comprises the following steps: S1, acquiring RGB information and Depth information of the surrounding environment; S2, generating three-dimensional point cloud data; S3, extracting ORB image features from the obtained RGB images and matching the point-set elements with FLANN; S4, screening the point pairs of the RGB map with a RANSAC sampling strategy to obtain the inliers, completing the preprocessing stage; S5, completing the initial registration stage of the point cloud with a corresponding-point dual distance threshold method based on rigid-body transformation consistency; S6, introducing a dynamic iterative angle factor ICP fine registration method to complete the fine registration stage once the initial pose is good; S7, introducing a key-frame selection mechanism based on a sliding window and random sampling at the back end, and optimizing the robot pose trajectory with the g2o algorithm, so as to reconstruct the three-dimensional point cloud environment. The invention improves the registration accuracy and registration efficiency of the point cloud map in environmental three-dimensional reconstruction.
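Steps S3 and S4 amount to a standard ORB + FLANN + RANSAC preprocessing pipeline. The sketch below illustrates that pipeline with OpenCV under assumed parameter values; the LSH index settings, the 0.7 ratio-test threshold and the 5-pixel RANSAC reprojection error are illustrative choices, not values specified by the patent.

```python
import cv2
import numpy as np

def match_and_screen(rgb1, rgb2):
    """ORB feature extraction, FLANN matching, and RANSAC inlier screening."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(cv2.cvtColor(rgb1, cv2.COLOR_BGR2GRAY), None)
    kp2, des2 = orb.detectAndCompute(cv2.cvtColor(rgb2, cv2.COLOR_BGR2GRAY), None)

    # FLANN with an LSH index, which handles binary ORB descriptors
    flann = cv2.FlannBasedMatcher(
        dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1),
        dict(checks=50))
    knn = flann.knnMatch(des1, des2, k=2)

    # Lowe ratio test to keep only distinctive matches
    good = [m[0] for m in knn if len(m) == 2 and m[0].distance < 0.7 * m[1].distance]
    if len(good) < 4:
        return []

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # RANSAC screening of the matched point pairs; the mask marks the retained inliers
    _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if mask is None:
        return []
    return [m for m, keep in zip(good, mask.ravel()) if keep]
```

The inliers returned here are the point pairs that survive the preprocessing stage and feed the subsequent initial and fine registration stages described in S5 and S6.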

Description

Technical Field

[0001] The invention belongs to the field of mobile robot V-SLAM, and in particular relates to a method for point cloud map reconstruction.

Background Technique

[0002] In recent years, with the rapid development of mobile robot intelligence and computer vision processing technology, mobile robots have been widely used in fields such as home life, catering, virtual reality and autonomous navigation. Visual simultaneous localization and mapping (V-SLAM) refers to a mobile robot reconstructing its surrounding environment and estimating its own position, using its on-board visual sensor, while it travels. Because visual sensors are low in cost and high in precision, they are used in more and more robot applications; in particular, the emergence of Kinect depth cameras, which capture rich environmental information at a fast update rate, has found good application in mobile robot V-SLAM systems.

[0003] V-SLAM is generally...

Application Information

IPC (8): G06T17/05, G06T17/20, G06T7/33, G01C21/32
CPC: G06T7/33, G06T17/05, G06T17/20, G01C21/32, G06T2207/10028
Inventors: 胡章芳, 漆保凌, 罗元, 张毅, 方继康
Owner: CHONGQING UNIV OF POSTS & TELECOMM