
Indoor mobile robot vision SLAM method based on Kinect

A Kinect-based visual SLAM method for indoor mobile robots, applied in instruments, image data processing, computing, etc.; it addresses problems such as falling into local optima and large matching errors during point cloud registration.

Inactive Publication Date: 2018-01-16
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0004] The present invention aims to solve the problem that the existing ICP algorithm easily falls into local optima and produces large matching errors during point cloud registration.




Embodiment Construction

[0041] The technical solutions in the embodiments of the present invention will be described clearly and in detail below with reference to the drawings in the embodiments of the present invention. The described embodiments are only some of the embodiments of the invention.

[0042] The technical scheme by which the present invention solves the above-described problems is as follows:

[0043] The invention provides a Kinect-based indoor mobile robot visual SLAM method, characterized in that it comprises the following steps:

[0044] S1: Use the Kinect camera to collect color RGB data and depth data of the indoor environment.

[0045] S2: Apply the rotation- and scale-invariant SURF feature to the RGB data to detect image key points: construct the scale space with an image pyramid, locate the key points, determine the main orientation of each key point, and generate a feature descriptor. The FLANN algorithm is then used to achieve fast and e...
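The descriptor matching in S2 can be illustrated with a minimal sketch. FLANN builds approximate nearest-neighbour indexes (e.g. KD-trees); the stand-in below uses brute-force search plus Lowe's ratio test (the 0.7 ratio is a common choice, not a value from the patent), and the function name `match_descriptors` is hypothetical:

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.7):
    """Nearest-neighbour matching with Lowe's ratio test.

    Brute-force stand-in for FLANN's approximate search: for each
    descriptor in desc_a, find its two nearest neighbours in desc_b
    and keep the match only if the best distance is clearly smaller
    than the second-best one.
    """
    # Pairwise Euclidean distances, shape (len(desc_a), len(desc_b)).
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    matches = []
    for i, row in enumerate(d):
        j1, j2 = np.argsort(row)[:2]      # indices of the two nearest neighbours
        if row[j1] < ratio * row[j2]:     # Lowe's ratio test
            matches.append((i, int(j1)))
    return matches
```

The ratio test rejects ambiguous matches where the best and second-best candidates are nearly equidistant, which is the main source of false correspondences between adjacent frames.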



Abstract

The present invention provides a Kinect-based indoor mobile robot vision SLAM method. The method comprises the following steps: S1: acquire color RGB data and depth data of an indoor environment using a Kinect camera; S2: perform feature detection on the RGB data and implement rapid and effective matching between adjacent images; S3: combine the calibrated intrinsic parameters of the Kinect camera with pixel depth values to convert 2D image points into 3D space points, and establish the correspondence between 3D point clouds; S4: use the RANSAC algorithm to eliminate outliers from the point clouds, completing coarse point cloud matching; S5: employ an ICP algorithm with a double constraint of a Euclidean distance threshold and an angle threshold to complete fine matching of the point clouds; S6: introduce a weight into key frame selection and employ the g2o algorithm to optimize the robot pose, finally obtaining the robot's trajectory and generating a 3D point cloud map. The method addresses the tendency of the point cloud registration stage of a visual SLAM system to fall into local optima and produce large matching errors, thereby improving point cloud registration accuracy.
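The 2D-to-3D conversion in step S3 is the standard pinhole back-projection. A minimal sketch, assuming illustrative intrinsic values and a millimetre depth scale (the real values come from the camera calibration described in S3; the name `pixel_to_point` is hypothetical):

```python
import numpy as np

# Illustrative pinhole intrinsics for a Kinect-class RGB-D camera;
# the actual values are obtained from calibration.
FX, FY = 525.0, 525.0   # focal lengths in pixels (assumed)
CX, CY = 319.5, 239.5   # principal point (assumed)
SCALE = 1000.0          # depth map stores millimetres (assumed)

def pixel_to_point(u, v, depth_raw):
    """Back-project one pixel (u, v) with raw depth into the camera frame."""
    z = depth_raw / SCALE          # metric depth
    x = (u - CX) * z / FX          # invert the pinhole projection
    y = (v - CY) * z / FY
    return np.array([x, y, z])
```

Applying this to every pixel with a valid depth reading yields the 3D point cloud whose frame-to-frame correspondence is then refined in steps S4 and S5.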

Description

Technical field

[0001] The invention belongs to the field of mobile robot vision SLAM, and in particular relates to a precise point cloud registration method.

Background technique

[0002] With the continuous deepening of SLAM research, 3D reconstruction has become a research boom in the field of robotics. In essence, all SLAM computation is the processing of sensor data, so the parameterization of the basic equations differs greatly between sensors. Common SLAM sensors include IMUs, lasers, and cameras. An IMU usually comprises a gyroscope and an accelerometer: the gyroscope has good dynamic response but accumulates error over time, while the accelerometer does not accumulate error but is susceptible to vibration during machine operation. Laser sensors offer high precision and a wide detection range, but are expensive and consume a lot of power. Cameras are mainly divided into monocular cameras, binocular cameras, and RGB-D cam...
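The precise registration named in the technical field corresponds to step S5 of the abstract: ICP correspondences are accepted only under a double constraint of Euclidean distance and normal angle. A minimal single-iteration sketch, assuming brute-force correspondences, surface normals as inputs, and illustrative thresholds (the patent's actual threshold values are not given here; `icp_step` is a hypothetical name):

```python
import numpy as np

def icp_step(src, dst, src_n, dst_n, d_max=0.05, ang_max=np.deg2rad(30)):
    """One ICP iteration with dual distance/angle correspondence rejection.

    A pair is kept only if the point-to-point distance is below d_max
    AND the angle between the two unit normals is below ang_max.
    """
    # Brute-force nearest neighbours (a KD-tree would be used in practice).
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
    nn = d.argmin(axis=1)
    dist = d[np.arange(len(src)), nn]
    # Angle between corresponding unit normals.
    cos_a = np.clip(np.einsum('ij,ij->i', src_n, dst_n[nn]), -1.0, 1.0)
    keep = (dist < d_max) & (np.arccos(cos_a) < ang_max)   # dual constraint
    p, q = src[keep], dst[nn[keep]]
    # Closed-form rigid transform from kept pairs (Kabsch / SVD).
    pc, qc = p.mean(0), q.mean(0)
    U, _, Vt = np.linalg.svd((p - pc).T @ (q - qc))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = qc - R @ pc
    return R, t
```

Rejecting pairs by both distance and normal angle discards geometrically implausible correspondences early, which is what keeps the iteration from converging to a local optimum that a distance-only rejection would accept.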

Claims


Application Information

IPC(8): G06T7/33; G06T7/38; G06T7/80; G06T7/55; G06T7/73
Inventor 蔡军陈科宇曹慧英陈晓雷郭鹏
Owner CHONGQING UNIV OF POSTS & TELECOMM