
Kinect-based robot self-positioning method

A robot self-positioning technology, applicable to two-dimensional position/course control and related fields

Active Publication Date: 2015-11-11
Applicant: HANGZHOU JIAZHI TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] The purpose of the invention is to provide a robot self-positioning method based on a Kinect camera, addressing the defects of existing laser-based positioning.

Detailed Description of the Embodiments

[0046] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0047] The Kinect-based robot self-positioning method of the present invention comprises the following steps:

[0048] (1) A Kinect camera is fixedly mounted on the robot and is used to collect RGB image information and depth image information of the environment;
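In practice, step (1) amounts to grabbing synchronized RGB and depth frames from the Kinect. A minimal sketch, assuming an OpenCV build with OpenNI2 support; the patent does not prescribe any particular driver, and libfreenect or OpenNI pipelines would serve equally well, and the variable names here are illustrative, not the patent's:

# Illustrative sketch only: grab one synchronized RGB/depth frame pair from the
# Kinect through OpenCV's OpenNI2 backend.
import cv2
cap = cv2.VideoCapture(cv2.CAP_OPENNI2)          # open the first OpenNI2 (Kinect) device
if not cap.isOpened():
    raise RuntimeError("no OpenNI2-compatible depth camera found")
if cap.grab():                                   # grab one synchronized RGB-D frame
    ok_d, depth_mm = cap.retrieve(None, cv2.CAP_OPENNI_DEPTH_MAP)  # uint16 depth, millimetres
    ok_c, bgr = cap.retrieve(None, cv2.CAP_OPENNI_BGR_IMAGE)       # 8-bit colour image
    if ok_d and ok_c:
        depth_m = depth_mm.astype("float32") / 1000.0              # depth in metres
cap.release()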

[0049] (2) Perform three-dimensional restoration of the depth image information to obtain three-dimensional point cloud data: let the focal lengths of the Kinect depth camera be f_x and f_y and its optical center be (c_x, c_y); the three-dimensional coordinates (X, Y, Z) of any pixel (x, y) on the depth image are restored by formula (1);

[0050] X = (x - c_x) * Z / f_x,  Y = (y - c_y) * Z / f_y,  Z = d(x, y)   (1)

where d(x, y) denotes the depth value measured at pixel (x, y).
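A minimal sketch of formula (1), assuming the depth image is an (H, W) NumPy array in metres (zero where the sensor returned no reading) and that the intrinsics f_x, f_y, c_x, c_y are known; the intrinsic values in the final comment are typical Kinect v1 numbers used for illustration, not values taken from the patent:

import numpy as np
def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    # Back-project every pixel (x, y) with depth Z into camera coordinates
    # using formula (1): X = (x - cx) * Z / fx, Y = (y - cy) * Z / fy.
    h, w = depth_m.shape
    x, y = np.meshgrid(np.arange(w), np.arange(h))
    Z = depth_m
    X = (x - cx) * Z / fx
    Y = (y - cy) * Z / fy
    cloud = np.stack([X, Y, Z], axis=-1).reshape(-1, 3)
    return cloud[cloud[:, 2] > 0]                # drop pixels with no depth reading
# e.g. cloud = depth_to_point_cloud(depth_m, fx=580.0, fy=580.0, cx=319.5, cy=239.5)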

Abstract

The invention discloses a Kinect-based robot self-positioning method. The method comprises the following steps: the RGB image and depth image of the environment are acquired through the Kinect, the relative motion of the robot is estimated by fusing the visual information with a physical odometer, and pose tracking is realized from the pose of the robot at the previous time point; the depth information is converted into a three-dimensional point cloud, a ground plane is extracted from the point cloud, and the height and pitch angle of the Kinect relative to the ground are automatically calibrated from this plane, so that the three-dimensional point cloud can be projected onto the ground plane to obtain a two-dimensional point cloud similar to laser data; the two-dimensional point cloud is then matched against a pre-constructed environment grid map, so that accumulated errors in the robot tracking process are corrected and the pose of the robot is estimated accurately. Because the Kinect replaces a laser scanner for positioning, the cost is low; because image and depth information are fused, the method achieves high precision; and because the method is compatible with laser maps and the mounting height and pose of the Kinect need not be calibrated in advance, it is convenient to use and satisfies the requirements of autonomous robot positioning and navigation.
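The ground-plane extraction, height/pitch calibration, and projection to a laser-like two-dimensional point cloud described above can be sketched as follows. This is an illustrative reconstruction only: RANSAC plane fitting stands in for whatever plane-extraction procedure the patent actually specifies, and all function names, thresholds, and height limits are assumptions.

import numpy as np

def fit_ground_plane_ransac(cloud, iters=200, dist_thresh=0.03, seed=None):
    # Fit the dominant plane n.p + d = 0 by RANSAC; in an indoor scene viewed by
    # a robot-mounted Kinect, the dominant plane is assumed to be the ground.
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = 0, None
    for _ in range(iters):
        p0, p1, p2 = cloud[rng.choice(len(cloud), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                          # degenerate (collinear) sample
            continue
        n /= norm
        d = -np.dot(n, p0)
        inliers = int((np.abs(cloud @ n + d) < dist_thresh).sum())
        if inliers > best_inliers:
            best_inliers, best_plane = inliers, (n, d)
    return best_plane

def calibrate_height_and_pitch(normal, d):
    # The camera origin is (0, 0, 0), so its height above the plane is |d|;
    # the pitch is the angle between the optical axis (z) and the ground plane.
    height = abs(d)
    pitch = np.arcsin(abs(np.clip(np.dot(normal, [0.0, 0.0, 1.0]), -1.0, 1.0)))
    return height, pitch

def project_to_ground(cloud, normal, d, min_h=0.1, max_h=1.5):
    # Keep obstacle points between min_h and max_h above the ground, project
    # them onto the plane, and return laser-like 2D coordinates in that plane.
    if d < 0:                                    # orient the normal toward the camera side
        normal, d = -normal, -d
    heights = cloud @ normal + d                 # signed distance to the ground plane
    obstacles = cloud[(heights > min_h) & (heights < max_h)]
    u = np.cross(normal, [0.0, 0.0, 1.0])        # orthonormal basis (u, v) spanning the plane
    if np.linalg.norm(u) < 1e-6:                 # normal nearly parallel to the optical axis
        u = np.cross(normal, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    proj = obstacles - np.outer(obstacles @ normal + d, normal)
    return np.stack([proj @ u, proj @ v], axis=1)

A scan-matching step against the pre-built environment grid map (not shown here) would then consume this two-dimensional point cloud exactly as laser scan data is consumed, correcting the accumulated tracking error.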

Description

Technical field

[0001] The invention relates to a Kinect-based robot self-positioning method, and belongs to the field of robot self-positioning and navigation.

Background technique

[0002] With the development of computer and robot technology, mobile robots have developed rapidly and been widely applied, penetrating almost every field of social life. Among traditional self-positioning methods, track-based approaches relying on magnetic strips, magnetic nails and similar guides require laying the track, so modifying and maintaining the environment is costly. In trackless self-positioning based on lasers, the laser sensors are usually so expensive that universal application and promotion cannot be realized. Due to technical and cost constraints, there is currently no low-cost and stable solution for trackless robot self-positioning.

[0003] In recent years, low-cost RGB-D cameras such as Kinect have been increasingl...

Claims

Application Information

Patent Type & Authority: Application (China)
IPC (8): G05D1/02
Inventors: 熊蓉 (Xiong Rong), 毛曙源 (Mao Shuyuan)
Owner: HANGZHOU JIAZHI TECH CO LTD