Robot RGB-D SLAM method based on geometric and motion constraints in dynamic environment

A robot RGB-D SLAM technology based on geometric and motion constraints for dynamic environments, applied in instruments, surveying and navigation, image analysis, etc. It addresses problems such as camera pose deviation, enlarged visual SLAM positioning deviation, and map point calculation errors.

Active Publication Date: 2021-02-19
Hangzhou Yuxin Robot Technology Co., Ltd. (杭州宇芯机器人科技有限公司)

AI Technical Summary

Problems solved by technology

For visual SLAM systems based on sparse maps, the challenge of dynamic scenes mainly comes from the inability of traditional visual SLAM to distinguish moving feature points from static ones. When the camera pose is computed, moving feature points are mistakenly treated as static, so the computed camera pose deviates greatly from the actual pose. This in turn leads to incorrect map point calculations, further enlarging the positioning deviation of visual SLAM.


Image

  • Robot RGB-D SLAM method based on geometric and motion constraints in dynamic environment

Examples


Embodiment Construction

[0070] The present invention will be further described below in conjunction with the accompanying drawings.

[0071] Referring to Figure 1 to Figure 5, a robot RGB-D SLAM method based on geometric and motion constraints in a dynamic environment is described. For an indoor dynamic environment, the method includes the following steps:

[0072] Step 1: Calibrate the camera intrinsics (principal point, focal length, and distortion coefficients); the process is as follows:

[0073] Step 1.1: Use the camera to capture multiple images of a fixed-size checkerboard from different viewing angles;

[0074] Step 1.2: Apply Zhang Zhengyou's camera calibration method to the captured checkerboard images to compute the camera's intrinsic parameters, and record the calibration result as K;
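Zhang's method yields an intrinsic matrix K of the standard pinhole form (in practice this step is commonly run with a library such as OpenCV's `cv2.calibrateCamera`). A minimal numpy sketch of what the resulting K looks like and how it maps a camera-frame 3D point to pixel coordinates — the focal lengths and principal point below are made-up example values, not from the patent:

```python
import numpy as np

# Hypothetical intrinsics as produced by Zhang-style calibration
# (fx, fy: focal lengths in pixels; cx, cy: principal point).
fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def project(K, P_cam):
    """Project a 3D point in the camera frame to pixel coordinates."""
    uvw = K @ P_cam
    return uvw[:2] / uvw[2]

# A point 2 m in front of the camera, slightly right and up.
P = np.array([0.2, -0.1, 2.0])
u, v = project(K, P)
```

With a depth image, the same K can be inverted to back-project a pixel plus its depth into camera coordinates, which is how feature points obtain 3D coordinates later in the method.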

[0075] Step 2: Acquire image frames from the video stream sequentially. First build an image pyramid for each acquired frame, and then divide the image into b...
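The text is truncated here, but an ORB front end conventionally builds a scale pyramid (typically a scale factor of about 1.2 over several levels) before extracting features per level. A minimal numpy sketch under those assumptions, using nearest-neighbour resampling for brevity (a real implementation would blur and interpolate):

```python
import numpy as np

def build_pyramid(img, n_levels=8, scale=1.2):
    """ORB-style image pyramid: level i is the input shrunk by scale**i
    (nearest-neighbour resampling for brevity)."""
    levels = [img]
    for i in range(1, n_levels):
        h = int(round(img.shape[0] / scale**i))
        w = int(round(img.shape[1] / scale**i))
        ys = (np.arange(h) * img.shape[0] / h).astype(int)
        xs = (np.arange(w) * img.shape[1] / w).astype(int)
        levels.append(img[np.ix_(ys, xs)])
    return levels

img = np.zeros((480, 640), dtype=np.uint8)  # e.g. a VGA depth/gray frame
pyr = build_pyramid(img)
```

Detecting features at every pyramid level is what gives ORB its scale invariance when matching the same point across frames.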



Abstract

The invention provides a robot RGB-D SLAM method based on geometric and motion constraints in a dynamic environment. The method comprises the following steps: image data, comprising an RGB image and a depth image, are acquired by an RGB-D camera; the coordinates of feature points in the camera coordinate system are obtained from the RGB and depth images via an ORB feature point detection algorithm; the camera pose is calculated using the g2o optimization algorithm; the feature points are transformed from the camera coordinate system into the world coordinate system, and the world coordinates of each point are continuously tracked to obtain multiple observed velocities of that point; whether a point lies on a truly moving object or a static object is determined by analyzing the pattern of these observed velocities; once a point is identified as truly moving, its velocity and world coordinates are filtered with a Kalman filtering algorithm; finally, both moving and static points are added to the g2o pose optimization. The method is effective in dynamic environments and yields a more accurate camera pose.
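The Kalman filtering step above can be illustrated with a constant-velocity model for one coordinate of a tracked map point. This is only a sketch of the general technique, not the patent's exact filter; the frame rate, noise covariances, and the true velocity are assumptions chosen for the example:

```python
import numpy as np

# Constant-velocity Kalman filter for one coordinate of a moving map
# point. State: [position x, velocity v]; we observe position only.
dt = 1.0 / 30.0                         # assumed frame interval (30 fps)
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # measurement model
Q = 1e-4 * np.eye(2)                    # assumed process noise
R = np.array([[1e-2]])                  # assumed measurement noise

def kf_step(x, P, z):
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measured position z.
    y = z - H @ x
    S = H @ P @ H.T + R
    Kg = P @ H.T @ np.linalg.inv(S)
    x = x + (Kg @ y).ravel()
    P = (np.eye(2) - Kg @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0]), np.eye(2)
truth_v = 0.6                           # point moving at 0.6 m/s
for k in range(1, 200):
    z = np.array([truth_v * k * dt])    # noise-free observations for brevity
    x, P = kf_step(x, P, z)
# x[1] now estimates the point's velocity, which the method uses to
# decide whether the point belongs to a truly moving object.
```

Smoothing the per-frame velocity like this is what makes the "law of multiple observation speeds" usable: raw frame-to-frame velocities of a 3D point are noisy, while the filtered velocity separates genuinely moving points from static ones.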

Description

Technical Field

[0001] The invention relates to a robot positioning method in an indoor dynamic environment.

Background Technique

[0002] In research on intelligent navigation technology for autonomous mobile robots, simultaneous localization and mapping (SLAM) of robots in unknown environments is a key technology with both practical and academic value, and it has been a research hotspot in this field for the past two decades. Under this trend, scholars have proposed a variety of methods to solve the SLAM problem and have applied a variety of sensors to address the environmental perception problem in SLAM.

[0003] As an inexpensive sensor that provides a large amount of information, the camera has excellent application prospects in the field of autonomous mobile robots. Visual SLAM mainly solves for the pose transformation of the camera by matching the same feature points across multiple image frames. However, there are still f...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority Applications(China)
IPC (8): G01C 21/20; G06T 7/80
CPC: G01C 21/206; G06T 7/80; G06T 2207/30252
Inventor: Ai Qinglin, Liu Gangjiang (艾青林, 刘刚江)
Owner: Hangzhou Yuxin Robot Technology Co., Ltd. (杭州宇芯机器人科技有限公司)