
SLAM-based visual perception mapping algorithm and mobile robot

A visual perception and robotics technology, applied in the field of visual perception and mobile robots, addressing problems such as the inability to construct large-scale maps, inaccurate pose estimation, and incomplete functionality.

Pending Publication Date: 2020-01-17
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] Some improved SLAM or positioning algorithms have been proposed in the prior art, but their functionality is incomplete. For example, in a real robot and spatial environment, such schemes cannot construct large-scale maps, exhibit large drift, and their pose estimation is not accurate enough. Moreover, existing schemes require map construction to be completed in an environment consisting entirely of static objects.




Embodiment Construction

[0057] Part I, Algorithms

[0058] The invention discloses a SLAM-based visual perception mapping algorithm. First, the image data collected by the depth camera mounted on the robot and the rotational-speed data collected by the encoders installed at the robot's driving wheels are fused. The Kinect 2.0 camera is used as the depth camera, and the collected image data is divided into two parts, a color image and a depth image, which are associated with each other through a unified timestamp; image features are then extracted from the color image. The rotational-speed data is processed according to the kinematic equations and then fused and optimized together with the depth image: by minimizing the encoder and re-projection errors, the pose of the robot at a given moment is obtained, and the pose information is output. The current-frame image data obtained while the robot travels through the area to be mapped is screened, obtaining key ...
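The joint minimization of encoder and re-projection errors described above can be sketched in a toy form. This is a minimal illustrative example, not the patent's actual implementation: it uses a simplified 2D pose (x, y, theta), a bearing-range observation model standing in for image re-projection, and an arbitrary weight on the encoder prior, all of which are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

# Known landmark positions in the world frame (hypothetical values).
landmarks = np.array([[2.0, 1.0], [3.0, -1.0], [4.0, 0.5]])
true_pose = np.array([1.0, 0.2, 0.1])  # (x, y, theta), used to synthesize data

def observe(pose, pts):
    """Bearing-range observations of landmarks from a 2D pose
    (a stand-in for the camera re-projection model)."""
    dx = pts[:, 0] - pose[0]
    dy = pts[:, 1] - pose[1]
    rng = np.hypot(dx, dy)
    brg = np.arctan2(dy, dx) - pose[2]
    return np.concatenate([rng, brg])

meas = observe(true_pose, landmarks)       # noiseless synthetic measurements
odom_pred = np.array([0.95, 0.25, 0.12])   # pose integrated from wheel encoders

def residuals(pose):
    r_proj = observe(pose, landmarks) - meas  # "re-projection" error
    r_enc = 0.5 * (pose - odom_pred)          # encoder error, down-weighted
    return np.concatenate([r_proj, r_enc])

# Jointly minimize both error terms, starting from the encoder prediction.
sol = least_squares(residuals, odom_pred)
```

The optimized pose `sol.x` lands close to the true pose: the landmark residuals dominate, while the encoder term regularizes the estimate toward the odometry prediction, which is the intuition behind fusing the two sensors.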



Abstract

The invention discloses a SLAM-based visual perception mapping algorithm and a mobile robot. The algorithm comprises the steps of: obtaining image data of the environment of the area to be mapped and rotating-speed data of the driving wheels; performing feature extraction on the color image in the image data; after processing the rotating-speed data according to the kinematic equations, fusing and optimizing it with the depth image in the image data by minimizing the encoder and re-projection errors, to obtain the robot's current-frame pose; screening the current-frame image data acquired while the robot travels through the area to be mapped, to obtain key frames; identifying and classifying objects in the environment using the color images of the key frames, deleting the classified dynamic objects from the key frames, sparsifying the static objects, and performing local loop closure for correction; and taking the pixel points of the color images and the corresponding depth images in the key frames as map points, storing the map points in an octree, and building the map.
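The final mapping step above back-projects color/depth pixels into 3D map points and stores them in an octree. The following sketch illustrates both pieces under stated assumptions: the camera intrinsics are typical Kinect-like values (not taken from the patent), and the octree is a minimal toy structure, not the OctoMap-style representation the patent presumably uses.

```python
import numpy as np

class OctreeNode:
    """Minimal octree for storing colored map points (illustrative only)."""
    def __init__(self, center, half, depth, max_depth=4):
        self.center, self.half = center, half
        self.depth, self.max_depth = depth, max_depth
        self.children = None  # dict keyed by octant index
        self.points = []      # filled only at leaf depth

    def insert(self, pt, color):
        if self.depth == self.max_depth:
            self.points.append((pt, color))
            return
        if self.children is None:
            self.children = {}
        # Octant index: one bit per axis, comparing point to node center.
        idx = tuple(int(pt[i] > self.center[i]) for i in range(3))
        if idx not in self.children:
            offset = np.array([(2 * i - 1) * self.half / 2 for i in idx])
            self.children[idx] = OctreeNode(self.center + offset,
                                            self.half / 2,
                                            self.depth + 1, self.max_depth)
        self.children[idx].insert(pt, color)

    def count_leaves(self):
        if self.children is None:
            return 1 if self.points else 0
        return sum(c.count_leaves() for c in self.children.values())

# Back-project one pixel to a 3D map point using the pinhole model.
fx = fy = 525.0               # assumed Kinect-like focal lengths (pixels)
cx, cy = 319.5, 239.5         # assumed principal point
u, v, d = 320, 240, 2.0       # pixel coordinates and depth in metres
z = d
x = (u - cx) * z / fx
y = (v - cy) * z / fy

root = OctreeNode(np.array([0.0, 0.0, 0.0]), half=8.0, depth=0)
root.insert(np.array([x, y, z]), color=(255, 0, 0))
print(root.count_leaves())  # → 1
```

Storing map points in an octree rather than a flat point cloud keeps the map compact (empty space costs nothing) and makes occupancy queries for navigation cheap, which is why octree maps are common in dense SLAM systems.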

Description

Technical field

[0001] The invention relates to the fields of visual perception and mobile robots, and in particular to a SLAM-based visual perception mapping algorithm and a mobile robot.

Background technique

[0002] As an important branch of artificial intelligence, computer vision has gradually moved from theory into practical production applications and is widely used in high-tech products such as face-recognition systems, driverless cars, and drones. SLAM technology has greatly promoted the development of robot visual perception, enabling robots, like humans, to use their "eyes" to explore the surrounding environment and capture information, thereby "knowing" their own environment.

[0003] Some improved SLAM or positioning algorithms have been proposed in the prior art, but their functionality is incomplete. For example, in a real robot and spatial environment, such schemes cannot construct large-scale maps, exhibit large drift, and their pose es...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20; G06T7/70; G06T7/80
CPC: G06T7/20; G06T7/70; G06T7/80
Inventor: 李丰, 蔡述庭, 陈文峰, 徐伟锋
Owner GUANGDONG UNIV OF TECH