
Robot positioning and mapping method and device based on depth image

A robot positioning and mapping technology based on depth images, applied in the computer field, which solves the problem that visual SLAM relocalization easily fails and achieves the effect of improved speed, efficiency, and accuracy.

Active Publication Date: 2020-03-06
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

PTAM meets the real-time requirements of visual SLAM. However, in the process of implementing this application, the inventors found at least the following problem in the prior art: in larger scenes, visual SLAM relocalization based on PTAM is prone to failure. How to accurately position the robot and build a map in such scenes has therefore become an urgent problem to be solved.




Detailed Description of the Embodiments

[0057] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the technical field of this application. The terms used in the specification of the application are for the purpose of describing specific embodiments only and are not intended to limit the application. The terms "comprising" and "having", and any variations thereof, in the description and claims of this application and in the above description of the drawings are intended to cover a non-exclusive inclusion. The terms "first", "second", and the like in the description and claims of the present application or in the above drawings are used to distinguish different objects rather than to describe a specific order.

[0058] Reference herein to an "embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application. T...



Abstract

The invention relates to a robot positioning and mapping method and device based on a depth image, a computer device, and a storage medium. The method comprises the following steps: using an RGB-D camera to detect the surrounding environment and acquire an RGB image and a depth image; determining continuous image frames based on the RGB image and the depth image, and computing an initial pose of the current frame from the continuous image frames with a sparse direct method, which requires few calculations and therefore increases the speed of pose acquisition; refining the initial pose of the current frame with a feature point method to obtain an accurate pose of the current frame, which keeps pose estimation accurate under illumination changes or fast motion; selecting key frames according to the accurate pose of the current frame to obtain a key frame sequence; and performing local mapping and optimization based on the key frame sequence to generate an environment map. The environment map is generated efficiently and accurately, and the efficiency and accuracy of robot positioning and mapping are improved.
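
To make the two-stage pose estimation concrete, the sketch below shows, in Python with OpenCV and NumPy, how a coarse initial pose (such as one produced by a sparse direct method) might be refined with a feature point method: ORB features of the current frame are matched against a reference keyframe, the matched keyframe keypoints are back-projected to 3D using the depth image, and RANSAC PnP is seeded with the initial guess. This is an illustrative sketch rather than the patent's implementation; the function name, the pinhole intrinsics, and the assumption that depth is given in meters are hypothetical.

```python
import cv2
import numpy as np

def refine_pose(rgb_ref, depth_ref, rgb_cur, K, rvec_init, tvec_init):
    """Refine an initial camera pose with a feature point method (sketch).

    rgb_ref, depth_ref   : reference keyframe (BGR image, depth in meters)
    rgb_cur              : current BGR frame
    K                    : 3x3 pinhole intrinsic matrix
    rvec_init, tvec_init : coarse pose guess, e.g. from a sparse direct method
    Returns a refined (rvec, tvec), or the initial guess if refinement fails.
    """
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]

    # 1. Detect and describe ORB features in both frames.
    orb = cv2.ORB_create(nfeatures=1000)
    kp_ref, des_ref = orb.detectAndCompute(cv2.cvtColor(rgb_ref, cv2.COLOR_BGR2GRAY), None)
    kp_cur, des_cur = orb.detectAndCompute(cv2.cvtColor(rgb_cur, cv2.COLOR_BGR2GRAY), None)
    if des_ref is None or des_cur is None:
        return rvec_init, tvec_init

    # 2. Match descriptors (Hamming distance, cross-check for robustness).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_cur)

    # 3. Back-project matched reference keypoints to 3D using the depth image.
    pts_3d, pts_2d = [], []
    for m in matches:
        u, v = kp_ref[m.queryIdx].pt
        ui, vi = int(round(u)), int(round(v))
        if not (0 <= vi < depth_ref.shape[0] and 0 <= ui < depth_ref.shape[1]):
            continue
        z = float(depth_ref[vi, ui])
        if z <= 0.1 or z > 10.0:          # discard invalid / far depth readings
            continue
        pts_3d.append([(u - cx) * z / fx, (v - cy) * z / fy, z])
        pts_2d.append(kp_cur[m.trainIdx].pt)
    if len(pts_3d) < 6:
        return rvec_init, tvec_init

    # 4. RANSAC PnP, seeded with the coarse pose from the direct method.
    rvec, tvec = rvec_init.copy(), tvec_init.copy()
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(pts_3d, dtype=np.float64),
        np.asarray(pts_2d, dtype=np.float64),
        K, None, rvec, tvec,
        useExtrinsicGuess=True,
        reprojectionError=3.0,
        flags=cv2.SOLVEPNP_ITERATIVE)
    return (rvec, tvec) if ok else (rvec_init, tvec_init)
```

Seeding the PnP solver with the direct-method estimate is what lets the feature-based stage converge quickly while staying robust to illumination changes and fast motion.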

Description

Technical Field

[0001] The present application relates to the field of computer technology, and in particular to a robot positioning and mapping method, device, computer equipment, and storage medium.

Background Art

[0002] In recent years, technologies such as unmanned vehicles, robots, drones, and AR/VR have developed rapidly. At the same time, positioning and map construction have become hot research issues and are considered key enabling technologies in these fields. This is because, in an unknown environment, accurate positioning of the robot requires an accurate environment map, while constructing an accurate environment map in turn requires the robot to know its exact location in that environment. SLAM (Simultaneous Localization and Mapping) technology enables robots and other carriers to start from an unknown location in an unknown environment and use a series of onboard sensors (lidar, GPS, IMU, cameras, etc.) to observe the characteristics of the en...
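
As a minimal illustration of the mapping half described above — turning depth measurements into an environment map once the sensor pose is known — the sketch below back-projects an RGB-D frame into a world-frame point cloud with NumPy. The function name, the intrinsic matrix K, and the 4x4 camera-to-world transform T_wc are hypothetical placeholders; a real system would accumulate such clouds only for selected key frames and then optimize them.

```python
import numpy as np

def depth_to_world_points(depth, K, T_wc, stride=4):
    """Back-project a depth image (meters) into a world-frame point cloud.

    depth : HxW depth image in meters (0 where invalid)
    K     : 3x3 pinhole intrinsic matrix
    T_wc  : 4x4 camera-to-world transform of this frame
    stride: subsampling step to keep the cloud small
    Returns an Nx3 array of points expressed in the world frame.
    """
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    h, w = depth.shape

    # Pixel grid (subsampled) and corresponding depth values.
    vs, us = np.mgrid[0:h:stride, 0:w:stride]
    zs = depth[vs, us]
    valid = zs > 0

    # Pinhole back-projection into the camera frame (homogeneous coordinates).
    xs = (us[valid] - cx) * zs[valid] / fx
    ys = (vs[valid] - cy) * zs[valid] / fy
    pts_cam = np.stack([xs, ys, zs[valid], np.ones_like(xs)], axis=1)

    # Transform into the world frame and drop the homogeneous coordinate.
    return (pts_cam @ T_wc.T)[:, :3]

# Hypothetical usage: accumulate clouds from selected key frames into one map.
# cloud = np.vstack([depth_to_world_points(d, K, T) for d, K, T in keyframes])
```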


Application Information

IPC(8): G06K9/00, G06K9/46, G06T17/05
CPC: G06T17/05, G06V20/10, G06V10/44, G06V10/56, Y02T10/40
Inventor: 方宝富, 王浩, 杨静, 韩健英, 韩修萌, 卢德玖
Owner: HEFEI UNIV OF TECH