
Universal visual SLAM (Simultaneous Localization and Mapping) method

A visual positioning and mapping technology, applied in areas such as road network navigation, achieving the effect of low computational complexity

Pending Publication Date: 2022-03-01
Applicant: 太原供水设计研究院有限公司 (Taiyuan Water Supply Design Research Institute Co., Ltd.)

AI Technical Summary

Problems solved by technology

[0004] The present invention addresses the problem of improving the accuracy of localization and map construction in existing multi-sensor-assisted visual SLAM fusion when the satellite positioning system fails and no odometer is available, and provides a general SLAM method and structure that can be used on both human-carried systems and robot systems.

Method used



Examples


Embodiment Construction

[0134] Step 1: First, use the binocular camera to obtain image data, extract and match visual features, and construct the visual reprojection error; at the same time, pre-integrate the data of the inertial measurement unit (IMU) and construct the IMU residual; then combine the visual reprojection error with the IMU residual for tightly coupled optimization over adjacent visual-inertial frames, obtaining the preliminary pose estimate as the observation state.
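The visual half of this step can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes an undistorted pinhole camera model, and the function names and the intrinsics matrix `K` are illustrative.

```python
import numpy as np

def project(K, T_cw, p_w):
    """Project a 3-D world point into the image.

    K    : 3x3 pinhole intrinsics matrix (assumed known from calibration)
    T_cw : 4x4 world-to-camera pose
    p_w  : 3-vector world point
    """
    p_c = (T_cw @ np.append(p_w, 1.0))[:3]   # point in the camera frame
    uv = (K @ (p_c / p_c[2]))[:2]            # perspective division + intrinsics
    return uv

def reprojection_error(K, T_cw, p_w, uv_observed):
    """Pixel residual between the predicted projection and the matched feature.

    Stacking this residual over all matched features (for both cameras of the
    stereo pair) gives the visual term of the tightly coupled optimization.
    """
    return project(K, T_cw, p_w) - uv_observed
```

With the identity pose, a point on the optical axis at depth 2 m projects exactly onto the principal point, so its residual against that observation is zero.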

[0135] The specific process is:

[0136] First, the angular velocity and acceleration of the carrier are obtained from the IMU and pre-integrated, and a residual function is constructed from the pre-integration results; the binocular camera acquires image data, on which feature extraction and matching are performed, and a residual function is constructed from the visual reprojection error; the two are then jointly combined into a tightly coupled optimization residual ...
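The IMU pre-integration mentioned above can be sketched as accumulating the relative rotation, velocity, and position between two camera frames from the raw gyroscope and accelerometer samples. This is a simplified illustration under stated assumptions: noise, sensor biases, and gravity compensation are omitted (in a full formulation they enter the residual model), and the function names are illustrative.

```python
import numpy as np

def exp_so3(w):
    """Rodrigues' formula: map a rotation vector to a rotation matrix."""
    th = np.linalg.norm(w)
    if th < 1e-9:
        return np.eye(3)
    k = w / th
    Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * Kx + (1 - np.cos(th)) * (Kx @ Kx)

def preintegrate(omegas, accels, dt):
    """Accumulate relative motion between two keyframes from IMU samples.

    omegas, accels : lists of 3-vectors (body-frame rate and acceleration)
    dt             : sample period in seconds
    Returns (dR, dv, dp): relative rotation, velocity, and position increments
    expressed in the body frame of the first sample. Bias and gravity terms
    are omitted for brevity.
    """
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(omegas, accels):
        a_body0 = dR @ a                          # rotate sample into start frame
        dp = dp + dv * dt + 0.5 * a_body0 * dt**2  # position increment
        dv = dv + a_body0 * dt                     # velocity increment
        dR = dR @ exp_so3(w * dt)                  # rotation increment
    return dR, dv, dp
```

The difference between these increments and the increments implied by two optimized poses forms the IMU residual that is stacked with the reprojection error in the tightly coupled optimization.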



Abstract

The invention relates to a universal visual SLAM (Simultaneous Localization and Mapping) method, which comprises the following steps: first, fuse the data of a binocular camera and an inertial measurement unit (IMU) in a tightly coupled manner to obtain pose information; second, when the satellite positioning signal is unavailable, use the IMU-predicted pose as the prediction state; when the satellite positioning signal is available, fuse the IMU and satellite positioning data in a loosely coupled manner via an extended Kalman filter (EKF), and use the resulting predicted pose as the prediction state; then, update the prediction state with the observation state through the EKF to obtain new pose information; finally, construct a map from the new pose information and the image feature depth map generated by the binocular camera, completing the SLAM algorithm. This IMU- and satellite-assisted general visual SLAM method is suitable for mounting on human backpack systems and on robot systems without an odometer, can be used both indoors and outdoors, and meets the requirements of complex scenes.
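The loosely coupled predict/update cycle described in the abstract can be sketched with a textbook EKF. This is a minimal illustration, not the patent's filter: the state is reduced to position and velocity, attitude is ignored, and the noise matrices `Q` and `R` are assumed placeholder values.

```python
import numpy as np

class LooseEKF:
    """Minimal loosely coupled IMU/GNSS fusion sketch.

    State x = [position (3), velocity (3)]. The IMU drives the prediction
    step; a GNSS position fix, when available, drives the update step.
    """

    def __init__(self):
        self.x = np.zeros(6)
        self.P = np.eye(6)              # state covariance
        self.Q = np.eye(6) * 1e-3       # process noise (assumed value)
        self.R = np.eye(3) * 0.5        # GNSS position noise (assumed value)

    def predict(self, accel, dt):
        """Propagate the state with a constant-acceleration motion model."""
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt      # position integrates velocity
        self.x = F @ self.x
        self.x[3:] += accel * dt        # velocity integrates acceleration
        self.P = F @ self.P @ F.T + self.Q

    def update(self, gnss_pos):
        """Correct the prediction with a GNSS position measurement."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])  # we observe position only
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ (gnss_pos - H @ self.x)
        self.P = (np.eye(6) - K @ H) @ self.P
```

When no GNSS fix is available, only `predict` runs, which matches the abstract's fallback to IMU-only prediction; each fix then pulls the estimate partway toward the measured position according to the gain.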

Description

Technical field

[0001] The invention relates to the field of simultaneous localization and map construction.

Background technique

[0002] SLAM (Simultaneous Localization and Mapping) is a very important part of robot autonomous navigation. It can be used in navigation applications for human-carried systems and robot systems, and is a key technology for ultimately realizing fully autonomous mobile robot systems. Visual SLAM uses camera vision sensors to collect environmental image information for target positioning and recognition; because images contain rich information, visual SLAM has become an integral part of many SLAM applications.

[0003] However, as application scenarios grow more complex, a single visual sensor is subject to many restrictions, and multi-sensor fusion is required to solve the SLAM problem. ...

Claims


Application Information

IPC (8): G01C21/32
CPC: G01C21/32
Inventors: 付世沫, 常青, 王耀力 (Fu Shimo, Chang Qing, Wang Yaoli)
Owner: 太原供水设计研究院有限公司 (Taiyuan Water Supply Design Research Institute Co., Ltd.)