
Multi-camera vision SLAM method based on observability optimization

A multi-camera vision and camera technology, applied in image analysis, instrumentation, and computing. The method addresses the weak ability of existing approaches to recover real-world scale, and achieves improved feature matching, improved accuracy and reliability, and robust real-time positioning.

Active Publication Date: 2021-08-13
BEIJING INSTITUTE OF TECHNOLOGY

Problems solved by technology

However, existing multi-camera visual SLAM methods use the panoramic perception ability of a multi-camera vision system only to track more map points within the traditional visual SLAM algorithm framework, introducing baseline constraints in pose estimation and map-point construction. The characteristics of the multi-camera vision system in the visual SLAM problem are not fully analyzed and exploited, and the ability to recover real-world scale remains weak.
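To illustrate why a known inter-camera baseline matters for scale: a single moving camera recovers structure only up to an unknown scale factor, whereas a calibrated baseline between two cameras fixes metric depth by triangulation. The sketch below uses the standard stereo depth relation z = f·b/d; it is an illustrative assumption for context, not the patent's specific formulation.

```python
# Illustrative only: metric depth from a known baseline via the standard
# stereo triangulation relation, not the patent's specific method.
def metric_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth z = f * b / d. A known baseline b (in meters) fixes the
    real-world scale that a single moving camera cannot observe alone."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: f = 700 px, b = 0.45 m (the ~45 cm baseline described in the
# embodiment), disparity of 35 px gives a metric depth of 9.0 m.
print(metric_depth(700.0, 0.45, 35.0))
```

The focal length and disparity values here are placeholders chosen for a round-number example; only the baseline echoes a figure stated elsewhere in the document.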


Image

Smart Image Click on the blue labels to locate them in the text.
Viewing Examples
Smart Image
  • Multi-camera vision SLAM method based on observability optimization
  • Multi-camera vision SLAM method based on observability optimization
  • Multi-camera vision SLAM method based on observability optimization

Examples


Embodiment 1

[0059] The accompanying drawings provided in this embodiment only illustrate the basic idea of the present invention and show only the components related to this embodiment; they are not drawn according to the actual number and size of components. The shape and quantity of each component may vary in actual implementation.

[0060] This embodiment provides a multi-camera visual SLAM method based on observability optimization. The hardware structure used in the method is shown in Figure 1: a multi-camera vision system composed of 6 cameras, with a baseline length of about 45 cm between adjacent cameras. Figure 2 is a schematic diagram of the installation and arrangement of the multi-camera vision system on the carrier; the horizontal field of view of each camera is about 120°. According to the requirements of the multi-camera vision system in the present inve...
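A quick geometric check of the stated configuration: 6 cameras spaced evenly around a ring (60° apart) with ~120° horizontal fields of view cover every azimuth at least twice, which is what enables panoramic perception with inter-camera overlap. The sketch below assumes an idealized, evenly spaced ring layout; the patent text does not specify the exact mounting angles.

```python
# Illustrative check (assumes an ideal evenly spaced ring, which the patent
# text does not explicitly state): how many cameras see each azimuth?
def ring_coverage(n_cams: int, fov_deg: float, step_deg: float = 1.0) -> int:
    """Return the minimum number of cameras covering any azimuth direction."""
    spacing = 360.0 / n_cams
    centers = [i * spacing for i in range(n_cams)]

    def angdiff(a: float, b: float) -> float:
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    min_cover = n_cams
    az = 0.0
    while az < 360.0:
        seen = sum(1 for c in centers if angdiff(az, c) <= fov_deg / 2.0)
        min_cover = min(min_cover, seen)
        az += step_deg
    return min_cover

# 6 cameras, 120 deg FOV each: every direction is seen by at least 2 cameras.
print(ring_coverage(6, 120.0))
```

This double coverage is what makes inter-camera feature matching (and hence baseline-constrained triangulation) possible in every direction.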



Abstract

The invention discloses a multi-camera vision SLAM method based on observability optimization. A multi-camera vision system is designed according to the observability of multi-camera vision SLAM, so that the scale of positioning information can be recovered more effectively when the system is used on a ground unmanned platform following an Ackermann motion model. The method fully utilizes the baseline constraints and panoramic perception capability of a multi-camera visual system, optimizes according to observability, and addresses the autonomous navigation of a ground unmanned platform in an unknown environment. It overcomes the weak scale-recovery ability of existing visual SLAM methods and their inability to position robustly in environments with sparse or unevenly distributed texture features. The trend of the real-time positioning quality can be evaluated directly from the observability-based design, and the required viewing-angle difference between keyframes is judged on that basis, improving the reliability of visual SLAM so that a ground unmanned platform can navigate autonomously, accurately, and reliably at real-world scale.
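The abstract's keyframe criterion, judging whether the viewing-angle difference between keyframes is sufficient, can be sketched with a standard parallax check: triangulation of a map point is well-conditioned only if the rays from the two camera poses subtend a large enough angle at the point. The threshold value and function names below are assumptions for illustration, not the patent's exact criterion.

```python
# A minimal sketch (assumed implementation, not the patent's exact rule):
# accept a new keyframe only when the viewing-angle difference to a map
# point exceeds a threshold, keeping triangulation well-conditioned.
import math

def viewing_angle_deg(point, cam_a, cam_b) -> float:
    """Angle at the 3-D point between the rays to the two camera centers."""
    ra = [p - c for p, c in zip(point, cam_a)]
    rb = [p - c for p, c in zip(point, cam_b)]
    dot = sum(x * y for x, y in zip(ra, rb))
    na = math.sqrt(sum(x * x for x in ra))
    nb = math.sqrt(sum(x * x for x in rb))
    # Clamp to guard against floating-point values just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def accept_keyframe(point, last_kf_center, cur_center, min_angle_deg=2.0) -> bool:
    return viewing_angle_deg(point, last_kf_center, cur_center) >= min_angle_deg

# A point 10 m ahead with camera centers 0.5 m apart laterally subtends
# roughly 2.9 deg of parallax, which passes a 2-degree threshold.
print(accept_keyframe((0.0, 0.0, 10.0), (0.0, 0.0, 0.0), (0.5, 0.0, 0.0)))
```

In practice such a threshold trades off keyframe density against triangulation accuracy; the patent's contribution is choosing it from an observability analysis rather than a fixed heuristic.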

Description

technical field

[0001] The invention belongs to the field of computer vision positioning and relates to an indirect multi-camera visual SLAM method based on improving visual SLAM observability.

Background technique

[0002] With the continuous development of technology and the growing demands of work and life, unmanned driving has become a research hot spot. Researchers hope that unmanned driving technology will change how vehicles are controlled and achieve safer, more efficient driving in urban traffic, logistics, and other scenarios. At present, unmanned vehicles can achieve all-weather, centimeter-level autonomous navigation when high-precision maps are available as prior information. When high-precision maps are not available, an integrated navigation scheme combining satellite navigation and inertial navigation systems can also achieve high-precision positioning in open environments. However, the cost of surveying and m...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/80; G06T17/05
CPC: G06T7/85; G06T17/05; Y02T10/40
Inventor: 杨毅, 唐笛, 梁浩, 王俊博, 潘淼鑫, 王涛
Owner: BEIJING INSTITUTE OF TECHNOLOGY