Panoramic vision SLAM method based on multi-camera cooperation

A panoramic vision and multi-camera technology, applied to computer components, image data processing, 3D modeling, and related fields. It addresses problems such as a limited perception field of view, poor robustness to lighting changes and occlusion, and poor positioning accuracy in weak-texture environments, achieving efficient and robust operation and improved operating efficiency.

Active Publication Date: 2019-07-30
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0006] In view of this, the purpose of the present invention is to provide a panoramic visual SLAM method based on multi-camera collaboration, which can solve the problems of existing pure visual SLAM systems, such as a limited perceptual field of view, poor positioning accuracy in weak-texture environments, lack of map scale information, and poor resistance to lighting changes and occlusion, and provide strong technical support for autonomous navigation of unmanned platforms in unstructured environments.

Method used

Figure 1 shows the structure of the SLAM system with five cameras provided by the present invention; Figure 2 is a schematic diagram of the distribution of the multi-camera system; Figure 3 is a schematic diagram of the parameter relationships and derived formulas under the non-parallel camera structure.


Examples


Embodiment 1

[0074] This embodiment provides a panoramic vision SLAM method based on multi-camera collaboration. As shown in Figure 1, the figure shows the structure of the SLAM system with 5 cameras. Figure 2 is a schematic diagram of the distribution of the multi-camera system. Given a camera horizontal field of view of 120 degrees and β = 90 degrees, the camera-count formula of the present invention gives N ≥ 4.5; taking N = 5, the angle between adjacent cameras is 72 degrees. According to Figure 3, a schematic diagram of the parameter relationships and derived formulas under the non-parallel camera structure, when the perceived environment range is given as 5-10 m, the baseline length can be set to 30 cm. Laying out the cameras with these parameters allows the environment to be perceived more effectively and accurately, yielding more accurate localization and mapping results from the SLAM system.
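The camera-count figures above can be sanity-checked with a short script. The patent's exact formula is not reproduced in this excerpt, so the sketch below uses a generic ring-coverage bound, N ≥ 360°/θ_eff, where θ_eff is the effective (non-redundant) horizontal angle each camera contributes after reserving overlap with its neighbours for cooperative observation; θ_eff = 80° is back-computed here so that 360/80 = 4.5 matches the stated N ≥ 4.5. The function name and that value are assumptions for illustration, not the patent's derivation.

```python
import math

def camera_count(effective_coverage_deg: float) -> int:
    """Minimum number of cameras for full 360-degree horizontal coverage
    (generic ring-coverage bound, not the patent's exact formula)."""
    return math.ceil(360.0 / effective_coverage_deg)

# Embodiment 1 values: horizontal FOV alpha = 120 deg, beta = 90 deg.
alpha = 120.0
theta_eff = 80.0  # assumed effective coverage; 360/80 = 4.5 matches N >= 4.5

n = camera_count(theta_eff)   # -> 5 cameras
spacing = 360.0 / n           # -> 72 deg between adjacent optical axes
overlap = alpha - spacing     # -> 48 deg shared with each neighbour

print(f"N >= {360.0 / theta_eff:.1f}, take N = {n}; "
      f"adjacent-camera angle = {spacing:.0f} deg; "
      f"pairwise overlap = {overlap:.0f} deg")
```

On the baseline choice: in standard stereo triangulation, depth uncertainty grows roughly as Z²/(f·B) for scene depth Z, focal length f, and baseline B, so a longer baseline such as the 30 cm chosen here improves depth accuracy over the stated 5-10 m working range. The patent's own derivation for the non-parallel camera structure is given in its Figure 3 and is not reproduced in this excerpt.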

[0075] As shown in Figure 4...



Abstract

The invention discloses a SLAM method based on multi-camera cooperation. 360-degree observation of the detection environment is realized through a plurality of cameras whose fields of view supplement one another. Each camera independently acquires and analyzes data; meanwhile, during map construction and motion tracking, the effective cameras collaboratively complete the generation of shared map points with scale information and joint pose estimation. Through the cooperation of the imaging and structural characteristics of the multiple cameras, the map construction and high-precision positioning tasks of the panoramic SLAM system can be completed. The method solves the problems of existing pure visual SLAM systems, such as a limited perception field of view, poor positioning precision in weak-texture environments, lack of map scale information, and weak resistance to illumination and occlusion. At the same time, the cameras are independent of one another, so the failure of any single camera does not affect the normal operation of the system; the system is therefore robust to conditions such as object occlusion, direct sunlight, and lens damage, providing a technical guarantee for autonomous navigation of unmanned platforms.
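Below is a highly simplified sketch of the cooperative structure the abstract describes: each camera tracks independently, the currently effective cameras contribute observations to a shared map, and the pose is estimated jointly, so the failure of any single camera leaves the system running. All class and function names are illustrative assumptions; the actual feature tracking, cross-camera triangulation, and pose optimization are elided.

```python
from dataclasses import dataclass, field

@dataclass
class CameraTracker:
    cam_id: int
    healthy: bool = True  # a failed camera is simply skipped

    def track(self, frame):
        """Per-camera acquisition and analysis; returns observations or None."""
        if not self.healthy or frame is None:
            return None
        return {"cam": self.cam_id, "features": frame}

@dataclass
class SharedMap:
    points: list = field(default_factory=list)

    def add_observations(self, obs):
        # Shared map points would be triangulated across overlapping
        # cameras here, inheriting metric scale from the known baseline.
        self.points.append(obs)

def joint_pose_estimate(observations):
    """Fuse whichever cameras are effective this frame (placeholder)."""
    used = [o for o in observations if o is not None]
    return {"n_cams_used": len(used)}  # stand-in for an optimized pose

trackers = [CameraTracker(i) for i in range(5)]
trackers[2].healthy = False  # e.g. lens damage: the system keeps working
shared_map = SharedMap()

frames = ["frame"] * 5
obs = [t.track(f) for t, f in zip(trackers, frames)]
for o in obs:
    if o is not None:
        shared_map.add_observations(o)
print(joint_pose_estimate(obs))  # {'n_cams_used': 4} -- tracking continues
```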

Description

Technical field

[0001] The invention belongs to the technical field of computer vision positioning, and specifically relates to a panoramic vision SLAM method based on multi-camera collaboration.

Background technique

[0002] With the rapid development of artificial intelligence technology, research in fields such as unmanned driving, virtual reality, and face recognition has become a current hot spot. Among them, unmanned driving has achieved lane-level positioning in some urban environments with known maps; however, when driving on unstructured roads in unknown environments, where signal-source positioning sensors such as GPS, BeiDou, and Galileo cannot be used, how to achieve autonomous map construction and precise positioning remains one of the difficulties in this field.

[0003] Simultaneous Localization and Mapping (SLAM) refers to the unmanned platform using sensors such as cameras, lidar, odometers, inertial sensors, and GPS and BeiDou with signal so...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/05, G06T7/246, G06T7/50, G06T7/80, G06K9/00, G06K9/46
CPC: G06T17/05, G06T7/246, G06T7/50, G06T7/80, G06T2207/10028, G06T2207/20084, G06V20/10, G06V10/44, G06V10/462
Inventor: 杨毅, 王冬生, 唐笛, 邓汉秋, 王美玲, 付梦印
Owner: BEIJING INSTITUTE OF TECHNOLOGY