Multi-Camera/Lidar/IMU-based multi-sensor SLAM method

A multi-camera, multi-sensor technology applied in the field of multi-sensor SLAM, which solves the problem that a single sensor cannot cope with complex outdoor environments and achieves a more accurate lidar frame pose and improved robustness.

Active Publication Date: 2020-11-24
HANGZHOU GUANGPO INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0006] In view of the fact that a single sensor currently cannot meet the demands of complex outdoor environments, the present invention proposes a multi-sensor SLAM method based on Multi-Camera / Lidar / IMU.

Method used



Examples


Embodiment Construction

[0057] The present invention will be described in detail below in conjunction with the specific embodiments shown in the accompanying drawings, but these embodiments do not limit the present invention; structural, methodological, or functional changes made by those of ordinary skill in the art on the basis of these embodiments all fall within the scope of protection of the present invention.

[0058] As shown in Figure 1, the multi-sensor SLAM method based on Multi-Camera / Lidar / IMU includes the following steps:

[0059] S1: The multi-view image data obtained by the multi-view camera and the data obtained by the IMU (inertial measurement unit) are tightly coupled and jointly initialized to obtain the initial pose of the system;

[0060] Obtaining the initial pose of the system includes the following steps:

[0061] S11: Hard-synchronously trigger the multi-view cameras via a Raspberry Pi;

[0062] S12: Obtain multi-view image data and IMU data from the multi-view camera, and align the multi-view image... (a simplified alignment sketch follows this list)
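As an illustration of the hard synchronization and data alignment described in S11 and S12, the following is a minimal sketch, not the patented implementation: it assumes each Raspberry Pi trigger stamps all camera images with one shared timestamp, and it simply groups the higher-rate IMU samples that fall between two consecutive synchronized frames, which is the raw material a tightly coupled initializer would pre-integrate. The struct names, fields, and data rates are illustrative assumptions.

```cpp
// Illustrative sketch only (not the patented method): group IMU samples that
// fall between consecutive hard-synchronized multi-view frames. Struct names,
// fields and the synthetic data in main() are assumptions for demonstration.
#include <cstdio>
#include <vector>

struct ImuSample {       // one inertial measurement
    double t;            // timestamp in seconds
    double gyro[3];      // angular velocity (rad/s)
    double accel[3];     // linear acceleration (m/s^2)
};

struct MultiViewFrame {  // all camera images captured on one hardware trigger
    double t;            // shared trigger timestamp in seconds
    // image buffers omitted
};

// Collect, for every pair of consecutive frames, the IMU samples whose
// timestamps lie in (t_i, t_{i+1}]; a tightly coupled initializer would
// pre-integrate each bucket between the two corresponding image frames.
std::vector<std::vector<ImuSample>> GroupImuByFrame(
    const std::vector<MultiViewFrame>& frames,
    const std::vector<ImuSample>& imu) {
    std::vector<std::vector<ImuSample>> buckets;
    std::size_t j = 0;
    for (std::size_t i = 0; i + 1 < frames.size(); ++i) {
        while (j < imu.size() && imu[j].t <= frames[i].t) ++j;  // skip older samples
        std::vector<ImuSample> bucket;
        while (j < imu.size() && imu[j].t <= frames[i + 1].t) bucket.push_back(imu[j++]);
        buckets.push_back(std::move(bucket));
    }
    return buckets;
}

int main() {
    // Synthetic data: 10 Hz camera triggers, 100 Hz IMU.
    std::vector<MultiViewFrame> frames;
    for (int i = 0; i < 5; ++i) frames.push_back({i * 0.1});
    std::vector<ImuSample> imu;
    for (int i = 0; i < 50; ++i) imu.push_back({i * 0.01, {0, 0, 0}, {0, 0, 9.81}});
    auto buckets = GroupImuByFrame(frames, imu);
    for (std::size_t i = 0; i < buckets.size(); ++i)
        std::printf("interval %zu: %zu IMU samples\n", i, buckets[i].size());
    return 0;
}
```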



Abstract

The invention relates to a Multi-Camera/Lidar/IMU-based multi-sensor SLAM method, which comprises the following steps: performing tightly coupled joint initialization on the data of a plurality of images obtained by a multi-view camera and the data obtained by an IMU inertial measurement unit to obtain an initial pose of the system; acquiring point cloud data of a laser radar frame by a laser radar sensor, preprocessing the point cloud data, and dividing the point cloud into strong corner points, weak corner points, strong plane points and weak plane points; optimizing the pose of the laser radar frame through the initial pose of the system; further optimizing the pose of the laser radar frame through closed-loop detection; and carrying out map splicing by using the optimized laser radar frame pose. According to the invention, the data obtained by the three sensors are fused, the obtained laser radar frame pose is more accurate, and the robustness and stability of the positioning system are effectively improved.
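The preprocessing step named in the abstract, dividing the point cloud into strong corner points, weak corner points, strong plane points and weak plane points, is commonly done with a local curvature measure along each scan line, in the style of LOAM-family methods. The sketch below shows one such classification; the neighbourhood size, the thresholds and the synthetic scan in main() are placeholder assumptions, not values taken from the patent.

```cpp
// Illustrative sketch only: classify lidar scan-line points into strong/weak
// corner and plane features by local curvature. Thresholds are placeholders.
#include <cstdio>
#include <vector>

struct Point3 { double x, y, z; };

enum class FeatureClass { StrongCorner, WeakCorner, StrongPlane, WeakPlane };

const char* Name(FeatureClass c) {
    switch (c) {
        case FeatureClass::StrongCorner: return "strong corner";
        case FeatureClass::WeakCorner:   return "weak corner";
        case FeatureClass::StrongPlane:  return "strong plane";
        default:                         return "weak plane";
    }
}

// Curvature of point i: squared norm of the summed differences to its k
// neighbours on each side along the scan line (larger = sharper geometry).
double Curvature(const std::vector<Point3>& scan, int i, int k) {
    double dx = 0, dy = 0, dz = 0;
    for (int d = -k; d <= k; ++d) {
        if (d == 0) continue;
        dx += scan[i + d].x - scan[i].x;
        dy += scan[i + d].y - scan[i].y;
        dz += scan[i + d].z - scan[i].z;
    }
    return dx * dx + dy * dy + dz * dz;
}

std::vector<FeatureClass> ClassifyScan(const std::vector<Point3>& scan) {
    const int k = 5;                   // neighbours per side (assumed)
    const double kStrongCorner = 1.0;  // placeholder thresholds
    const double kWeakCorner   = 0.1;
    const double kWeakPlane    = 0.01;
    std::vector<FeatureClass> labels(scan.size(), FeatureClass::StrongPlane);
    for (int i = k; i + k < static_cast<int>(scan.size()); ++i) {
        const double c = Curvature(scan, i, k);
        if (c > kStrongCorner)    labels[i] = FeatureClass::StrongCorner;
        else if (c > kWeakCorner) labels[i] = FeatureClass::WeakCorner;
        else if (c > kWeakPlane)  labels[i] = FeatureClass::WeakPlane;
        else                      labels[i] = FeatureClass::StrongPlane;
    }
    return labels;
}

int main() {
    // Synthetic scan line: two flat walls meeting at a sharp corner.
    std::vector<Point3> scan;
    for (int i = 0; i < 40; ++i)
        scan.push_back(i < 20 ? Point3{0.05 * i, 1.0, 0.0}
                              : Point3{1.0, 1.0 + 0.05 * (i - 20), 0.0});
    auto labels = ClassifyScan(scan);
    for (std::size_t i = 0; i < labels.size(); ++i)
        std::printf("%2zu: %s\n", i, Name(labels[i]));
    return 0;
}
```

In this illustration, points whose curvature exceeds the higher threshold become strong corners, points just above the lower threshold become weak corners, and nearly flat neighbourhoods split into weak and strong plane points; a downstream pose optimization would then match corner points to edges and plane points to planes.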

Description

Technical Field

[0001] The invention relates to technical fields such as computer vision and robotics, and in particular to a multi-sensor SLAM method based on Multi-Camera / Lidar / IMU.

Background Technique

[0002] In recent years, the autonomous driving industry has gradually entered the public eye. SLAM enables mobile vehicles to localize themselves in environments where GPS cannot work normally, and to track and identify dynamic vehicles and pedestrians, making intelligent obstacle avoidance, assisted driving and autonomous navigation possible.

[0003] For positioning and mapping, SLAM mainly uses sensors to obtain raw data, assisted by an IMU. At the current stage of development, SLAM has two implementation forms: one is laser SLAM based on lidar; the other is visual SLAM based on cameras. Laser SLAM is currently mainly used in indoor and smaller environments. Visual SLAM is more widely applicable and more adaptable to outdoor environments, but it is hi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01S17/931, G01S17/894, G01S17/86, G01C21/16, G01C22/00
CPC: G01S17/931, G01S17/894, G01S17/86, G01C21/165, G01C22/00, Y02T10/40
Inventors: 杨建松, 余小欢, 陈嵩, 白云峰, 张合勇
Owner: HANGZHOU GUANGPO INTELLIGENT TECH CO LTD