
Laser vision strong coupling SLAM method based on adaptive factor graph

An adaptive factor graph and strong coupling technology, applied to the re-radiation of electromagnetic waves, measuring devices, instruments, etc. It addresses problems such as the low utilization of feature information, the insufficiently close integration of the vision system and the radar system, and the inability to dynamically adjust the multi-sensor data fusion method, achieving the effects of improved accuracy, stability, and strong robustness.

Pending Publication Date: 2022-02-08
HARBIN ENG UNIV +1

AI Technical Summary

Problems solved by technology

[0008] The invention addresses the problems that the vision system and the radar system are not tightly enough integrated, that the fusion mode of multi-sensor data cannot be adjusted dynamically, and that the utilization of feature information is low.



Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0065] Embodiment 1 is described with reference to Figure 1. The laser vision strong coupling SLAM method based on an adaptive factor graph described in this embodiment includes:

[0066] Lidar module, monocular camera module, IMU module, laser loop detection unit, factor graph optimizer unit, monocular loop detection unit and map storage module;

[0067] The lidar module includes a laser odometer unit and a scene detection unit;

[0068] The monocular camera module includes a scene detection unit and a reprojection error calculation unit;

[0069] The IMU module includes an IMU pre-integration unit;

[0070] Four robot scenes are defined, and according to the collected lidar and monocular camera data, the scene detection module judges the scene in which the robot is currently located;

[0071] The data received by the IMU is preprocessed according to the scene in which the robot is located, and the relative pose of the robot between two frames of lidar data is calculat...
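The IMU step in [0071] can be illustrated with a minimal pre-integration sketch. This is a hypothetical simplification, not the patent's method: it integrates gyroscope and accelerometer samples between two lidar frames into a relative rotation and translation, ignoring gravity compensation and the bias/noise model that a real pre-integration unit would estimate.

```python
import numpy as np

def preintegrate_imu(gyro, accel, dt):
    """Sketch: integrate IMU samples between two lidar frames into a
    relative rotation R and translation p (gravity and biases ignored)."""
    R = np.eye(3)       # relative rotation of the current sample frame
    v = np.zeros(3)     # velocity accumulated in the first frame
    p = np.zeros(3)     # relative translation
    for w, a in zip(gyro, accel):
        # position and velocity updates use the acceleration rotated
        # into the first frame
        p = p + v * dt + 0.5 * (R @ a) * dt ** 2
        v = v + (R @ a) * dt
        # rotation update via Rodrigues' formula for the increment w*dt
        theta = np.linalg.norm(w) * dt
        if theta > 1e-12:
            axis = w / np.linalg.norm(w)
            K = np.array([[0.0, -axis[2], axis[1]],
                          [axis[2], 0.0, -axis[0]],
                          [-axis[1], axis[0], 0.0]])
            dR = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
            R = R @ dR
    return R, p
```

A stationary IMU (all-zero samples) yields the identity rotation and zero translation, as expected for no motion between the two frames.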

Embodiment 2

[0076] Embodiment 2 is described with reference to Figure 1. This embodiment further limits the laser vision strong coupling SLAM method based on the adaptive factor graph described in Embodiment 1. In this embodiment, judging the current scene of the robot by the scene detection module according to the collected lidar and monocular camera data includes:

[0077] Convert the collected 3D lidar data into a height map;

[0078] Project the 3D point cloud onto an X-Y plane grid map. Where multiple points fall in the same grid cell, the point with the highest Z value is retained as the value of that cell, and a quadratic curve is fitted to the retained grid points by the RANSAC method;

[0079] Delete grid cells that do not contain projected points;

[0080] Calculate the space area S enclosed by the fitted quadratic curve:

[0081] If the space area S is smaller than the threshold, it is determined that the robot is in an indoo...
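The projection and area test in [0077]-[0081] can be sketched as follows. This is an illustrative simplification under stated assumptions: the patent fits a quadratic curve with RANSAC and measures the area it encloses, whereas here the total area of occupied grid cells stands in for S; the cell size and threshold are made-up parameters.

```python
import numpy as np

def classify_scene(points, cell=0.5, area_threshold=200.0):
    """Sketch: project a 3-D point cloud (N x 3 array) onto an X-Y grid,
    keep the highest Z per occupied cell, drop empty cells, and compare
    the occupied footprint area S against a threshold."""
    # grid indices of each point on the X-Y plane
    ij = np.floor(points[:, :2] / cell).astype(int)
    grid = {}
    for key, z in zip(map(tuple, ij), points[:, 2]):
        # for multiple points in one cell, keep the highest Z value
        grid[key] = max(grid.get(key, -np.inf), z)
    # cells without projected points were never created, so they are
    # implicitly "deleted"; S approximates the enclosed space area
    S = len(grid) * cell * cell
    return "indoor" if S < area_threshold else "outdoor"
```

A cloud confined to a few square metres yields a small S and is classified as indoor; a cloud spread over a large open area exceeds the threshold and is classified as outdoor.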

Embodiment 3

[0083] Embodiment 3 is described with reference to Figure 2 and Figure 3. This embodiment further limits the laser vision strong coupling SLAM method based on an adaptive factor graph described in Embodiment 2. In this embodiment, when the robot is in an outdoor scene, it is determined whether there is enough structural information in the scene to ensure the stability of feature point extraction:

[0084] The point cloud data is subjected to object segmentation processing;

[0085] If any two adjacent laser points fall on the same object surface, then the tangent direction of the line segment formed by the two points should point toward the lidar, with included angle β:

[0086]

[0087] where points A and B are two adjacent laser points, and point O is the lidar;

[0088] If the included angle β is greater than the set threshold, the two adjacent laser point cloud data are on the same object;

[0089] If the included ang...
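The formula for β is not reproduced in this excerpt ([0086] is empty), so the following is only a plausible reconstruction of the adjacent-point angle test, using the common range-image segmentation form β = atan2(d_B·sin α, d_A − d_B·cos α) for ranges d_A = |OA|, d_B = |OB| separated by the beam angle α. The function names and the 10° threshold are illustrative assumptions, not the patent's values.

```python
import math

def included_angle(d_a, d_b, alpha):
    """Hypothetical reconstruction: angle beta between segment AB and the
    ray OB, for two adjacent returns at ranges d_a = |OA|, d_b = |OB|
    separated by beam angle alpha (radians)."""
    return math.atan2(d_b * math.sin(alpha),
                      d_a - d_b * math.cos(alpha))

def same_object(d_a, d_b, alpha, beta_threshold=math.radians(10)):
    """If beta exceeds the threshold, the two points are treated as lying
    on the same object surface; a small beta indicates a depth jump."""
    return included_angle(d_a, d_b, alpha) > beta_threshold
```

Two returns at nearly equal range give β close to 90°, so they are grouped on one surface; a large range discontinuity gives a β near 0°, splitting the points into different objects.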



Abstract

The invention discloses a laser vision strong coupling SLAM method based on an adaptive factor graph, and relates to the field of laser vision strong coupling. At present, the visual system and the radar system are not tightly combined, the fusion mode of multi-sensor data cannot be dynamically adjusted, and the utilization of feature information is low. The method provided by the invention comprises the following steps: defining four robot scenes, and judging the current scene of the robot through a scene detection module according to collected laser radar and monocular camera data; preprocessing the data received by the IMU according to the scene in which the robot is located, and calculating the relative pose of the robot between two frames of laser radar data; enabling the laser odometer module to use different inter-frame matching modes according to the scene in which the robot is located to obtain the robot pose between two frames; and enabling the monocular camera module to collect feature point information and carry out reprojection error calculation. The method is suitable for autonomous positioning and environmental perception of a robot in combined indoor and outdoor scenes without GPS information.

Description

technical field [0001] The invention relates to the field of laser vision strong coupling, in particular to a laser vision strong coupling SLAM method based on an adaptive factor graph. Background technique [0002] In existing SLAM methods based on the fusion of lidar and IMU, the distance information collected by the lidar sensor has high accuracy and long range. Combined with the rotation information provided by the IMU, positioning is accurate and map quality is high in outdoor scenes. However, lidar sensors do not perform well in some degraded scenes, such as grasslands, sports fields, and other open outdoor scenes that lack geometric features; feature points cannot be accurately extracted, resulting in inaccurate positioning. Moreover, in smaller scenes such as indoors, the points collected by the lidar may be concentrated on objects close to the lidar, resulting in the inability to completely collect the information...

Claims


Application Information

Patent Timeline
IPC(8): G01C21/00, G01C21/16, G01S17/86, G01S17/89, G01S17/08, G06K9/62
CPC: G01C21/3841, G01C21/1652, G01S17/86, G01S17/89, G01S17/08, G06F18/23, G06F18/241, G06F18/251, Y02T10/40
Inventor 王桐, 王承端, 刘佳豪, 高山, 汪畅
Owner HARBIN ENG UNIV