Binocular inertial simultaneous localization and mapping method based on point-line feature fusion

A feature-fusion and map-construction technology, applied in image enhancement, image analysis, 3D modeling, etc. It addresses problems such as adjacent similar line segments, long line segments being easily divided into multiple short segments, numerous outliers, and repeated line-segment detection, achieving improved line-segment extraction quality, high-precision pose estimation, and a reduced false-matching rate.

Active Publication Date: 2021-05-14
BEIJING INSTITUTE OF TECHNOLOGY
View PDF | 6 Cites | 4 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] Most traditional methods use the LSD (Line Segment Detector) algorithm as the tool for line-feature detection. However, the original purpose of the LSD algorithm is to characterize the structural features of a scene, so its extraction speed is slow, and without parameter tuning it easily detects too many short-segment features on noisy images with complex backgrounds. This not only wastes computing resources on line-segment detection, description, and matching, but also tends to produce more outliers, causing a significant drop in positioning accuracy.
In addition, the LSD algorithm commonly suffers from repeated line-segment detection and over-segmentation: too many adjacent similar line segments are produced, and long line segments are easily divided into multiple short segments, which complicates the subsequent line-segment matching task and thereby increases the uncertainty of the SLAM system.
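The length-suppression strategy described later in this document can be sketched as a simple post-filter on LSD-style detector output. A minimal sketch in Python, assuming line segments are given as endpoint tuples; the 30-pixel threshold is a hypothetical choice, not a value from the patent:

```python
import math

def suppress_short_segments(segments, min_len=30.0):
    """Keep only segments whose Euclidean length is at least min_len.

    segments: list of (x1, y1, x2, y2) endpoint tuples, as produced by an
    LSD-style line-segment detector. min_len is a hypothetical pixel
    threshold for length suppression.
    """
    kept = []
    for x1, y1, x2, y2 in segments:
        if math.hypot(x2 - x1, y2 - y1) >= min_len:
            kept.append((x1, y1, x2, y2))
    return kept

# Usage: the 5-pixel segment is suppressed, the longer ones survive.
segs = [(0, 0, 5, 0), (0, 0, 100, 0), (10, 10, 10, 80)]
print(suppress_short_segments(segs))
```

Discarding short segments up front avoids spending description and matching time on features that are likely to become outliers.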

Method used


Image

  • Binocular inertial simultaneous localization and mapping method based on point-line feature fusion

Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0155] The simulation experiments are carried out on public datasets, namely the most difficult sub-sequences of the four scenes in the EuRoC dataset [31]: MH_04_difficult, MH_05_difficult, V1_03_difficult, and V2_03_difficult.

[0156] In step S1, point features and line features are detected and tracked. For point features, corner points are extracted as feature points, the KLT optical flow method is used to track them, and points with large discrepancies are eliminated based on reverse optical-flow tracking.
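The reverse optical-flow check above can be sketched as a forward-backward consistency filter. A minimal sketch in Python, assuming the forward and backward KLT passes have already been run (e.g. with OpenCV's `calcOpticalFlowPyrLK`) and their results are supplied as point arrays; the 1-pixel tolerance is a hypothetical value:

```python
import numpy as np

def forward_backward_filter(pts0, pts1, pts0_back, max_err=1.0):
    """Reject tracks whose reverse-flow endpoint drifts from the start point.

    pts0: (N, 2) feature positions in frame k.
    pts1: (N, 2) forward-tracked positions in frame k+1.
    pts0_back: (N, 2) positions of pts1 tracked back into frame k.
    max_err: hypothetical pixel tolerance for the round-trip error.
    Returns the surviving points in both frames and the boolean mask.
    """
    err = np.linalg.norm(pts0 - pts0_back, axis=1)
    mask = err <= max_err
    return pts0[mask], pts1[mask], mask

# Usage: the second track drifts 3 px on the round trip and is rejected.
pts0 = np.array([[0.0, 0.0], [10.0, 10.0]])
pts1 = pts0 + 1.0
pts0_back = np.array([[0.0, 0.0], [13.0, 10.0]])
kept0, kept1, mask = forward_backward_filter(pts0, pts1, pts0_back)
print(mask)
```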

[0157] In line feature detection and tracking, line features are extracted through the following steps:

[0158] S101, image grayscaling: the color image input by the sensor is converted into a grayscale image by taking a weighted average of the sampled values of each channel;

[0159] S102, noise reduction: a Gaussian filter is used to filter out noise and smooth the image...
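Steps S101 and S102 can be sketched in a few lines of NumPy. This is a minimal sketch, not the patent's implementation: it assumes the common ITU-R BT.601 channel weights for the weighted average (the patent does not state its weights) and a separable Gaussian kernel whose size and sigma are hypothetical:

```python
import numpy as np

def to_gray(rgb):
    """Weighted average of R, G, B samples (assumed BT.601 weights)."""
    w = np.array([0.299, 0.587, 0.114])
    return rgb @ w

def gaussian_kernel(size=5, sigma=1.0):
    """Normalized 1-D Gaussian kernel."""
    ax = np.arange(size) - size // 2
    k = np.exp(-ax**2 / (2.0 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, size=5, sigma=1.0):
    """Separable Gaussian smoothing: filter rows, then columns."""
    k = gaussian_kernel(size, sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)
```

Because the kernel is normalized, interior pixels of a constant image are unchanged by the blur, which is a quick sanity check for the filter.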

Experimental Example 1

[0221] The per-frame time consumption and number of extracted line features in Example 1, Comparative Example 1, and Comparative Example 2 were counted, and the results are shown in Table 1.

[0222] Table 1

[0223]

[0224] The machine-hall extraction results on the EuRoC dataset are shown in Figures 4-6, where Figure 4 shows the extraction result of Comparative Example 1, Figure 5 that of Comparative Example 2, and Figure 6 that of Example 1. From Table 1 and Figures 4-6 it can be seen that Comparative Example 1 extracts the largest number of line segments, followed by Comparative Example 2, with Example 1 extracting the fewest. Comparative Examples 1 and 2 contain a large number of short line-segment features, which increases the computational cost of line-segment detection and matching, and the division of long line segments into many short segments together with the presence of adjacent similar segments complica...

Experimental Example 2

[0227] The motion-estimation errors of Example 1, Comparative Example 3, and Comparative Example 4 were counted. Algorithm accuracy is evaluated by the absolute trajectory error, i.e., the root mean square error (RMSE) and the maximum of the Euclidean distances between the estimated poses and the true poses. When comparing against the ground-truth trajectory, the EVO (evaluation of odometry and SLAM) tool is used for data alignment and error calculation, where the RMSE is the final error obtained by considering both translation and rotation.
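The core of the absolute-trajectory-error metric can be sketched as follows. This is a translation-only sketch that assumes the two trajectories have already been aligned (the alignment and the rotational component are handled by the EVO tool in the document's evaluation):

```python
import numpy as np

def ate_rmse(est, gt):
    """RMSE and maximum of Euclidean distances between aligned trajectories.

    est, gt: (N, 3) arrays of estimated and ground-truth positions,
    assumed to be time-associated and already aligned.
    """
    d = np.linalg.norm(est - gt, axis=1)
    return float(np.sqrt(np.mean(d**2))), float(d.max())

# Usage: a constant 1 m offset along x gives RMSE = max = 1.0.
gt = np.zeros((4, 3))
est = gt.copy()
est[:, 0] = 1.0
print(ate_rmse(est, gt))
```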

[0228] The results are shown in Table 2.

[0229] Table 2

[0230]

[0231] It can be seen from Table 2 that, owing to the high quality of line-feature extraction in Example 1, its positioning accuracy is better than that of Comparative Example 3 and Comparative Example 4. Except for the two simple sequences MH_01_easy and MH_02_easy, where the accuracy is slightly lower than tha...



Abstract

The invention discloses a binocular inertial simultaneous localization and mapping method based on point-line feature fusion. Length-suppression, near-line merging, and broken-line splicing steps are added after line-feature extraction, and point, line, and IMU data are effectively fused in an optimization-based sliding window. The method solves the low precision and instability of point-feature visual SLAM algorithms in weak-texture environments and achieves higher-precision pose estimation. The length-suppression, near-line merging, and broken-line splicing strategies guarantee fast extraction while overcoming the poor line-segment extraction quality of traditional algorithms, reduce the mismatching rate of the system's line features, and allow the method to better adapt to indoor weak-texture and texture-free scenes.

Description

Technical field

[0001] The invention relates to a simultaneous localization and map construction method, in particular to a binocular inertial simultaneous localization and mapping method based on point-line feature fusion, and belongs to the technical field of robot control.

Background technique

[0002] Simultaneous Localization and Mapping (SLAM) is considered the core technology for realizing the autonomous operation of mobile robots and has been widely applied in fields such as drones, unmanned vehicles, and virtual reality. In indoor environments, since buildings block GPS signals, drone positioning mostly relies on SLAM technology. To overcome the limited accuracy of any single sensor, a multi-sensor fusion strategy is often used; visual-inertial fusion is one effective approach, and both the camera and the inertial measurement unit (Inertial Measurement Unit, IMU) are lightweight and low-cost, whic...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/05; G06T3/40; G06T5/50; G06T7/13; G06T7/181; G06T7/246; G06T7/269; G06T7/73
CPC: G06T17/05; G06T5/50; G06T3/4038; G06T7/13; G06T7/181; G06T7/73; G06T7/246; G06T7/269; G06T2207/20164; G06T2207/20192; G06T2200/08; G06T2200/32; G06T2207/20221
Inventor: 赵良玉, 金瑞, 朱叶青
Owner: BEIJING INSTITUTE OF TECHNOLOGY