Visual SLAM method based on multi-feature fusion

A multi-feature fusion vision technology, applied in image data processing, instrumentation, 3D modeling, etc., that addresses problems such as the weakening of constraints derived from point features, the inability to track point features stably, and the inability to construct an environment map.

Inactive Publication Date: 2019-07-26
HARBIN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to solve the problems that, in a low-texture environment, the point features computed by existing point-feature-based visual SLAM algorithms cannot be tracked stably, the constraints generated from those point features are weakened, and an accurate environment map cannot be constructed. To this end, the invention provides a multi-feature fusion visual SLAM method.




Embodiment Construction

[0087] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0088] The technical solution adopted by the present invention is a visual SLAM method based on multi-feature fusion. The specific implementation process is shown in Figure 1 and mainly includes the following steps:

[0089] Step 1: calibrate the depth camera and determine the camera's intrinsic parameters (see the calibration sketch after these steps);

[0090] Step 2: apply Gaussian filtering to the video-stream data acquired by the camera on the mobile robot platform, to reduce the influence of noise on subsequent steps;

[0091] Step 3: perform feature extraction on the corrected image, extracting the point features and line features of each frame online. Specifically: extract FAST corner features from each frame and describe them with the rBRIEF descriptor; detect line features with the LSD algorithm and describe them with the LBD descriptor (see the feature-extraction sketch after these steps), and ...
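The patent does not publish calibration code, so the following is only a rough illustration of Step 1: estimating the intrinsic matrix of the depth camera's color sensor from chessboard images with OpenCV. The chessboard pattern size, square size, and image folder are assumptions, not values from the patent.

```python
# Hypothetical Step 1 sketch: estimate camera intrinsics with OpenCV.
# Chessboard geometry, square size, and image paths are all assumed.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)      # inner corners per chessboard row/column (assumed)
SQUARE_SIZE = 0.025   # chessboard square edge in meters (assumed)

# 3-D corner coordinates of the board in its own plane (z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):   # assumed image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        # Refine corner locations to sub-pixel accuracy.
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# K holds the intrinsic parameters (fx, fy, cx, cy); dist the lens distortion.
rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix K:\n", K)
```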
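Likewise, a minimal sketch of Steps 2 and 3 under assumed parameters: OpenCV's ORB combines the FAST detector with the rBRIEF descriptor named in Step 3, and `cv2.createLineSegmentDetector` provides LSD (it is absent from some OpenCV builds for licensing reasons). The LBD descriptor lives in opencv-contrib's `line_descriptor` module, whose Python bindings vary by version, so only LSD detection is shown.

```python
# Hypothetical Steps 2-3 sketch: denoise one frame, then extract point and
# line features. Kernel size, feature count, and input path are assumed.
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # assumed input frame

# Step 2: Gaussian filtering to suppress sensor noise before extraction.
smoothed = cv2.GaussianBlur(frame, (5, 5), sigmaX=1.0)

# Step 3a: point features -- FAST corners described by rBRIEF, i.e. ORB.
orb = cv2.ORB_create(nfeatures=1000)
keypoints, descriptors = orb.detectAndCompute(smoothed, None)

# Step 3b: line features -- LSD line segment detection.
lsd = cv2.createLineSegmentDetector()
lines = lsd.detect(smoothed)[0]  # Nx1x4 array of (x1, y1, x2, y2) endpoints

print(f"{len(keypoints)} point features, "
      f"{0 if lines is None else len(lines)} line segments")
```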



Abstract

The invention discloses a visual SLAM method based on multi-feature fusion, and relates to the field of robot visual positioning and mapping. The multi-feature fusion visual SLAM (simultaneous localization and mapping) method is based on a depth camera and solves the visual positioning problem in conditions where pure point features fail, by fully using the point and line features extracted from an image and constructing plane features from them. An adaptive-threshold method is adopted to extract point features, yielding a more uniform point-feature distribution; line features are extracted, short line segments are deleted, and broken segments are merged to improve the accuracy of line-feature matching. The point and line features are used to estimate the inter-frame pose and construct a local map; plane features are computed with a minimal parameterization to reduce the amount of computation; and the point, line, and plane features are tightly coupled by constructing a back-projection error function over the fused features, from which a global map is constructed and the global pose is optimized. The visual SLAM method has high precision, good real-time performance, and strong robustness, and solves the problem that feature-point-based visual SLAM loses precision, or the system fails outright, in a low-texture environment.
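The abstract does not state the error function explicitly; one plausible form of the fused back-projection error it describes, with the robust kernel ρ, the weights λ, and the residual definitions all assumed here rather than taken from the patent, is:

```latex
% Hypothetical fused error over poses \xi; residual forms are assumed.
E(\xi) = \sum_{i} \rho\!\left(\lVert \mathbf{u}_i - \pi(\mathbf{T}(\xi)\,\mathbf{P}_i) \rVert^2\right)
       + \lambda_l \sum_{j} \rho\!\left(\lVert \mathbf{e}_{l,j} \rVert^2\right)
       + \lambda_\pi \sum_{k} \rho\!\left(\lVert \mathbf{e}_{\pi,k} \rVert^2\right)
```

where the first sum is the point reprojection error, e_{l,j} would be a point-to-line distance for projected line segments, and e_{π,k} a residual on the minimally parameterized plane; the pose minimizing E is then found by nonlinear least squares.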

Description

[0001] Technical field:

[0002] The invention belongs to the technical field of robot simultaneous localization and map construction, and in particular relates to a multi-feature fusion SLAM (simultaneous localization and mapping) method.

[0003] Background technique:

[0004] With the development of visual SLAM technology, frame-based optimization and graph-optimization frameworks have become the mainstream frameworks for the visual SLAM problem. The graph-optimization framework introduces motion estimation and bundle adjustment into visual SLAM. Motion estimation treats the robot's location and the surrounding environment features as a global optimization problem: features extracted from the image are tracked to establish an error function, which is then either linearized under a linear assumption or minimized directly by nonlinear optimization, solving for the robot pose at the minimum of the error function simultaneo...
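For reference, the bundle-adjustment objective this background passage alludes to is conventionally written as a joint nonlinear least-squares problem over camera poses T_k and landmark positions X_i (notation assumed here, not taken from the patent):

```latex
% Standard bundle-adjustment objective referenced by the background section:
% minimize the total reprojection error over all poses and landmarks.
\min_{\{\mathbf{T}_k\},\,\{\mathbf{X}_i\}} \;\sum_{k,i} \left\lVert \mathbf{u}_{ik} - \pi\!\left(\mathbf{T}_k \mathbf{X}_i\right) \right\rVert^2
```

Here u_{ik} is the observed image feature of landmark i in frame k and π is the camera projection function built from the calibrated intrinsics.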


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/246; G06T7/80; G06T5/50; G06T5/00; G06K9/62; G06T17/00
CPC: G06T7/80; G06T7/246; G06T5/002; G06T5/50; G06T17/00; G06F18/22
Inventor: 尤波; 梁强
Owner HARBIN UNIV OF SCI & TECH