
vSLAM implementation method and system based on point and line feature fusion

A vSLAM implementation method and system based on point and line feature fusion. The method addresses shortcomings of prior approaches, such as low loop-detection accuracy, the absence of closed-loop detection, and the high computational complexity of binocular camera calibration and pixel-depth estimation.

Inactive Publication Date: 2018-10-19
BEIJING HUAJIE IMI TECH CO LTD
Cites 5 · Cited by 74

AI Technical Summary

Problems solved by technology

This method has the following shortcomings: 1. Using a Kalman filter to optimize the pose has inherent limitations: since the state vector retains only the pose at the current moment, poses at past moments are never updated again, so inaccurately estimated prior information is propagated to the next moment and accumulates as error. 2. The method does not implement closed-loop detection, which limits its extensibility.
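The cumulative-error mechanism described above can be illustrated with a minimal, hypothetical 1-D Kalman filter sketch (all values, noise parameters, and function names below are illustrative, not taken from the patent): the filter state holds only the current pose, so poses already written to the trajectory are never revisited or corrected.

```python
import numpy as np

# Minimal 1-D Kalman filter sketch (hypothetical). The state holds ONLY
# the current pose: once a pose is appended to the trajectory it is
# never updated again, which is the limitation described above.

def kalman_step(x, P, u, z, Q=0.1, R=0.5):
    """One predict/update cycle for a scalar pose x with variance P."""
    x_pred, P_pred = x + u, P + Q        # predict with odometry increment u
    K = P_pred / (P_pred + R)            # Kalman gain
    x_new = x_pred + K * (z - x_pred)    # fuse noisy measurement z
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

rng = np.random.default_rng(0)
x, P, true_pose = 0.0, 1.0, 0.0
trajectory = []                          # past poses, frozen once stored
for _ in range(50):
    true_pose += 1.0
    u = 1.0 + 0.05                       # odometry with a constant bias
    z = true_pose + rng.normal(0.0, 0.5) # noisy pose measurement
    x, P = kalman_step(x, P, u, z)
    trajectory.append(x)                 # never corrected afterwards
print(len(trajectory), round(P, 3))
```

Because only the latest `(x, P)` pair is retained, any error baked into an earlier trajectory entry stays there; graph-based back ends avoid this by keeping past poses in the optimization state.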
This method has the following deficiencies: 1. The DBoW2 dictionary built by the loop-detection component describes only feature points, not feature lines, so loop-detection accuracy is low. 2. Binocular camera calibration and pixel-depth computation have high computational complexity.



Embodiment Construction

[0091] Specific embodiments of the present invention will be described in detail below in conjunction with the accompanying drawings. It should be understood that the specific embodiments described here are only used to illustrate and explain the present invention, and are not intended to limit the present invention.

[0092] As shown in Figure 1, the vSLAM implementation method based on point and line feature fusion of the present invention mainly comprises a tracking thread, a local map thread, a closed-loop detection thread and a global optimization thread.
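The four-thread layout described in [0092] can be sketched as a producer/consumer pipeline. This is a hypothetical illustration of the structure only; the queue and function names are made up and the real threads would do pose estimation, mapping, and optimization rather than pass frames through.

```python
import queue
import threading

# Hypothetical sketch of a multi-threaded vSLAM pipeline: a front-end
# tracking thread feeds keyframes to a back-end thread, which stands in
# for local mapping, loop closing, and global optimization combined.

frames = queue.Queue()      # input frames    -> tracking thread
keyframes = queue.Queue()   # keyframes       -> back-end thread
optimized = []              # output of the back end

def tracking_thread():
    # Pose estimation and the keyframe decision would happen here.
    while (f := frames.get()) is not None:
        keyframes.put(f)
    keyframes.put(None)     # propagate shutdown signal downstream

def backend_thread():
    # Local map refinement, loop detection, and global BA would go here.
    while (kf := keyframes.get()) is not None:
        optimized.append(kf)

workers = [threading.Thread(target=tracking_thread),
           threading.Thread(target=backend_thread)]
for t in workers:
    t.start()
for i in range(5):
    frames.put(i)
frames.put(None)            # end of the frame sequence
for t in workers:
    t.join()
print(optimized)
```

Decoupling the threads with queues lets the slow back-end work (mapping, optimization) proceed without stalling per-frame tracking, which is the usual motivation for this architecture.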

[0093] Tracking thread: the input is the image frame sequence collected by the depth camera, consisting of a color image and a depth image; the pair of images captured at the same moment is called a frame. Image preprocessing includes distortion correction, detection and description of feature points and feature line segments, and feature matching. Tracking is divided into two stages, one is the tracking bet...
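The feature-matching step mentioned in [0093] typically compares binary descriptors (e.g. ORB for points, LBD for line segments) by Hamming distance. Below is a small illustrative brute-force matcher over random stand-in descriptors; the descriptors, threshold, and sizes are invented for the sketch and are not the patent's implementation.

```python
import numpy as np

# Brute-force nearest-neighbour matching of 256-bit binary descriptors
# by Hamming distance (illustrative only; real descriptors would come
# from a point/line feature extractor).

def hamming_match(desc_a, desc_b, max_dist=30):
    """Match each row of desc_a to its nearest row in desc_b."""
    matches = []
    for i, d in enumerate(desc_a):
        # popcount of XOR = Hamming distance to every candidate row
        dists = np.unpackbits(d ^ desc_b, axis=1).sum(axis=1)
        j = int(dists.argmin())
        if dists[j] <= max_dist:
            matches.append((i, j, int(dists[j])))
    return matches

rng = np.random.default_rng(1)
desc_b = rng.integers(0, 256, size=(20, 32), dtype=np.uint8)  # 20 x 256-bit
desc_a = desc_b.copy()
desc_a[0, 0] ^= 0b1     # flip one bit: distance 1, still within threshold
m = hamming_match(desc_a, desc_b)
print(len(m), m[0])
```

A distance threshold (here the made-up `max_dist=30`) rejects matches between unrelated descriptors, whose expected Hamming distance for random 256-bit strings is around 128.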


PUM

No PUM

Abstract

The invention discloses a vSLAM implementation method and system based on point and line feature fusion. The method includes: Step S110, obtaining an image frame sequence of a target scene; Step S120, preprocessing each frame of image; Step S130, initializing an environmental map according to successfully matched point features and line features; Step S140, tracking based on the environmental map and estimating the pose of the current image frame; Step S150, judging whether the current image frame meets a key frame condition; if yes, executing Step S160, otherwise repeating Steps S110 to S150; Step S160, executing a step of the local map thread; Step S170, executing a step of the closed-loop detection thread; and Step S180, executing a step of the global optimization thread to obtain an optimized environmental map and complete simultaneous localization and mapping. The extraction and matching of line features are improved so as to improve the accuracy of front-end data association, effectively overcoming the weaknesses of vSLAM in complex low-texture scenes.
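The control flow of steps S110–S180 in the abstract can be sketched as a simple loop. The helper functions below are placeholders standing in for the real preprocessing, tracking, and optimization steps; only the branching structure mirrors the abstract.

```python
# Hypothetical control-flow sketch of steps S110-S180 from the abstract.
# All helpers are trivial placeholders, not the patent's implementation.

def run_vslam(frames, is_keyframe):
    env_map = []                        # S130: initialized environment map
    for frame in frames:                # S110: obtain image frame sequence
        pre = preprocess(frame)         # S120: preprocess each frame
        pose = track(pre, env_map)      # S140: track against the map
        if is_keyframe(pre, pose):      # S150: keyframe condition
            env_map.append(pre)         # S160: local map thread step
            detect_loop(env_map)        # S170: closed-loop detection step
            optimize(env_map)           # S180: global optimization step
    return env_map

# Trivial stand-ins so the sketch runs end to end.
preprocess = lambda f: f
track = lambda f, m: f
detect_loop = lambda m: None
optimize = lambda m: None

result = run_vslam(range(10), lambda f, p: f % 3 == 0)
print(result)   # keyframes kept: every third frame under this toy rule
```

Note how the back-end steps (S160–S180) run only when the keyframe test passes, so most frames cost only preprocessing and tracking.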

Description

technical field

[0001] The present invention relates to the field of visual simultaneous localization and mapping (SLAM), and in particular to a vSLAM implementation method based on point and line feature fusion and a vSLAM implementation system based on point and line feature fusion.

Background technique

[0002] Simultaneous localization and mapping (SLAM) originated in the field of robotics; its goal is to reconstruct the three-dimensional structure of an unknown environment in real time while simultaneously localizing the robot within it. Early SFM techniques were generally processed offline; later, with the development of technology, real-time SFM appeared, which can be regarded as falling within the scope of SLAM. vSLAM technology infers the orientation of the camera in an unknown environment from the captured video and simultaneously constructs a map of the environment. Its basic principle is multi-view geometry. The ...
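The multi-view-geometry principle mentioned in [0002] can be illustrated with a minimal two-camera example: a 3-D point is projected into a pair of pinhole cameras, and its depth is recovered from the stereo disparity via z = f·b/d. The intrinsics, baseline, and point below are made-up values chosen only to make the arithmetic visible.

```python
import numpy as np

# Minimal multi-view-geometry sketch (illustrative values): project a
# 3-D point into two pinhole cameras separated by a known baseline,
# then recover its depth from the disparity.

K = np.array([[500.0,   0.0, 320.0],    # fx, skew, cx
              [  0.0, 500.0, 240.0],    #     fy,  cy
              [  0.0,   0.0,   1.0]])
baseline = 0.1                           # 10 cm between the two cameras
X = np.array([0.5, 0.2, 4.0])            # 3-D point in the left frame

def project(K, X):
    """Pinhole projection of a 3-D point to pixel coordinates."""
    u = K @ X
    return u[:2] / u[2]

uL = project(K, X)                       # pixel in the left image
uR = project(K, X - [baseline, 0, 0])    # right camera shifted along x
disparity = uL[0] - uR[0]
depth = K[0, 0] * baseline / disparity   # z = f * b / d
print(round(depth, 6))                   # recovers X[2] = 4.0
```

The same geometry underlies triangulating map points from matched features in consecutive frames, which is what the vSLAM front end relies on.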

Claims


Application Information

Patent Timeline
No application data
Patent Type & Authority: Application (China)
IPC(8): G06T7/33; G06T7/38
CPC: G06T2207/10016; G06T7/33; G06T7/38
Inventor 王行周晓军杨淼李朔李骊盛赞
Owner BEIJING HUAJIE IMI TECH CO LTD