
Structured scene vision SLAM (Simultaneous Localization and Mapping) method based on point-line-surface features

This patent relates to structured-scene visual SLAM technology using point, line, and plane features. It addresses the low pose-estimation accuracy and poor robustness of existing visual SLAM in texture-poor structured scenes, with the effect of reducing cumulative inter-frame drift error and improving tracking performance and accuracy.

Pending Publication Date: 2022-08-05
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] In view of the above problems in the prior art, the purpose of the present invention is to provide a structured-scene visual SLAM method based on point, line, and plane features, to solve the low accuracy and poor robustness of pose estimation that current visual SLAM technology suffers in artificial structured indoor scenes with sparse texture and changing illumination.



Examples


Embodiment 2

[0117] A structured scene vision SLAM method based on point, line and surface features, including the following steps:

[0118] S1: input a color image, extract point features and line features from it, and perform feature matching. In this embodiment the EDLine algorithm is used to extract line features; the LSD algorithm may be used instead, in which case the split line segments extracted by LSD are merged using endpoint distance, angle, and descriptor information as the screening criteria.
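The merging criterion described above, which accepts a pair of split segments only when their endpoint gap, orientation difference, and descriptor distance are all small, can be sketched as follows. This is a minimal illustration, not the patent's implementation: the segment representation (2D endpoint pairs), binary descriptors compared by Hamming distance, and all threshold values are assumptions.

```python
import math

def angle_of(seg):
    # Orientation of a segment ((x1, y1), (x2, y2)) in radians.
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1)

def should_merge(seg_a, seg_b, desc_a, desc_b,
                 max_gap=5.0, max_angle=math.radians(5), max_desc_dist=30):
    # Endpoint-distance test: gap between the end of a and the start of b.
    gap = math.dist(seg_a[1], seg_b[0])
    # Angle test: the two segments must be nearly collinear; fold the
    # difference so that direction reversal does not matter.
    d_angle = abs(angle_of(seg_a) - angle_of(seg_b))
    d_angle = min(d_angle, math.pi - d_angle)
    # Descriptor test: Hamming distance between binary line descriptors.
    desc_dist = sum(a != b for a, b in zip(desc_a, desc_b))
    return gap <= max_gap and d_angle <= max_angle and desc_dist <= max_desc_dist
```

For example, two nearly collinear segments separated by a small gap pass all three tests and merge, while a perpendicular pair is rejected by the angle test even if the endpoints touch.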

[0119] S2: input the depth image, convert it into a point-cloud sequence structure, extract image planes, and match the extracted planes against the map planes. In this embodiment, plane matching is judged by the angle between the plane normals and by the difference between the planes' distances to the world-coordinate origin...
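The plane-matching test just described can be sketched in a few lines. This is a hedged illustration under assumed conventions: each plane is represented by a unit normal n and its signed distance d to the world origin (points x with dot(n, x) = d), and the angle and distance thresholds are placeholders, not values from the patent.

```python
import math

def plane_match(n1, d1, n2, d2,
                max_angle=math.radians(10), max_dist=0.1):
    # n1, n2: unit normals; d1, d2: distances from the world origin.
    dot = sum(a * b for a, b in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))       # guard acos against rounding
    angle = math.acos(abs(dot))          # abs() ignores normal sign ambiguity
    # Match only if the normals are nearly parallel AND the planes lie
    # at nearly the same distance from the origin.
    return angle <= max_angle and abs(d1 - d2) <= max_dist
```

Two parallel planes at nearly equal origin distance match; a plane with a perpendicular normal does not, regardless of its distance.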

Embodiment 3

[0124] A structured scene vision SLAM method based on point, line and surface features, including the following steps:

[0125] S1: input a color image, extract point features and line features from it, and perform feature matching. In this embodiment the LSD algorithm is used to extract line features, and the split line segments extracted by LSD are merged using endpoint distance, angle, and descriptor information as the screening criteria.

[0126] S2: input the depth image, convert it into a point-cloud sequence structure, extract image planes, and match the extracted planes against the map planes. In this embodiment, plane matching is judged by the angle between the plane normals, by the difference between the planes' distances to the world-coordinate origin, and by whether there is a collision area between the two...



Abstract

The invention discloses a structured-scene visual SLAM (Simultaneous Localization and Mapping) method based on point-line-plane features, comprising the following steps: first, input a color image and the corresponding depth image, extract the point, line, and plane features in the images, and perform feature matching; then detect a Manhattan world coordinate frame from the plane normal vectors; if a Manhattan frame exists and appears in the Manhattan world map, compute the camera attitude from it and track the point-line-plane features to estimate displacement, otherwise track the point-line-plane features to estimate the full pose; judge whether the current frame is a keyframe and, if so, insert it into the local map; maintain the map information and jointly optimize the current keyframe, its adjacent keyframes, and the three-dimensional features; finally, perform loop closure detection and, if a closed-loop frame is detected, close the loop and run global optimization. The method is a visual SLAM method with high precision and strong robustness, and solves the problem that point-feature-only visual SLAM loses accuracy, or fails entirely, in low-texture structured scenes.
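The Manhattan-world detection step in the abstract, finding a coordinate frame from plane normals, amounts to searching the observed normals for three mutually orthogonal directions. The sketch below illustrates that idea only; the exhaustive triple search, the normal representation, and the tolerance are assumptions and not the patent's actual detection procedure.

```python
import math
from itertools import combinations

def find_manhattan_axes(normals, tol=math.radians(5)):
    # Search observed (unit) plane normals for three mutually orthogonal
    # directions; such a triple defines a Manhattan world coordinate frame.
    def ortho(a, b):
        dot = abs(sum(x * y for x, y in zip(a, b)))
        # Two unit vectors are within tol of perpendicular when
        # |cos(angle)| <= sin(tol).
        return dot <= math.sin(tol)

    for triple in combinations(normals, 3):
        if all(ortho(a, b) for a, b in combinations(triple, 2)):
            return triple
    return None
```

With normals from two walls and a floor the search returns the axis triple; with fewer than three (or non-orthogonal) normals it returns None, and the pipeline falls back to plain point-line-plane tracking as the abstract describes.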

Description

Technical field

[0001] The invention belongs to the technical field of simultaneous localization and mapping (SLAM) for robots, and specifically relates to a structured-scene visual SLAM method based on point, line, and plane features.

Background technique

[0002] In recent years, intelligent vehicle positioning systems have become widely used in urban traffic, and traditional outdoor positioning methods such as those based on GNSS technology are now very mature. However, high-performance real-time positioning remains a challenge in indoor environments where GNSS signals are occluded. Although wireless indoor positioning systems, such as those based on Bluetooth, WiFi, UWB, and RFID, have developed rapidly, the high cost of equipment deployment and their susceptibility to occlusion and multipath interference make them difficult to use effectively for indoor localization and mapping. ...


Application Information

IPC(8): G06T 7/73; G06T 7/00
CPC: G06T 7/73; G06T 7/97; G06T 2207/10024; G06T 2207/10028; Y02T 10/40
Inventors: 裴海龙, 翁卓荣
Owner SOUTH CHINA UNIV OF TECH