
Monocular vision-assisted laser radar odometer method based on ground plane

A monocular-vision-assisted laser (lidar) technology, applied in the field of robot recognition, that can solve problems such as high computational complexity, poor system stability, and large system overhead, to achieve the effects of ensuring real-time performance, improving accuracy, and avoiding performance ...

Active Publication Date: 2021-03-12
SHANGHAI UNIVERSITY OF ELECTRIC POWER

AI Technical Summary

Problems solved by technology

However, running simultaneous odometry and mapping on both sensors requires substantial system overhead and is not suitable for low-compute scenarios such as embedded platforms.

[0005] Therefore, it is of great importance to study how to use only a small amount of system resources to let a visual sensor assist the lidar in simultaneous localization and mapping, making up for the shortcomings of lidar with respect to ego-motion distortion and loop closure detection while avoiding the high computational complexity and poor system stability of existing methods.




Embodiment Construction

[0043] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments. This embodiment is carried out on the premise of the technical solution of the present invention, and detailed implementation and specific operation process are given, but the protection scope of the present invention is not limited to the following embodiments.

[0044] This embodiment provides a ground-plane-based monocular vision-assisted lidar odometry method, an odometry scheme in which monocular vision and lidar are tightly coupled. The ground point cloud in the laser odometry is used to efficiently extract ground feature points in the image, efficient absolute-scale camera motion estimation is realized based on the homography transformation induced by the ground plane, and the motion estimate is then used to correct ego-motion point cloud distortion and to optimize the pose in the laser odometry. Aiming at the false-matching problem caused by the ...
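To make the absolute-scale motion estimation step concrete, below is a minimal Python/OpenCV sketch of how a homography between ground feature points in two frames can be decomposed into camera motion and rescaled using the camera height recovered from the lidar ground point cloud. The function name, the matched-point inputs, the RANSAC threshold, and the assumption that the camera y-axis points roughly towards the ground are illustrative choices, not details taken from the patent.

import numpy as np
import cv2

def estimate_ground_motion(pts_prev, pts_curr, K, cam_height):
    # pts_prev, pts_curr: Nx2 arrays of matched ground-pixel coordinates
    # K: 3x3 camera intrinsic matrix
    # cam_height: camera height above the ground plane (metres), e.g. taken
    #             from the plane fitted to the lidar ground point cloud
    # Homography induced by the ground plane between the two views
    H, inliers = cv2.findHomography(pts_prev, pts_curr, cv2.RANSAC, 1.0)
    # Decompose H = R + t * n^T / d into candidate (R, t/d, n) solutions
    _, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    # Keep the candidate whose plane normal best matches the expected
    # ground direction (camera y-axis assumed to point towards the ground)
    ground_dir = np.array([0.0, 1.0, 0.0])
    best = max(range(len(normals)),
               key=lambda i: float(ground_dir @ normals[i].ravel()))
    R = rotations[best]
    # decomposeHomographyMat returns t divided by the plane distance d, so
    # multiplying by the known camera height restores the metric scale
    t = translations[best].ravel() * cam_height
    return R, t, inliers

In the patented scheme the ground pixels themselves are obtained with the help of the lidar ground point cloud, which is what keeps the feature extraction cheap; the candidate-selection rule above is only one simple way to resolve the ambiguity of the homography decomposition.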



Abstract

The invention relates to a ground-plane-based monocular vision-assisted laser radar (lidar) odometry method. The method comprises the steps of efficiently extracting ground feature points in an image through the ground point cloud in the laser odometry, achieving efficient absolute-scale camera motion estimation based on homography transformation, and then using the motion estimate to correct ego-motion point cloud distortion and to optimize the pose in the laser odometry. Compared with the prior art, the tightly coupled scheme provided by the invention efficiently utilizes the monocular vision image and the ground plane information in the lidar point cloud, and avoids the problems of existing vision-lidar fusion algorithms, in which high computational complexity and wrong depth matching affect system precision and stability.
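As a rough illustration of the distortion-correction step mentioned in the abstract, the sketch below de-skews a single lidar sweep by interpolating the estimated inter-frame motion over each point's acquisition time. The function name, the slerp/linear interpolation model, and the assumption that the camera motion has already been transformed into the lidar frame are ours, not specified by the patent.

import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew_scan(points, time_offsets, R_sweep, t_sweep, sweep_duration):
    # points: Nx3 lidar points, each measured at its own time during the sweep
    # time_offsets: per-point acquisition time offsets in [0, sweep_duration]
    # R_sweep, t_sweep: sensor motion over the whole sweep (pose at the end of
    #                   the sweep expressed in the start-of-sweep frame), e.g.
    #                   from the homography-based camera motion estimate
    ratios = np.clip(time_offsets / sweep_duration, 0.0, 1.0)
    # Interpolate rotation with slerp and translation linearly over the sweep
    key_rots = Rotation.from_matrix(np.stack([np.eye(3), R_sweep]))
    R_interp = Slerp([0.0, 1.0], key_rots)(ratios).as_matrix()    # Nx3x3
    t_interp = ratios[:, None] * np.asarray(t_sweep)[None, :]     # Nx3
    # Map every point back into the start-of-sweep frame, removing the skew
    # introduced by the sensor's own motion during acquisition
    return np.einsum('nij,nj->ni', R_interp, points) + t_interp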

Description

Technical field

[0001] The invention relates to the technical field of robot identification, in particular to a ground-plane-based monocular vision-assisted laser radar odometry method.

Background technique

[0002] Simultaneous localization and mapping (SLAM) is a key technology in the field of robotics and a basic requirement for robots to operate autonomously. Lidar and vision sensors are the two mainstream sensors used in SLAM, and in recent years SLAM algorithms based on these two sensors have been widely researched and applied. In visual SLAM, excellent solutions represented by ORB-SLAM2, DSO, and VINS have emerged; in laser SLAM, frameworks such as LOAM, IMLS-SLAM, and SegMatch have been formed. However, a single sensor always has certain deficiencies: vision sensors are sensitive to environmental texture and lighting conditions, while lidar suffers from point cloud distortion caused by body motion and insufficient loop...


Application Information

IPC(8): G01C 22/00; G01S 17/86; G01S 17/89
CPC: G01C 22/00; G01S 17/86; G01S 17/89; Y02T 10/40
Inventors: 彭道刚, 戚尔江, 晏小彬, 王丹豪, 欧阳海林, 王永坤, 高义民, 潘俊臻
Owner: SHANGHAI UNIVERSITY OF ELECTRIC POWER