Pose Estimation Method Based on Multi-eye Vision Measurement and Laser Point Cloud Map Matching

A multi-eye vision and laser point cloud technology, applied in the field of robot navigation, addresses the problems of recovered points lying far from their true values, strong dependence on calibrated extrinsic parameters, and distant points that cannot be recovered at all, achieving practical point cloud recovery, reduced dependence on extrinsic calibration, and high precision.

Active Publication Date: 2022-06-21
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

The traditional binocular method for recovering a point cloud relies on the baseline between the two cameras for triangulation. With a wide-angle camera it can only recover points at short and medium range; distant points have no parallax and cannot be recovered. With a telephoto camera it can recover distant points, but the recovery accuracy depends heavily on the extrinsic parameters calibrated between the two cameras, and even small extrinsic errors can drive the recovered distant points far from their true values.
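The summary stops short of the underlying geometry, but the standard rectified-stereo triangulation model (a textbook relation, not quoted from the patent) makes both failure modes concrete:

$$ Z = \frac{fB}{d}, \qquad \left|\frac{\partial Z}{\partial d}\right| = \frac{fB}{d^{2}} = \frac{Z^{2}}{fB} $$

where $Z$ is the depth, $f$ the focal length in pixels, $B$ the baseline, and $d$ the disparity in pixels. A fixed disparity error therefore yields a depth error that grows quadratically with distance, so far points (where $d \to 0$) are unrecoverable with a short-baseline wide-angle rig, while with a telephoto rig any error in the calibrated baseline $B$ or extrinsics scales directly into the recovered depth.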



Embodiment Construction

[0033] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them.

[0034] The components of the embodiments of the invention generally described and illustrated in the drawings herein may be arranged and designed in a variety of different configurations. Thus, the following detailed description of the embodiments of the invention provided in the accompanying drawings is not intended to limit the scope of the invention as claimed, but is merely representative of selected embodiments of the invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without creative effort fall within the protection scope of the present invention.

[0035] Hereinaf...



Abstract

The invention discloses a pose estimation method based on multi-eye vision measurement and laser point cloud map matching, referred to as the multi-eye laser pose estimation method. The number of cameras is at least three, and each camera shares a field of view with at least two other cameras. When three or more cameras are placed on the same straight line, that line must intersect the approximate plane of the environment space recovered by two of the cameras. The method obtains pictures with a common-view area, reconstructs the approximate plane, recovers the error distribution of the environment space in front of and behind the approximate plane, computes the true distance from the approximate plane to the cameras, and thereby obtains the multi-eye vision measurement model. For the laser point cloud map, the method marginalizes the overfitted model, redistributes the points of the marginalized model, and generates a continuous function model of the point cloud. Based on the surface scanning principle, the multi-eye vision measurement model and the continuous function model of the point cloud are aligned and registered to realize pose estimation.
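The abstract describes the pipeline only at a high level. As a hedged sketch of the general idea, the following Python code shows one conventional way to build a locally continuous surface model from a laser map (local planes fitted by PCA) and to register measured points against it by minimizing point-to-plane distances over the pose. Every function, parameter, and library choice here is an assumption made for illustration, not the patent's actual marginalization or surface-scanning procedure:

```python
# Illustrative sketch only -- NOT the patent's algorithm. It shows one common
# way to turn a discrete laser map into a locally continuous surface model
# (local planes fitted by PCA) and to register measured points against it by
# minimizing point-to-plane distances over a 6-DoF pose.
import numpy as np
from scipy.spatial import cKDTree
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def local_planes(map_points, k=10):
    """Fit a plane (centroid, normal) to the k nearest neighbors of each map point."""
    tree = cKDTree(map_points)
    _, idx = tree.query(map_points, k=k)
    centroids = map_points[idx].mean(axis=1)
    normals = np.empty_like(map_points)
    for i, nb in enumerate(idx):
        # Direction of least variance in the neighborhood approximates the normal.
        _, _, vt = np.linalg.svd(map_points[nb] - centroids[i])
        normals[i] = vt[-1]
    return tree, centroids, normals

def register(measured, map_points, pose0=np.zeros(6)):
    """Estimate a pose (rotation vector + translation) aligning measured
    points to the continuous map model."""
    tree, centroids, normals = local_planes(map_points)

    def residuals(pose):
        R = Rotation.from_rotvec(pose[:3]).as_matrix()
        world = measured @ R.T + pose[3:]
        _, j = tree.query(world)  # nearest local plane for each point
        return np.einsum('ij,ij->i', world - centroids[j], normals[j])

    return least_squares(residuals, pose0).x

# Toy usage: recover a known 20 cm offset against a flat map.
rng = np.random.default_rng(0)
map_pts = np.column_stack([rng.uniform(-5, 5, (500, 2)), np.zeros(500)])
meas = map_pts[:100] + np.array([0.0, 0.0, 0.2])
print(register(meas, map_pts)[3:])  # translation approx. [0, 0, -0.2]
```

The patent's continuous function model and surface-scanning registration would replace the local-plane fit and nearest-plane residual used here, but the overall align-and-optimize structure belongs to the same family of techniques.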

Description

Technical Field

[0001] The invention relates to the field of robot navigation, and in particular to a pose estimation method based on multi-eye vision measurement and laser point cloud map matching.

Background

[0002] The point cloud recovered by the traditional monocular vision method suffers from unknown scale, which greatly restricts its practicality. The traditional binocular method for recovering a point cloud relies on the baseline between the two cameras for triangulation. With a wide-angle camera, only points at short and medium range can be recovered, and distant points have no parallax and cannot be recovered; with a telephoto camera, distant points can be recovered, but the recovery accuracy depends on the extrinsic parameters calibrated between the two cameras, and small extrinsic errors may drive the recovered distant points far from their true values. Therefore, in order to overcome the disa...
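To make the background's sensitivity claim tangible, here is a minimal numerical sketch under the same textbook model Z = fB/d; the focal length, baseline, and error magnitudes are assumed values chosen for illustration, not taken from the patent:

```python
# Illustrative numbers only (not from the patent): propagation of disparity
# and extrinsic-calibration errors through rectified-stereo triangulation.
# Model: Z = f * B / d, with f in pixels, baseline B in meters, disparity d in pixels.

def depth(f_px, baseline_m, disparity_px):
    return f_px * baseline_m / disparity_px

f, B = 500.0, 0.12                  # assumed focal length and baseline
for Z_true in (2.0, 20.0, 200.0):
    d = f * B / Z_true              # ideal disparity at this depth
    Z_noisy = depth(f, B, d + 0.5)  # 0.5 px disparity (matching) error
    Z_bias = depth(f, B * 1.01, d)  # 1% baseline (extrinsic) error
    print(f"Z={Z_true:6.1f} m: d={d:6.3f} px, "
          f"0.5px err -> {Z_noisy:6.2f} m, 1% extrinsic err -> {Z_bias:6.2f} m")
```

At 200 m the ideal disparity in this toy setup is only 0.3 px, so a half-pixel matching error collapses the estimate to 75 m; that is the "far points have no parallax" failure the background describes, and it is exactly the sensitivity that motivates the multi-eye arrangement of this patent.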


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06T7/73G01C21/00G01C21/20G01C11/00
CPCG06T7/73G01C21/005G01C21/20G01C11/00G06T2207/10028
Inventor 张宇万泽宇
Owner ZHEJIANG UNIV