Quick and precise calibration method for the mapping relation between a laser point cloud and a visual image

A technology for the mapping relationship between visual images and laser point clouds, applied in the field of environment perception for intelligent connected vehicles. It addresses problems such as a cumbersome calibration process, the difficulty of globally optimizing the calibration, and error accumulation between the 3D point cloud and the pixels, with the effects of simplifying the calibration process, reducing the calibration steps, and achieving high mapping accuracy.

Active Publication Date: 2018-06-22
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

However, calibrating all physical parameters first and then deriving the mapping relationship between 3D point clouds and pixels allows errors to accumulate, makes it difficult to reach a global optimum of the calibration, and requires multiple calibrations of different parameters, which makes the process cumbersome.
Therefore, both the complexity and the accuracy of existing calibration methods need to be improved.




Embodiment Construction

[0047] The present invention will be described in detail below in conjunction with the accompanying drawings and embodiments. However, it should be understood that the accompanying drawings are provided only for better understanding of the present invention, and they should not be construed as limiting the present invention.

[0048] Suppose there is a space point x_world in the world coordinate system, which is a three-dimensional point x_lidar = (x_l, y_l, z_l)^T in the lidar coordinate system; its coordinate in the camera coordinate system is x_camera = (x_c, y_c, z_c)^T, and it becomes a two-dimensional point u_camera = (u, v)^T in the pixel coordinate system after projection by the camera. The so-called calibration is to establish the correspondence between x_lidar and u_camera, that is, given the representation x_lidar of a certain space point in the lidar coordinate system, the corresponding u_camera of the point can be...
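The direct lidar-to-pixel mapping described above can be sketched in Python with NumPy. The 3x4 matrix M below is purely illustrative (its values are not from the patent); it stands in for the combined mapping that the method calibrates, carrying a homogeneous lidar point to a homogeneous pixel.

```python
import numpy as np

# Hypothetical 3x4 mapping matrix M (illustrative values only); it plays
# the role of the calibrated direct mapping from x_lidar to u_camera.
M = np.array([
    [600.0,   0.0, 320.0, 10.0],
    [  0.0, 600.0, 240.0,  5.0],
    [  0.0,   0.0,   1.0,  0.1],
])

def lidar_to_pixel(x_lidar, M):
    """Map a 3-D lidar point x_lidar = (x_l, y_l, z_l)^T to the 2-D
    pixel u_camera = (u, v)^T via u ~ M [x_lidar; 1] (homogeneous)."""
    xh = np.append(np.asarray(x_lidar, dtype=float), 1.0)  # homogenize
    uh = M @ xh
    return uh[:2] / uh[2]  # dehomogenize: divide by the third component

u, v = lidar_to_pixel((1.0, 0.5, 2.0), M)
print(u, v)
```

Note that the division by the third homogeneous component is what makes the mapping projective rather than linear: points along the same viewing ray land on the same pixel.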



Abstract

The invention relates to a quick and precise calibration method for the mapping relation between a laser point cloud and a visual image. The method comprises the following steps: 1) set up a checkerboard calibration plate with square holes, place the plate in the fields of view of the laser radar and the camera simultaneously, and obtain n corresponding feature points by extracting feature points from the laser point cloud and the visual image; 2) compute an initial estimate of the homography matrix; 3) perform maximum-likelihood estimation of the homography matrix; 4) perform maximum-likelihood estimation of the camera distortion parameters; 5) perform maximum-likelihood estimation of all mapping parameters in the mapping relation between the laser point cloud and the visual image. Based on a homography matrix, the direct mapping relation between the three-dimensional point cloud and the visual-image pixels is built directly, and neither the camera intrinsic parameter matrix nor the sensor extrinsic parameter matrix needs to be calibrated. The calibration method reduces the calibration steps; meanwhile, since the mapping result is optimized directly, calibration errors are not propagated, and the calibration precision is higher.
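As a rough sketch of the initial estimation in step 2, the function below applies a standard Direct Linear Transform (DLT) to n >= 6 lidar-point/pixel correspondences, stacking two linear constraints per pair and taking the SVD null-space vector as the mapping matrix. This is a common textbook initialization, not the patent's exact formulation; the function name and the point sets are hypothetical.

```python
import numpy as np

def dlt_projection(points_3d, points_2d):
    """Least-squares initial estimate of the 3x4 mapping matrix M from
    n >= 6 (x_lidar, u_camera) pairs, solving A m = 0 via SVD.

    Each pair (X, Y, Z) <-> (u, v) contributes two rows to A, obtained
    by cross-multiplying u ~ M [X, Y, Z, 1]^T to eliminate the unknown
    homogeneous scale."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The singular vector for the smallest singular value minimizes
    # ||A m|| subject to ||m|| = 1; reshape it into the 3x4 matrix.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)
```

In the patent's pipeline this linear estimate would only serve as the starting point for the maximum-likelihood refinements of steps 3 to 5, which minimize reprojection error directly.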

Description

technical field

[0001] The invention relates to a fast and accurate calibration method for the mapping relationship between a laser point cloud and a visual image, and belongs to the field of environment perception for intelligent connected vehicles.

Background technique

[0002] Lidar can directly measure the distance to the surrounding environment, with high measurement accuracy and a long measurement range; multi-line lidar in particular has ideal three-dimensional modeling capabilities. However, because it cannot obtain rich color information, it is still difficult to semantically understand the surrounding environment from 3D point clouds alone. The camera can obtain rich color information of the surrounding environment, and current semantic segmentation algorithms for images are relatively mature. However, because depth information is lost in the visual image, it is difficult to accurately express the three-dimensional size of the surrounding environme...


Application Information

IPC(8): G06T7/80
CPC: G06T7/80; G06T2207/10004; G06T2207/10028
Inventor 杨殿阁谢诗超江昆钟元鑫肖中阳曹重王思佳
Owner TSINGHUA UNIV