
Camera-to-lidar calibration and validation

A camera-and-calibration-board technology, applied in the field of camera-to-LiDAR calibration and validation, that addresses problems such as the adverse effect of extrinsic-parameter deviations on sensor fusion.

Pending Publication Date: 2021-06-18
MOTIONAL AD LLC
Cites: 0, Cited by: 1

AI Technical Summary

Problems solved by technology

Small deviations in extrinsic parameters may adversely affect fusion
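The magnitude of this effect can be illustrated with a back-of-the-envelope projection. In the sketch below, the focal length, rotation axis, and point range are all hypothetical illustration values, not figures from this patent; it shows how a rotation error of only half a degree in the extrinsic parameters already shifts a projected LiDAR point by several pixels.

```python
import numpy as np

# Hypothetical pinhole camera: focal length ~1000 px, 1280x720 image.
fx = fy = 1000.0
cx, cy = 640.0, 360.0
K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]])

def project(p_cam):
    """Pinhole projection of a 3D point in the camera frame to pixels."""
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

def rot_y(deg):
    """Rotation about the camera's vertical (yaw) axis."""
    a = np.deg2rad(deg)
    return np.array([[np.cos(a), 0, np.sin(a)],
                     [0, 1, 0],
                     [-np.sin(a), 0, np.cos(a)]])

p = np.array([0.0, 0.0, 20.0])       # point 20 m in front of the camera
u_true = project(p)                  # projection with exact extrinsics
u_err = project(rot_y(0.5) @ p)      # extrinsics off by only 0.5 degrees

shift_px = np.linalg.norm(u_err - u_true)
print(f"pixel shift from a 0.5 deg rotation error: {shift_px:.1f} px")
```

For these assumed numbers the shift works out to about 8.7 px (1000 · tan 0.5°), enough to misalign LiDAR depth with the wrong image object during fusion.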




Detailed Description of Embodiments

[0050] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.

[0051] In the drawings, specific arrangements or sequences of schematic elements, such as those representing devices, modules, instruction blocks and data elements, are shown for ease of description. However, those skilled in the art should understand that the specific ordering or arrangement of schematic elements in the drawings does not imply a specific processing order or sequence, or separation of processing procedures. Furthermore, the inclusion of a schematic element in a drawing does not imply that such element is required in all embodiments, nor does...



Abstract

An automatic calibration and validation pipeline is disclosed to estimate and evaluate the accuracy of extrinsic parameters of a camera-to-LiDAR coordinate transformation. In an embodiment, an automated and unsupervised calibration procedure is employed where the computed rotational and translational parameters ('extrinsic parameters') of the camera-to-LiDAR coordinate transformation are automatically estimated and validated, and upper bounds on the accuracy of the extrinsic parameters are set. The calibration procedure combines three-dimensional (3D) plane, vector and point correspondences to determine the extrinsic parameters, and the resulting coordinate transformation is validated by analyzing the projection of a filtered point cloud including a validation target in the image space. A single camera image and LiDAR scan (a 'single shot') are used to calibrate and validate the extrinsic parameters. In addition to only requiring a single shot, the complete procedure solely relies on one or more planar calibration targets and simple geometrical validation targets.
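The validation step described in the abstract — projecting a filtered point cloud containing the validation target into image space with the estimated coordinate transformation — can be sketched as follows. The intrinsics K, the estimated extrinsics (R, t), the target points, and the image bounding box are all made-up illustration values, not the patent's actual data or procedure:

```python
import numpy as np

# Assumed camera intrinsics and estimated extrinsics (illustration only).
K = np.array([[800.0, 0, 320.0],
              [0, 800.0, 240.0],
              [0, 0, 1.0]])
R = np.eye(3)                      # estimated rotation (assumed)
t = np.array([0.0, -0.1, 0.2])     # estimated translation in metres (assumed)

# Filtered LiDAR points belonging to the validation target (LiDAR frame).
target_points = np.array([
    [-0.2,  0.1, 5.0],
    [ 0.2,  0.1, 5.0],
    [ 0.0, -0.1, 5.0],
])

cam_pts = (R @ target_points.T).T + t          # LiDAR frame -> camera frame
uv = (K @ cam_pts.T).T
uv = uv[:, :2] / uv[:, 2:3]                    # pinhole projection to pixels

# The target's detected bounding box in the image (assumed, in pixels).
u_min, u_max, v_min, v_max = 250, 390, 190, 290
inside = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
          (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max))
print(f"{inside.sum()}/{len(uv)} projected target points land on the target")
```

If the estimated extrinsics are accurate, the projected LiDAR points of the target fall inside the target's image region; the fraction that miss gives a simple measure of calibration error.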

Description

Technical Field

[0001] This specification relates generally to vehicle operation, and specifically to camera-to-LiDAR (Light Detection and Ranging) calibration and validation.

Background

[0002] Many robotic tasks rely on sensor fusion to overcome the shortcomings of individual sensors. Autonomous vehicles operate, for example, by fusing complementary sensory information obtained from on-board sensors. Within the perception pipeline, the dense 2D color, appearance, and texture information perceived by the vehicle's cameras is correlated with the sparse 3D depth and structure information provided by the vehicle's Light Detection and Ranging (LiDAR) system. This enables autonomous vehicles to generate a deeper understanding of their surroundings.

[0003] Fusion of multimodal sensor data requires that all sensor information be represented relative to a common coordinate system. Therefore, the exact pose (orientation and translation) of each sensor mounted to the ...
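The common-coordinate-system requirement in [0003] can be sketched with homogeneous transforms. The mounting poses below are hypothetical illustration values, not this patent's configuration: each sensor's pose is a 4x4 transform from the sensor frame to a shared vehicle frame, and the camera-to-LiDAR extrinsics are obtained by composing them.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed sensor mounting poses relative to the vehicle frame (metres).
T_vehicle_lidar = pose(np.eye(3), np.array([1.5, 0.0, 1.8]))  # roof LiDAR
T_vehicle_cam = pose(np.eye(3), np.array([1.7, 0.0, 1.4]))    # windshield cam

# Composed extrinsics: map LiDAR-frame points into the camera frame.
T_cam_lidar = np.linalg.inv(T_vehicle_cam) @ T_vehicle_lidar

p_lidar = np.array([10.0, 0.0, 0.0, 1.0])  # homogeneous point, LiDAR frame
p_cam = T_cam_lidar @ p_lidar
print(p_cam[:3])
```

It is exactly this composed transform T_cam_lidar — the rotational and translational extrinsic parameters — that the disclosed pipeline estimates and validates, since small errors in it misregister the fused data.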

Claims

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/80; G06K9/46; G06K9/62
CPC: G06T7/80; G06T2207/10028; G06T2207/30244; G06V10/44; G06F18/23; G06F18/241; G01S17/931; G01S17/86; G01S17/89; G01S7/497; G01S7/4808; G06T2207/30252; G06V20/56; G06V10/7635; G06F18/2323; G06F18/251; B60W60/00; G01S7/48; H04N7/18; G06T2207/30248; H04N23/57; G06F18/22
Inventors: P·A·迪德里希斯, M·迪·奇科, J·S·陈, A·J·奥米勒, F·A·S·鲁伊斯
Owner: MOTIONAL AD LLC