An asynchronous on-line calibration method for multi-sensor fusion

A multi-sensor fusion calibration technology, applied in the field of asynchronous online calibration for multi-sensor fusion, which addresses the problems of strict time-synchronization requirements and a complex calibration process.

Active Publication Date: 2019-03-29
HANGZHOU HUICUI INTELLIGENT TECH CO LTD


Problems solved by technology

[0004] In order to overcome the shortcomings of the existing calibration methods, which are limited to calibration between two sensors, require strict time synchronization, and involve a relatively complicated process, the present invention proposes an asynchronous online calibration method for multi-sensor fusion.

Method used



Embodiment Construction

[0072] The present invention will be further described below in conjunction with the accompanying drawings.

[0073] Referring to Figure 1 to Figure 4, an asynchronous online calibration method for multi-sensor fusion comprises the following steps:

[0075] 1) Calculate the rotation of the camera

[0076] Let P be a point in the camera coordinate system with coordinates [x y z]^T, and let p_1 and p_2 be the projections of P in frames F_i and F_j, where C_i and C_j are the camera coordinate systems at time i and time j respectively. Using the pinhole camera model, the following expressions, valid up to scale, are obtained:

[0077] p_1 = KP

[0078] p_2 = K(RP + t)    (1.1)

[0079] where K is the camera intrinsic matrix, and R and t are the rotation matrix and translation vector from C_i to C_j. Rearranging equation (1.1) yields the epipolar constraint:

[0080] p_2^T K^{-T} [t]_x R K^{-1} p_1 = 0

[0081] ...
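The pinhole projections and epipolar constraint above can be checked numerically. The sketch below, using an assumed intrinsic matrix K and an assumed relative motion (R, t), projects a point into both frames and verifies that the epipolar residual is zero; all numeric values are illustrative, not taken from the patent.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]_x such that [t]_x v = t x v."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# assumed camera intrinsics (focal length 500 px, principal point 320, 240)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])

# assumed relative motion from C_i to C_j: 0.1 rad rotation about z, small translation
a = 0.1
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a), np.cos(a), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([0.2, 0.0, 0.1])

P = np.array([1.0, 0.5, 4.0])     # point in C_i
p1 = K @ P                        # projection in F_i, up to scale (eq. p_1 = KP)
p2 = K @ (R @ P + t)              # projection in F_j, up to scale (eq. 1.1)

E = skew(t) @ R                               # essential matrix
F = np.linalg.inv(K).T @ E @ np.linalg.inv(K) # fundamental matrix K^{-T} E K^{-1}
residual = p2 @ F @ p1                        # epipolar constraint, should be ~0
```

In practice p_1 and p_2 come from feature matches rather than a known 3D point, and R is recovered by estimating E from many such correspondences and decomposing it.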



Abstract

An asynchronous on-line calibration method for multi-sensor fusion is disclosed. The mutual conversion relationships between the coordinate systems of three sensors are calibrated, unifying the measurement information of the multiple sensors into the same coordinate system. Traditional calibration methods are limited to calibration between two sensors and require strict time synchronization, making the calibration process relatively complex. The invention calculates the motion of the laser radar, the camera, and the inertial measurement unit (IMU) separately, obtains the motion of the different sensors over the same time interval by linear interpolation, and finally obtains the extrinsic rotation between each pair of sensors by aligning the rotation sequences of the three sensors. The invention can obtain an accurate extrinsic orientation between the lidar, the camera, and the IMU, and provides an initial value for subsequent calibration.
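The interpolation step described above, obtaining the motion of different sensors over the same time interval, can be sketched as follows. A common choice for interpolating rotations is spherical linear interpolation (slerp) on unit quaternions; the timestamps, quaternion values, and the quat_slerp helper below are illustrative assumptions, not the patent's exact procedure.

```python
import numpy as np

def quat_slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions (w, x, y, z) at fraction u."""
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    d = np.dot(q0, q1)
    if d < 0.0:                 # take the shorter arc on the quaternion sphere
        q1, d = -q1, -d
    if d > 0.9995:              # nearly parallel: fall back to normalized lerp
        q = q0 + u * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(d)
    return (np.sin((1.0 - u) * theta) * q0 + np.sin(u * theta) * q1) / np.sin(theta)

# hypothetical asynchronous samples: camera rotation known at t = 0.00 s and t = 0.10 s,
# IMU sampled at t = 0.04 s -- resample the camera rotation to the IMU timestamp
q_t0 = np.array([1.0, 0.0, 0.0, 0.0])                    # identity
q_t1 = np.array([np.cos(0.05), 0.0, 0.0, np.sin(0.05)])  # 0.1 rad about z
u = (0.04 - 0.00) / (0.10 - 0.00)
q_interp = quat_slerp(q_t0, q_t1, u)                     # camera rotation at t = 0.04 s
```

Once all sensors' rotations are resampled to common timestamps in this way, the per-sensor rotation sequences can be aligned to solve for the extrinsic rotation between each pair.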

Description

Technical Field

[0001] The invention relates to technical fields such as robot vision and multi-sensor fusion, and in particular to an asynchronous online calibration method for multi-sensor fusion.

Background Technique

[0002] The fusion of a camera, an inertial measurement unit (IMU), and a lidar is widely used in the field of robot positioning and mapping to achieve more accurate and robust results than a single sensor. A single sensor has limitations: the camera is prone to tracking loss under weak texture and illumination changes, the inertial measurement unit accumulates errors and drift, and the lidar suffers from point cloud distortion. Combining these three sensors can overcome the disadvantages of any single sensor.

[0003] Reliable multi-sensor fusion localization and mapping algorithms rely on accurate extrinsic calibration. By calibrating the mutual conversion relationships between the coordinate systems of the three sensors, the measurement information...


Application Information

IPC(8): G06T7/80; G01S7/497; G01C25/00
CPC: G01C25/005; G01S7/497; G06T7/80
Inventor 张剑华王曾媛吴佳鑫冯宇婷贵梦萍甘雨陈胜勇
Owner HANGZHOU HUICUI INTELLIGENT TECH CO LTD