Calibration method for 3D (three-dimensional) acquisition system

A technology for 3D acquisition systems and their calibration, applied in the field of system calibration, which can solve problems such as low efficiency, slow speed, and unguaranteed calibration accuracy

Active Publication Date: 2013-10-02
TSINGHUA UNIV


Problems solved by technology

[0003] If the depth sensor and the position and attitude sensor, and the image sensor and the position and attitude sensor, are calibrated separately, the process is not only slow, inefficient and inconvenient, but the image information collected by the image sensor also cannot be well matched with the point cloud information recovered from the depth information collected by the depth sensor. It is therefore advantageous to calibrate the external parameters between the depth sensor, the image sensor and the position and attitude sensor simultaneously, in a single calibration process.
[0004] When the external parameters between the depth sensor, the image sensor and the position and attitude sensor are calibrated simultaneously, the limited resolution of the depth sensor means that the point clouds of the calibration object recovered from different depth-sensor acquisitions essentially never contain the same physical points; if these different points are used directly for calibration, the calibration accuracy cannot be guaranteed. For the same reason, the point cloud of the calibration object recovered by the depth sensor cannot be well matched with the image information collected by the image sensor. The present invention therefore provides a method for calibrating the external parameters between the depth sensor, the image sensor and the position and attitude sensor that overcomes these difficulties and achieves fast, convenient and accurate calibration of the extrinsic parameters between the three sensors.
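The joint requirement in [0003]-[0004] can be pictured through the transformation chain that the extrinsic parameters define. The following minimal sketch (Python with NumPy, not from the patent; the matrices and the pinhole intrinsics K are illustrative assumptions) carries a single depth measurement into the world frame and projects the same point into the image; the closing comment notes why calibrating the two extrinsics separately breaks consistency between the two paths.

import numpy as np

def to_homogeneous(R, t):
    # Build a 4x4 rigid transform from a 3x3 rotation and a translation vector.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Extrinsics to be calibrated; identity placeholders stand in for the unknowns.
T_nav_from_depth = to_homogeneous(np.eye(3), np.zeros(3))  # depth sensor -> position/attitude (nav) frame
T_nav_from_cam = to_homogeneous(np.eye(3), np.zeros(3))    # image sensor -> nav frame

# Pose reported by the position and attitude sensor at one acquisition.
T_world_from_nav = to_homogeneous(np.eye(3), np.array([1.0, 2.0, 0.5]))

# One point measured in the depth-sensor frame (homogeneous coordinates).
p_depth = np.array([0.2, 0.1, 3.0, 1.0])

# Depth frame -> nav frame -> world frame.
p_world = T_world_from_nav @ T_nav_from_depth @ p_depth

# Projecting the same world point into the image goes through the camera extrinsic.
K = np.array([[800.0, 0.0, 320.0],   # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
p_cam = np.linalg.inv(T_world_from_nav @ T_nav_from_cam) @ p_world
u, v, w = K @ p_cam[:3]
pixel = (u / w, v / w)

# If the two extrinsics are calibrated separately, small inconsistencies between
# them misalign p_world and pixel, which is why the method calibrates them together.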


Embodiment Construction

[0060] The specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings. The following examples are used to illustrate the present invention, but are not intended to limit its scope.

[0061] In Embodiment 1, the depth sensor is a two-dimensional laser radar; the image sensor is a color CMOS image sensor or a color CCD image sensor; the position and attitude sensor is an integrated navigation system composed of GPS and IMU.

[0062] Step A: Use the 3D acquisition system to perform N (N ≥ 2) acquisitions of the calibration object along different paths; that is, use the depth sensor, the image sensor and the position and attitude sensor mounted on the mobile platform to acquire the calibration object N (N ≥ 2) times along different paths. Each acquisition records a set of depth information of the calibration object obtained by the depth sensor, records one ...
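As a rough illustration of the records that Step A produces for Embodiment 1, the sketch below (Python; the class name, field names and array shapes are hypothetical, not taken from the patent) stores one laser scan, one color image and the synchronized GPS/IMU pose samples per acquisition.

from dataclasses import dataclass
import numpy as np

@dataclass
class AcquisitionRecord:
    depth_scan: np.ndarray   # (M, 2) range/bearing samples from the 2D laser radar
    image: np.ndarray        # (H, W, 3) frame from the color CMOS/CCD image sensor
    positions: np.ndarray    # (K, 3) positions from the GPS/IMU integrated navigation system
    attitudes: np.ndarray    # (K, 3) roll/pitch/yaw samples aligned with `positions`

def collect_records(n_acquisitions: int) -> list[AcquisitionRecord]:
    # Step A gathers N >= 2 such records along different paths around the
    # calibration object; every later step works on this list.
    records = []
    for _ in range(n_acquisitions):
        # Placeholder arrays standing in for real sensor readings.
        records.append(AcquisitionRecord(
            depth_scan=np.zeros((360, 2)),
            image=np.zeros((480, 640, 3), dtype=np.uint8),
            positions=np.zeros((100, 3)),
            attitudes=np.zeros((100, 3)),
        ))
    return records

records = collect_records(n_acquisitions=2)   # N = 2 satisfies N >= 2
assert len(records) >= 2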


Abstract

The invention discloses a calibration method for a 3D (three-dimensional) acquisition system and relates to the field of system calibration. The method comprises the following steps: A) performing multiple acquisitions of a calibration object along different routes and recording the depth information, image information, position information and attitude information of each acquisition; B) obtaining the external parameters between the calibrated depth sensor and the calibrated position and attitude sensor, together with accurate three-dimensional coordinates of the calibration object, by calibrating the three-dimensional coordinates of the calibration object in a local coordinate system; and C) obtaining the external parameters between the calibrated image sensor and the calibrated position and attitude sensor by calibrating the two-dimensional coordinates of the accurate three-dimensional coordinates of the calibration object in the image information against the coordinates of the calibration object observed in the image information. With this method, the external parameters between the depth sensor and the position and attitude sensor and the external parameters between the image sensor and the position and attitude sensor are calibrated simultaneously during a single calibration process, thereby fulfilling the aim of calibrating the 3D acquisition system.
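Steps B and C of the abstract can be read as two least-squares problems, sketched below in Python with SciPy. This is a generic formulation in the spirit of the abstract, not the patent's actual procedure; the residual functions, the centroid-consistency shortcut used for Step B, and all parameter names are assumptions.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def transform(params, points):
    # Apply the rigid transform encoded as (rotation vector[3], translation[3]).
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    return points @ R.T + params[3:6]

# Step B flavour: pick the depth->nav extrinsic so the calibration object
# reconstructed from different acquisitions lands at the same place in the
# world frame (simplified here to agreement of the point-set centroids).
def step_b_residual(params, scans, poses):
    centroids = []
    for scan, (R_wn, t_wn) in zip(scans, poses):
        p_nav = transform(params, scan)      # depth frame -> nav frame (candidate extrinsic)
        p_world = p_nav @ R_wn.T + t_wn      # nav frame -> world frame (recorded pose)
        centroids.append(p_world.mean(axis=0))
    centroids = np.array(centroids)
    return (centroids - centroids.mean(axis=0)).ravel()

# Step C flavour: pick the nav->camera transform so that projecting the refined
# 3D coordinates matches the object's coordinates observed in the image.
def step_c_residual(params, K, points_world, detections_px, pose):
    R_wn, t_wn = pose
    p_nav = (points_world - t_wn) @ R_wn     # world -> nav via the recorded pose
    p_cam = transform(params, p_nav)         # nav -> camera (candidate extrinsic)
    proj = (K @ p_cam.T).T
    return (proj[:, :2] / proj[:, 2:3] - detections_px).ravel()

# Usage with synthetic placeholders (N = 2 acquisitions).
poses = [(np.eye(3), np.zeros(3)),
         (Rotation.from_rotvec([0.0, 0.0, 0.1]).as_matrix(), np.array([1.0, 0.0, 0.0]))]
scans = [np.random.rand(50, 3) for _ in poses]
sol_b = least_squares(step_b_residual, x0=np.zeros(6), args=(scans, poses))

K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
points_world = np.random.rand(20, 3) + np.array([0.0, 0.0, 5.0])
detections_px = np.random.rand(20, 2) * np.array([640.0, 480.0])
sol_c = least_squares(step_c_residual, x0=np.zeros(6), args=(K, points_world, detections_px, poses[0]))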

Description

Technical Field

[0001] The invention relates to the field of system calibration, and in particular to a calibration method for a 3D acquisition system.

Background Technique

[0002] The 3D acquisition system can measure 3D point cloud information and image information of the surrounding environment conveniently, quickly and with high precision. From the collected 3D point cloud information and image information, a 3D color model of the surrounding environment can be established. The resulting 3D color model has a wide range of applications, such as making ordinary maps, 3D maps, surveying and mapping, and urban management. Most of these applications require considerable precision. The 3D acquisition system is mainly composed of a depth sensor, an image sensor and a position and attitude sensor, and therefore the system composed of these sensors must itself have considerable precision. The accuracy of the sys...
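The background's claim that the point cloud and the image information combine into a 3D color model can be illustrated with a short coloring routine: each 3D point is projected into the image through the camera intrinsics and the calibrated extrinsic and takes the color of the pixel it hits. The sketch below is Python with NumPy; K, T_cam_from_world, and the array shapes are assumptions for illustration, not values from the patent.

import numpy as np

def colorize(points_world, image, K, T_cam_from_world):
    # Attach an RGB color to every 3D point that projects inside the image.
    ones = np.ones((points_world.shape[0], 1))
    p_cam = (np.hstack([points_world, ones]) @ T_cam_from_world.T)[:, :3]
    proj = (K @ p_cam.T).T
    uv = np.round(proj[:, :2] / proj[:, 2:3]).astype(int)
    h, w = image.shape[:2]
    valid = (p_cam[:, 2] > 0) & (uv[:, 0] >= 0) & (uv[:, 0] < w) \
            & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    colors = np.zeros((points_world.shape[0], 3), dtype=image.dtype)
    colors[valid] = image[uv[valid, 1], uv[valid, 0]]
    return colors, valid

# Synthetic usage: a flat gray image and random points in front of the camera.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
image = np.full((480, 640, 3), 128, dtype=np.uint8)
points = np.random.rand(1000, 3) * 2.0 + np.array([0.0, 0.0, 4.0])
colors, valid = colorize(points, image, K, np.eye(4))

# Any error in the calibrated extrinsic shifts uv, so colors are sampled from the
# wrong pixels; this is why the accuracy of the calibrated system matters.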


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00
Inventors: 任仡奕, 周莹, 吕俊宏, 王伟, 谢翔, 李国林, 王志华
Owner: TSINGHUA UNIV