A joint calibration method between 360-degree panoramic laser and multiple vision systems

A joint calibration technology for vision systems, applied in the field of environment perception

Active Publication Date: 2017-06-23
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] The problem solved by the present invention is automatic joint calibration between a three-dimensional laser ranging system and multiple vision systems. The external parameters of all systems can be computed conveniently from a single data collection. The method removes the limitation that the laser ranging system and the vision systems cannot be placed too far apart, reduces the requirements on the calibration object and the calibration environment, lessens the influence of laser-point edge effects on the calibration result, and thereby enhances the practicability and accuracy of the calibration method.

Method used



Examples


Embodiment Construction

[0057] To verify the effectiveness of this method, the sensor system constructed as shown in Figure 2 was used to validate the calibration method. The panoramic laser sensor consists of a Hokuyo UTM-30LX laser scanner mounted on a rotating pan/tilt unit. The laser scanner has a planar scanning angle of 0-270 degrees, and the stepping motor of the pan/tilt unit operates in a frequency range of 500-2500 Hz. The motor drives the laser scanner to acquire three-dimensional laser ranging data of the scene. The four vision systems are all ordinary ANC FULL HD 1080P network cameras with a USB 2.0 interface, a 60-degree field of view, and a resolution of 1280×960. The calibration device consists of nine 300 mm × 100 mm sheets of black paper placed at different positions in the scene.

[0058] Pictures of the calibration device in the scene were collected from each of the four vision systems (as shown in Image 3), and the calibration device can be extracted from the pictures using the correspo...



Abstract

The invention provides a joint calibration method between a 360-degree panoramic laser and multiple vision systems, and belongs to the technical field of autonomous environment perception for robots. The method is characterized in that simple black card paper serves as the calibration equipment, enabling fast, simultaneous calibration of the multiple vision systems and the panoramic laser. Because laser beams striking the surface of black card paper yield low-reflectivity returns, feature points belonging to the calibration device can be extracted from the collected three-dimensional laser data through reflectance-map generation, range filtering, binarization, and point clustering, while noise objects in the environment are filtered out by a plane extraction method. An iterative optimization method then solves the correspondence between the three-dimensional feature points from the laser data and the two-dimensional feature points from the image data, yielding the rotation matrix and the translation matrix between the sensors. The method lays a foundation for multi-sensor information fusion and can be used in fields such as scene reconstruction for mobile robots.
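The laser-side extraction steps described in the abstract (keep low-reflectivity returns, binarize, cluster the surviving points) can be sketched as follows. This is a minimal illustration, not the patented implementation: the intensity threshold, cluster radius, minimum cluster size, and the greedy single-linkage clustering are all illustrative assumptions.

```python
import numpy as np

def extract_target_points(points, intensity, intensity_thresh=0.2,
                          cluster_radius=0.05, min_cluster_size=5):
    """Sketch of the target-extraction step: black card paper absorbs the
    beam, so candidate points are the low-intensity returns; they are then
    grouped into clusters by Euclidean proximity."""
    # Binarize on reflected intensity: the calibration paper gives low returns.
    candidates = points[intensity < intensity_thresh]

    # Greedy single-linkage clustering (a simple stand-in for point clustering).
    clusters = []
    unassigned = list(range(len(candidates)))
    while unassigned:
        seed = unassigned.pop(0)
        cluster, frontier = [seed], [seed]
        while frontier:
            cur = frontier.pop()
            remaining = []
            for idx in unassigned:
                if np.linalg.norm(candidates[cur] - candidates[idx]) < cluster_radius:
                    cluster.append(idx)
                    frontier.append(idx)
                else:
                    remaining.append(idx)
            unassigned = remaining
        # Small clusters are treated as noise, not calibration targets.
        if len(cluster) >= min_cluster_size:
            clusters.append(candidates[cluster])
    return clusters

# Synthetic example: two dark patches plus one bright noise return.
points = np.array([[0.01 * i, 0.0, 0.0] for i in range(6)]
                  + [[1.0 + 0.01 * i, 1.0, 1.0] for i in range(6)]
                  + [[5.0, 5.0, 5.0]])
intensity = np.array([0.1] * 12 + [0.9])
clusters = extract_target_points(points, intensity)
```

In the patent's pipeline this step would be followed by plane extraction to reject dark non-planar objects, which is omitted here.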

Description

technical field
[0001] The invention belongs to the technical field of environment perception, relates to data fusion between a three-dimensional laser ranging system and multiple vision systems, and particularly relates to a joint calibration method between the three-dimensional laser ranging system and multiple vision systems.
Background technique
[0002] In complex scenes, a single sensor cannot meet the task requirements of environment perception and scene understanding. Data matching and fusion between multiple sensors is therefore a necessary means of improving the performance of environment perception and scene understanding, and joint calibration between the sensors is its key step. At present, there are a variety of external parameter calibration methods for 3D laser and monocular vision. The most common method uses a black-and-white calibration plate to calibrate the external parameters between the 3D laser and the vision system (Joung J H, AnK H, Kang J W...
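For context on what the "external parameters" relate: they are the rotation R and translation t that map a point from the laser frame into a camera frame; combined with the camera intrinsics, they project a 3D laser return to a pixel. A minimal pinhole-model sketch with illustrative intrinsic and extrinsic values (not taken from the patent):

```python
import numpy as np

def project_laser_point(X_laser, R, t, K):
    """Map a 3D point from the laser frame into camera pixel coordinates
    using extrinsics (R, t) and a pinhole intrinsic matrix K."""
    X_cam = R @ X_laser + t   # rigid transform: laser frame -> camera frame
    x = K @ X_cam             # pinhole projection onto the image plane
    return x[:2] / x[2]       # divide by depth to get pixel coordinates

# Illustrative intrinsics: focal length 800 px, principal point (640, 480).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 480.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                  # identity rotation for this sketch
t = np.array([0.0, 0.0, 0.5])  # camera offset 0.5 m along the optical axis

uv = project_laser_point(np.array([0.0, 0.0, 2.0]), R, t, K)
# a point on the optical axis projects to the principal point (640, 480)
```

Joint calibration methods such as the one described here estimate R and t by matching features (here, the black-paper targets) observed in both the laser data and the images.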

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/80
CPC: G06T7/85
Inventor: 闫飞, 庄严, 金鑫彤, 王伟
Owner: DALIAN UNIV OF TECH