A multi-camera system and lidar joint system and its joint calibration method

A lidar and joint calibration technology, applied in the computer field, that solves the problem that a multi-camera system whose cameras share little or no common field of view cannot complete calibration of the cameras' intrinsic and extrinsic parameters. Effects: the accuracy of calibration is ensured, calibration time and effort are saved, and the calibration steps are simplified.

Active Publication Date: 2021-06-15
Owner: 宁波智能装备研究院有限公司
Cites: 5 · Cited by: 1

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a multi-camera system and lidar joint system and a joint calibration method thereof, to solve the problem that existing multi-camera calibration requires a common field of view: in a multi-camera system whose cameras share little or no common field of view, calibration of the cameras' intrinsic and extrinsic parameters cannot be completed. The invention also simplifies the joint calibration of the multi-camera system and the lidar.

Method used



Examples


Embodiment 1

[0033] As shown in Figures 1-3, the present invention discloses a multi-camera system and lidar joint system. The joint system includes five sets of industrial cameras 3 and one set of lidar 2. The industrial cameras 3 and the lidar 2 are mounted on the fixed bracket 1, and the industrial cameras 3 are distributed at equal intervals around the outside of the lidar 2, with the lidar 2 as the center.

[0034] The lidar 2 is a 2.5D lidar or a 3D lidar, and its vertical field of view is 10°-40°.
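The sensor layout described in paragraphs [0033]-[0034] can be illustrated with a minimal sketch: five cameras placed at equal angular spacing on a circle centered on the lidar. This is a hypothetical illustration only; the camera count and the equicentric arrangement come from the embodiment, while the mounting radius and all names are assumptions, not values from the patent.

# Hypothetical layout sketch: five cameras evenly spaced around the lidar origin.
import math

NUM_CAMERAS = 5                 # five sets of industrial cameras (per Embodiment 1)
RADIUS_M = 0.15                 # assumed distance from lidar center to each camera
LIDAR_VFOV_DEG = (10.0, 40.0)   # vertical field-of-view range stated for the lidar

def camera_mounts(num=NUM_CAMERAS, radius=RADIUS_M):
    """Return (x, y, yaw_deg) for cameras spaced evenly around the lidar origin."""
    mounts = []
    for i in range(num):
        yaw = 360.0 * i / num                 # equal angular spacing around the circle
        x = radius * math.cos(math.radians(yaw))
        y = radius * math.sin(math.radians(yaw))
        mounts.append((x, y, yaw))            # each camera faces outward along its yaw
    return mounts

if __name__ == "__main__":
    for idx, (x, y, yaw) in enumerate(camera_mounts()):
        print(f"camera {idx}: x={x:+.3f} m, y={y:+.3f} m, yaw={yaw:.1f} deg")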

[0035] The present invention also discloses a multi-camera system and lidar joint calibration method, which adopts the above-mentioned joint system, and specifically includes the following steps:

[0036] S1. Install and fix the joint system, turn on the synchronous shooting function of the industrial cameras 3, place the special calibration board outside the joint system, adjust the distance between the special calibration b...
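As far as step S1 is described, it amounts to capturing synchronized frames in which adjacent cameras both see (part of) the calibration board. The following is a minimal sketch of that check, assuming a standard checkerboard stands in for the patent's special calibration board and that one synchronized frame per camera is already available; the helper names are hypothetical, while the corner-detection calls are standard OpenCV.

# Minimal sketch of the board-visibility check behind step S1 (assumptions noted above).
import cv2

BOARD_SIZE = (9, 6)  # inner-corner grid of the assumed stand-in checkerboard

def detect_board(image_bgr):
    """Return detected board corners, or None if the board is not visible."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, BOARD_SIZE)
    if not found:
        return None
    # Refine corner locations to sub-pixel accuracy for a more reliable calibration.
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    return corners

def board_seen_by_adjacent_pair(frame_a, frame_b):
    """True when two adjacent cameras both see the board in the same synchronized shot."""
    return detect_board(frame_a) is not None and detect_board(frame_b) is not None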



Abstract

The invention relates to the field of computer technology and discloses a joint calibration method for a multi-camera system and a lidar. By using a specially generated calibration board, the intrinsic and extrinsic parameters of multiple cameras can be obtained as long as each pair of adjacent cameras can capture a part of the calibration board at the same time. When the cameras of the multi-camera system meet this requirement, the calibration of the multi-camera system can be completed, and it can then be combined with the calibration of a single camera and the lidar to complete the joint calibration of the multi-camera system and the lidar. This saves the steps of collecting data and selecting point clouds for the lidar and each individual camera, simplifies the calibration procedure, saves calibration time and effort while ensuring accuracy, and solves the problem that calibration cannot be completed when adjacent cameras in a multi-camera system have no common field of view or only a very small one.
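The abstract's core idea is that pairwise extrinsics between adjacent cameras, obtained from their shared views of the calibration board, can be composed with a single camera-to-lidar extrinsic so that every camera is expressed in the lidar frame without calibrating each one against the lidar directly. The sketch below shows only that composition step; the 4x4 transforms and the function names are illustrative placeholders, not the patent's actual procedure or values.

# Hedged sketch of chaining pairwise camera extrinsics with one camera-to-lidar extrinsic.
import numpy as np

def compose(*transforms):
    """Compose 4x4 homogeneous transforms left to right."""
    out = np.eye(4)
    for t in transforms:
        out = out @ t
    return out

def cameras_in_lidar_frame(T_lidar_cam0, T_cam_prev_cam):
    """T_lidar_cam0: pose of camera 0 in the lidar frame (single camera-lidar calibration).
    T_cam_prev_cam: list where entry i is the pose of camera i+1 in the frame of camera i,
    obtained from the shared view of the calibration board between adjacent cameras."""
    poses = [T_lidar_cam0]
    for T_step in T_cam_prev_cam:
        poses.append(compose(poses[-1], T_step))
    return poses

if __name__ == "__main__":
    identity = np.eye(4)
    # With identity placeholders, all cameras coincide with the lidar frame.
    for i, T in enumerate(cameras_in_lidar_frame(identity, [identity] * 4)):
        print(f"camera {i} pose in lidar frame:\n{T}\n")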

Description

Technical Field

[0001] The invention relates to the field of computer technology, in particular to a multi-camera system and lidar joint system and a joint calibration method thereof.

Background Technique

[0002] Most of the information that humans receive while driving comes from vision, such as traffic signs, road markings, and traffic signals; this visual information is the main basis on which human drivers make decisions to control their vehicles. In autonomous driving, the camera replaces the human visual system as one of the sensors for perceiving the traffic environment. Compared with other sensors, a camera is simple to install and use, captures a large amount of image information, has a low cost, and covers a wide range. However, a camera is easily affected by lighting, and its imaging quality is poor where the light is weak or changes suddenly. Another sensor commonly used in autonomous driving is the lidar. Lidar works ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01S 7/497; G06T 7/80
CPC: G01S 7/497; G06T 7/80
Inventors: 于兴虎, 王晨宇
Owner: 宁波智能装备研究院有限公司