
A registration method of event triggered camera and three-dimensional radar

A technology relating to event-triggered cameras and three-dimensional laser radar, applied in image data processing, instruments, computation, etc. It solves problems such as the lack of a good quantitative scheme for evaluating registration results, the inability to judge the accuracy and error of a registration, and the inability to obtain feature points from an event-triggered camera; its effects include improved noise tolerance, avoidance of registration failure, and avoidance of edge blur or added noise.

Active Publication Date: 2019-01-15
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0006] The disadvantages of the above-mentioned prior art are as follows: there is no method for calibrating the intrinsic parameters of an event-triggered camera using static markers; the prior art registration methods were developed for ordinary RGB cameras, and there is no marker suitable for automatic registration of an event-triggered camera and a laser radar; the images and point clouds used for registration in the prior art are multi-frame, and there is no method that achieves registration from single-frame data; and there is no good quantitative scheme for evaluating registration results, so the accuracy and error of a registration cannot be judged.
[0007] The reasons for these disadvantages are as follows. The cameras applicable to the prior art are all ordinary RGB color cameras, which can image static objects; the event-triggered camera used in the present invention, however, cannot image a static object whose surface light intensity does not change, so the usual static checkerboard calibration pattern cannot be used. The feature-point correspondence methods of the prior art rely on active imaging by an ordinary camera, and cannot match feature points from an event-triggered camera against the point cloud. The markers used in the prior art give low feature-extraction robustness in both the lidar point cloud and the camera image, so multi-frame data acquired from different positions must be used to improve accuracy. Finally, the prior art does not make full use of the transformation results after registration for quantitative error statistics.




Embodiment Construction

[0058] The accompanying drawings are for illustrative purposes only and should not be construed as limiting this patent. To better illustrate this embodiment, certain components in the drawings may be omitted, enlarged, or reduced, and do not represent the size of the actual product. Those skilled in the art will understand that some well-known structures and their descriptions may be omitted from the drawings. The positional relationships depicted in the drawings are for illustrative purposes only and should not be construed as limiting this patent.

[0059] The technical problem mainly solved by the present invention is the automatic registration of an event-triggered camera and a three-dimensional laser radar. A registration method based on the image of an event-triggered camera and the point cloud of a three-dimensional laser radar is proposed, which uses image processing methods such as edge extraction and specific pattern recognition...
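To make the image-processing step concrete, the following is a minimal sketch of locating a calibration-object point by edge extraction, assuming the event stream has already been accumulated into an 8-bit grayscale frame. The use of OpenCV, the Canny thresholds, and the largest-contour heuristic are illustrative assumptions, not details taken from the patent.

```python
import cv2
import numpy as np

def locate_marker_in_event_image(event_frame: np.ndarray) -> np.ndarray:
    """Locate a calibration-object point in an accumulated event-camera frame.

    event_frame: 8-bit grayscale image built by accumulating events.
    Returns the (x, y) centroid of the largest detected contour, a stand-in
    for the "specific pattern recognition" step named in the patent text.
    """
    # Edge extraction; the thresholds are illustrative, not from the patent.
    edges = cv2.Canny(event_frame, 50, 150)

    # Find external contours among the edges and keep the largest one,
    # assuming the calibration object dominates the scene.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise ValueError("no contour found; marker not visible?")
    largest = max(contours, key=cv2.contourArea)

    # Centroid of the contour serves as the located marker point.
    m = cv2.moments(largest)
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
```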



Abstract

The present invention relates to the technical field of image processing, point cloud processing, and sensor data registration, and more particularly to a registration method for an event-triggered camera and a three-dimensional laser radar. The method comprises: 1) designing a calibration object suitable for registration of an event-triggered camera and a three-dimensional laser radar; 2) starting the event-triggered camera and the three-dimensional laser radar simultaneously to obtain data from both sensors at the same time; 3) locating a point of the calibration object in the image using image processing methods such as edge extraction and specific pattern recognition; 4) locating the same point of the calibration object in the point cloud using RANSAC-based point cloud segmentation; 5) calculating a transformation matrix over the six spatial degrees of freedom from the results of steps 3 and 4; and 6) evaluating the registration result by the registration error and the edge-based value function proposed by the invention.
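Steps 4 through 6 of the abstract correspond to standard building blocks, sketched below under stated assumptions: a plain NumPy RANSAC plane fit stands in for the patent's point cloud segmentation, OpenCV's solvePnP stands in for the six-degree-of-freedom solve, and an RMS reprojection error stands in for the evaluation step. The patent's own edge-based value function is not reproduced here, and all thresholds and parameter choices are illustrative.

```python
import cv2
import numpy as np

def ransac_plane(points: np.ndarray, iters: int = 500,
                 dist_thresh: float = 0.02) -> np.ndarray:
    """Step 4 (sketch): segment the calibration plane from the lidar cloud.

    points: (N, 3) array of lidar points. Returns the inliers of the best
    plane. A minimal RANSAC loop; threshold and iteration count are
    illustrative, not values from the patent.
    """
    best_inliers = np.zeros(len(points), dtype=bool)
    rng = np.random.default_rng(0)
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - sample[0]) @ normal)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return points[best_inliers]

def solve_extrinsics(object_pts: np.ndarray, image_pts: np.ndarray,
                     K: np.ndarray):
    """Steps 5-6 (sketch): 6-DoF transform and a reprojection error.

    object_pts: (M, 3) calibration-object points in the lidar frame (step 4).
    image_pts:  (M, 2) corresponding pixels in the event image (step 3).
    K: 3x3 camera intrinsic matrix, assumed already calibrated. Needs M >= 4.
    """
    ok, rvec, tvec = cv2.solvePnP(object_pts.astype(np.float64),
                                  image_pts.astype(np.float64), K, None)
    if not ok:
        raise RuntimeError("PnP failed")
    # Registration error: RMS distance between the observed pixels and the
    # lidar points reprojected with the estimated transform.
    proj, _ = cv2.projectPoints(object_pts.astype(np.float64),
                                rvec, tvec, K, None)
    err = np.sqrt(np.mean(np.sum((proj.reshape(-1, 2) - image_pts) ** 2,
                                 axis=1)))
    return rvec, tvec, err
```

In this sketch the correspondences are assumed to come from locating the same calibration-object points in the event image (step 3) and in the segmented plane (step 4); how the patent establishes those correspondences on its specific calibration object is not disclosed in the visible text.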

Description

technical field

[0001] The present invention relates to the technical field of image processing, point cloud processing, and sensor data registration, and more specifically, to a registration method between an event-triggered camera and a three-dimensional laser radar.

Background technique

[0002] Computer vision algorithms based on ordinary RGB cameras have gradually matured and have been applied in areas such as autonomous driving and object recognition. Lidar collects 3D spatial information about the surrounding environment with laser beams and plays an irreplaceable role in sensor suites for autonomous driving.

[0003] A vision solution using only a camera is affected by ambient lighting conditions, while a solution using only a lidar is powerless to recognize planar environmental information such as road signs and traffic lights. The fusion of cameras and lidar has therefore become a key idea for solving the problem. Before fusion, what sh...


Application Information

IPC(8): G06T7/33; G06T7/80; G06T7/155
CPC: G06T7/155; G06T7/33; G06T7/80; G06T2207/10044; G06T2207/10028
Inventors: 黄凯, 宋日辉, 江志华, 李洋灏
Owner: SUN YAT SEN UNIV