
A method for object recognition and registration based on event triggered camera and three-dimensional laser radar fusion system

An event-triggered camera and three-dimensional lidar technology, applied in image data processing, instruments, computing, and related fields. It addresses problems such as sparse point clouds, inability to recognize distant targets, and reduced target-recognition accuracy, achieving low data redundancy, stable and reliable distance and positioning information, and cost-effectiveness.

Active Publication Date: 2019-01-04
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

The point cloud produced by a lidar becomes sparser the farther a target is from the sensor, which makes distant targets unrecognizable and thereby reduces the accuracy of the final target recognition.
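As a rough illustration of this falloff (a sketch, not from the patent; the resolution values are assumed for a typical 16-beam spinning lidar), the number of returns on a fixed-size target shrinks roughly quadratically with range because the beams diverge at a fixed angular resolution:

```python
import math

# Assumed angular resolutions of a typical 16-beam spinning lidar
# (illustrative values, not taken from the patent).
H_RES_DEG = 0.2   # horizontal resolution, degrees
V_RES_DEG = 2.0   # vertical resolution, degrees

def points_on_target(width_m, height_m, range_m):
    """Approximate number of lidar returns on a flat target facing the sensor."""
    # Angular extent the target subtends at this range.
    h_angle = math.degrees(2 * math.atan(width_m / (2 * range_m)))
    v_angle = math.degrees(2 * math.atan(height_m / (2 * range_m)))
    return int(h_angle / H_RES_DEG) * int(v_angle / V_RES_DEG)

for r in (10, 30, 60):
    print(f"{r:3d} m: ~{points_on_target(1.7, 1.5, r)} points on a car-sized target")
```

With these assumed values, a car-sized target yields a few hundred returns at 10 m but may be missed entirely by the sparse vertical beams at 60 m, which is exactly the recognition gap the camera is meant to fill.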




Embodiment Construction

[0052] The accompanying drawings are for illustrative purposes only and should not be construed as limiting this patent. To better illustrate this embodiment, certain components in the drawings are omitted, enlarged, or reduced and do not represent the dimensions of the actual product. Those skilled in the art will understand that some well-known structures and their descriptions may be omitted from the drawings. The positional relationships depicted in the drawings are likewise for illustrative purposes only and should not be construed as limiting this patent.

[0053] As shown in Figure 1, the present invention is based on an event-triggered camera and three-dimensional lidar fusion system, which can accurately classify and locate objects in real time.

[0054] Step 1. The event-triggered camera and the 3D lidar each collect data.
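For context (a sketch, not the patent's code; all field names here are assumptions): an event camera reports asynchronous per-pixel brightness changes as (x, y, timestamp, polarity) tuples, while the lidar delivers a point cloud of 3D returns per sweep. A minimal Python representation of the two data streams might look like this:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds
    polarity: int   # +1 brightness increase, -1 decrease

@dataclass
class LidarPoint:
    x: float        # metres, sensor frame
    y: float
    z: float
    intensity: float

# One fusion cycle would consume a batch of events accumulated over a
# short time window plus the lidar sweep closest in time to that window.
EventBatch = List[Event]
PointCloud = List[LidarPoint]
```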

[0055] Step 2. Data fusion of the two sensors is performed through the internal cam...
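The truncated step presumably refers to the camera's internal (intrinsic) calibration. A common way to fuse such sensors (a generic sketch under that assumption, not necessarily the patent's exact procedure) is to project each lidar point into the image plane using the camera intrinsic matrix K and the extrinsic rotation and translation between the two sensors:

```python
import numpy as np

def project_lidar_to_image(points, K, R, t):
    """Project Nx3 lidar points (lidar frame) into pixel coordinates.

    K : 3x3 camera intrinsic matrix (assumed known from calibration)
    R : 3x3 rotation,  t : 3-vector translation, lidar -> camera frame
    Returns (u, v) pixel coordinates and the depth of each point in
    front of the camera; points behind the camera are dropped.
    """
    pts_cam = points @ R.T + t          # transform into the camera frame
    in_front = pts_cam[:, 2] > 0        # keep points with positive depth
    pts_cam = pts_cam[in_front]
    uvw = pts_cam @ K.T                 # apply intrinsics
    uv = uvw[:, :2] / uvw[:, 2:3]       # perspective divide
    return uv, pts_cam[:, 2]

# Example with made-up calibration values:
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, -0.08, -0.1])
uv, depth = project_lidar_to_image(np.random.rand(100, 3) * [10, 4, 20], K, R, t)
```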



Abstract

The present invention relates to the technical fields of deep learning, image processing, and three-dimensional point cloud processing, and more particularly to an object recognition and registration method under an event-triggered camera and three-dimensional laser radar fusion system. The invention provides a system for fusing the data of an event-triggered camera with the data of a lidar. Generic object detection is carried out by a YOLOv3 deep-learning neural network, and a minimum filter is used to fuse the lidar depth information, so that objects and their depth information are detected accurately and in real time.
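As an illustration of how a minimum filter can attach depth to a detection (a sketch with assumed inputs; the abstract names YOLOv3 and a minimum filter, but this exact code is not from the patent): take the lidar depths that project inside each detected bounding box and use their minimum, which favours the nearest surface, i.e. the object rather than the background:

```python
import numpy as np

def box_depth_min_filter(uv, depth, box):
    """Estimate object depth as the minimum lidar depth inside a box.

    uv    : Nx2 projected lidar pixel coordinates
    depth : N   corresponding depths in metres
    box   : (x1, y1, x2, y2) detector bounding box, e.g. from YOLOv3
    Returns the minimum depth, or None if no lidar point falls inside.
    """
    x1, y1, x2, y2 = box
    inside = (uv[:, 0] >= x1) & (uv[:, 0] <= x2) & \
             (uv[:, 1] >= y1) & (uv[:, 1] <= y2)
    return float(depth[inside].min()) if inside.any() else None
```

Taking the minimum rather than the mean guards against background points that fall inside the box inflating the estimate, at the cost of some sensitivity to foreground outliers.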

Description

Technical Field

[0001] The present invention relates to the technical fields of deep learning, image processing, and three-dimensional point cloud processing, and more specifically to an object recognition and registration method based on an event-triggered camera and three-dimensional laser radar fusion system.

Background

[0002] Both traditional RGB cameras and lidar are commonly used sensors on autonomous vehicles. However, an RGB camera cannot accurately obtain depth information, and the point cloud obtained by a lidar becomes sparse as distance increases, so fusing the two sensors into one system to compensate for each other's shortcomings is very important. For identifying and classifying objects, fused systems generally find a region of interest in the three-dimensional lidar data and perform object identification and classification on the other sensor's data corresponding to t...


Application Information

IPC(8): G06T7/30, G06T7/80
CPC: G06T7/30, G06T7/80, G06T2207/20104
Inventors: 黄凯, 宋日辉, 李洋灏, 江志华
Owner: SUN YAT SEN UNIV