Real-time 3D target detection method for unmanned driving platforms based on camera and lidar

A lidar and unmanned-driving technology in the field of computer vision. It addresses the low spatial-positioning accuracy of image-based detection and the difficulty of determining object categories from point-cloud detection, achieving accurate spatial positioning and a low missed-detection rate.

Pending Publication Date: 2020-03-13
NANJING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0016] The purpose of the present invention is to provide a real-time 3D detection method based on cameras and lidar, applied to unmanned driving platforms and related autonomous mobile platforms, which overcomes the low spatial-positioning accuracy of image detection and the difficulty of determining object categories from point-cloud detection.



Examples


Example Embodiment

[0030] The present invention will be further described in detail below in conjunction with the accompanying drawings.

[0031] With reference to figure 1, the real-time 3D target detection method for unmanned driving platforms based on camera and lidar described herein first performs spatio-temporally synchronized, pixel-level fusion of the raw camera and lidar data. A lidar data-analysis method is then applied to the synchronized data to obtain clustering detection results, and an improved Faster R-CNN network architecture is constructed for parameter training and real-time detection. The method outputs, for each target object around the unmanned driving platform, its type, length, width, and height, the distance of its center point relative to the platform's spatial coordinate frame, and its yaw, roll, and pitch angles. By combining traditional clustering with an artificial-intelligence fusion algorithm, the invention overcomes the low spatial-positioning accuracy of image detection and the difficulty of judging categories from point-cloud detection.
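The pixel-level fusion step presupposes calibrated sensors: each lidar point is transformed into the camera frame and projected onto the image plane so that pixels and points can be associated. The patent does not give the projection details; the sketch below shows the standard pinhole projection, where the extrinsic transform `T_cam_lidar` and intrinsic matrix `K` are assumed calibration inputs (the function name and signature are illustrative, not from the patent).

```python
import numpy as np

def project_lidar_to_image(points_xyz, T_cam_lidar, K):
    """Project lidar points (N, 3) into the camera image plane.

    T_cam_lidar: 4x4 extrinsic transform (lidar frame -> camera frame).
    K: 3x3 camera intrinsic matrix.
    Both are assumed to come from a prior calibration step.
    Returns pixel coordinates, depths, and a mask of points in front
    of the camera.
    """
    n = points_xyz.shape[0]
    pts_h = np.hstack([points_xyz, np.ones((n, 1))])   # homogeneous (N, 4)
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]         # lidar -> camera frame
    in_front = pts_cam[:, 2] > 0                       # keep points ahead of the camera
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T                             # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]                        # normalize by depth
    return uv, pts_cam[:, 2], in_front
```

With an identity extrinsic and intrinsics `f = 100`, `(cx, cy) = (50, 40)`, a point at `(0, 0, 10)` projects to the principal point `(50, 40)` at depth 10, while points behind the camera are masked out.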



Abstract

The invention discloses a real-time target detection system for unmanned driving platforms based on a camera and a lidar. Spatio-temporally synchronized, pixel-level fusion is performed on the raw data of the camera and the lidar; a clustering detection result is obtained by combining the synchronized data with a lidar data-analysis method; and an improved Faster R-CNN network architecture is constructed for parameter training and real-time detection. The system outputs the type, length, width, and height of each target object around the unmanned driving platform, the distance of its center point relative to the platform's spatial coordinate frame, and its yaw, roll, and pitch angles. By adopting a fusion of traditional clustering and artificial-intelligence algorithms, the system and its detection method overcome the low spatial-positioning accuracy of image detection and the difficulty of judging categories from point-cloud detection, realizing a real-time camera- and lidar-based 3D target detection system for unmanned-driving-platform scenarios.
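The abstract's "clustering detection result" is obtained by grouping lidar returns into object candidates. The patent does not name the clustering algorithm; a common choice for this step is Euclidean (region-growing) clustering, sketched below. The `radius` and `min_size` parameters are assumed tuning knobs, and the brute-force neighbor search stands in for the KD-tree lookup a real-time system would use.

```python
import numpy as np

def euclidean_cluster(points, radius=0.5, min_size=3):
    """Group lidar points (N, 3) into clusters by Euclidean proximity.

    Illustrative region-growing sketch: seed a cluster, repeatedly absorb
    all unvisited points within `radius` of any cluster member, and keep
    clusters with at least `min_size` points.
    """
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        frontier, cluster = [seed], [seed]
        while frontier:
            i = frontier.pop()
            # Brute-force neighbor query; a KD-tree would replace this
            # for real-time operation on full lidar sweeps.
            d = np.linalg.norm(points - points[i], axis=1)
            near = [j for j in list(unvisited) if d[j] <= radius]
            for j in near:
                unvisited.discard(j)
            frontier.extend(near)
            cluster.extend(near)
        if len(cluster) >= min_size:
            clusters.append(sorted(cluster))
    return clusters
```

Each returned cluster is a candidate object region that can then be matched against the image-side Faster R-CNN detections to assign a category.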

Description

Technical field

[0001] The invention belongs to the field of computer vision, and in particular relates to a real-time 3D detection method for an unmanned driving platform based on a camera and a lidar.

Background technique

[0002] Since the development of driverless-platform vehicle technology, a model has basically been formed comprising multi-sensor information fusion, high-precision mapping and positioning, environmental perception, decision-making and path planning, and low-level vehicle control. Among these, environmental perception is the basis and premise of safe driving: driving-strategy formulation, path planning, and low-level vehicle control all directly depend on highly robust and accurate perception.

[0003] Existing single-sensor perception and detection systems fall mainly into three categories: camera, lidar, and millimeter-wave radar. The perception and detection syste...


Application Information

IPC(8): G01S17/931
CPC: Y02A90/10
Inventors: 刘雨晨, 唐兴, 苏岩
Owner: NANJING UNIV OF SCI & TECH