
Target detection method and target detection system of visual radar spatial and temporal information fusion

A target detection and radar technology, applied to radio wave measurement systems, measurement devices, radio wave reflection/re-radiation, etc. It addresses problems such as low recognition accuracy, and achieves more comprehensive data and improved accuracy.

Active Publication Date: 2018-01-12
苏州驾驶宝智能科技有限公司
Cites 5 · Cited by 54

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is that, in the prior art, a convolutional neural network trained on RGB-D data has low recognition accuracy and a short recognition range. The invention therefore provides a target detection method and system based on the fusion of visual and radar spatio-temporal information, which offers a long recognition distance and high classification accuracy.



Examples


Specific Embodiment

[0071] Vision and radar sensors on an unmanned vehicle are used to collect a dataset consisting of RGB images and their corresponding depth maps. A color camera mounted on the vehicle collects the RGB images, and a Velodyne HDL-64E lidar collects the radar 3D point cloud data; the positions of the two sensors have been calibrated.
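
The patent states only that the camera and lidar positions have been calibrated; it does not give the projection code. The sketch below shows one plausible way to turn a Velodyne point cloud into the discretized grayscale LIDAR depth map mentioned in the abstract, assuming a 3x4 camera projection matrix P_cam obtained from that calibration (the matrix name and the grayscale encoding are assumptions, not the patent's method).

```python
import numpy as np

def lidar_to_depth_map(points_xyz, P_cam, img_h, img_w, max_depth=80.0):
    """Project 3D lidar points into the camera image and rasterize a sparse
    grayscale depth map. P_cam is an assumed 3x4 lidar-to-image projection
    matrix derived from the camera/lidar calibration."""
    # Homogeneous lidar coordinates (N, 4)
    pts = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])
    proj = pts @ P_cam.T                      # (N, 3): [u*z, v*z, z]
    z = proj[:, 2]
    keep = z > 0.1                            # keep points in front of the camera
    u = (proj[keep, 0] / z[keep]).astype(int)
    v = (proj[keep, 1] / z[keep]).astype(int)
    z = z[keep]

    depth = np.zeros((img_h, img_w), dtype=np.float32)
    inside = (u >= 0) & (u < img_w) & (v >= 0) & (v < img_h)
    # Discretize depth into 8-bit grayscale; closer points appear brighter
    depth[v[inside], u[inside]] = np.clip(255.0 * (1.0 - z[inside] / max_depth), 0, 255)
    return depth.astype(np.uint8)
```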

[0072] A total of 7,481 RGB images and corresponding radar 3D point clouds were collected. Using the above method, 6,843 RGB-LIDAR spatio-temporal fusion pictures and labels were produced (1,750 cars, 1,750 pedestrians, 1,643 trucks, 1,700 bicycles). Of these, 5,475 were used for training and 1,368 for testing, to evaluate the effect of multi-task classification based on the fusion of visual and radar spatio-temporal information.
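
As a minimal sketch of how the reported 5,475 / 1,368 split of the 6,843 fused samples could be reproduced (the shuffling, the fixed seed, and the function name are assumptions, not part of the patent):

```python
import random

def split_dataset(samples, n_train=5475, seed=0):
    """Split the 6,843 fused RGB-LIDAR samples into the 5,475-sample training
    set and the 1,368-sample test set described in paragraph [0072]."""
    rng = random.Random(seed)
    shuffled = list(samples)
    rng.shuffle(shuffled)
    return shuffled[:n_train], shuffled[n_train:]
```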

[0073] The convolutional neural network shown in Figure 9 is used as the classification model. The model has six convolutional layers and three fully connected ...
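
Paragraph [0073] states only that the Figure 9 network has six convolutional layers and three fully connected layers. The PyTorch sketch below reproduces such a topology; the channel widths, kernel sizes, pooling, 4-channel RGB-LIDAR input, and four-class output are all assumptions rather than the patent's actual architecture.

```python
import torch.nn as nn

class FusionNet(nn.Module):
    """Hypothetical six-conv / three-FC classifier over stacked RGB-LIDAR input."""
    def __init__(self, in_channels=4, num_classes=4):
        super().__init__()
        chans = [in_channels, 32, 64, 64, 128, 128, 256]
        layers = []
        for c_in, c_out in zip(chans[:-1], chans[1:]):        # six conv blocks
            layers += [nn.Conv2d(c_in, c_out, 3, padding=1),
                       nn.ReLU(inplace=True),
                       nn.MaxPool2d(2)]
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Sequential(                      # three FC layers
            nn.Flatten(),
            nn.LazyLinear(512), nn.ReLU(inplace=True),
            nn.Linear(512, 128), nn.ReLU(inplace=True),
            nn.Linear(128, num_classes))  # e.g. car / pedestrian / truck / bicycle

    def forward(self, x):
        return self.classifier(self.features(x))
```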



Abstract

The invention discloses a target detection method and a target detection system based on the fusion of visual and radar spatial and temporal information. The target detection system comprises an acquisition unit, a sampling unit, a superposition unit, a model building unit and an execution unit. The acquisition unit collects RGB image data and 3D point cloud data and computes a discretized LIDAR depth map in grayscale. The sampling unit up-samples and densifies the LIDAR depth map so that the RGB image and the LIDAR depth map have a unified data form and correspond to each other one to one. The superposition unit combines the RGB image and the LIDAR depth map into an RGB-LIDAR picture and superposes RGB-LIDAR pictures collected continuously a number of times (one or more) to obtain a superposed RGB-LIDAR picture. The model building unit builds an RGB-LIDAR data set from the superposed RGB-LIDAR pictures, which is fed into a deep learning network for training and learning to establish a classification model. The execution unit takes corresponding decisions according to the target analysis results from the classification model. In this way, the effects of long-distance recognition and high classification accuracy are achieved.
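
The abstract names the sampling and superposition steps but not their implementations. Below is a minimal NumPy sketch of plausible versions of those two stages; the neighbourhood-filling densification, the 4-channel RGB-LIDAR combination, and the channel-wise temporal stacking are all assumptions.

```python
import numpy as np

def densify_depth(sparse_depth, kernel=5):
    """Fill empty pixels of the sparse lidar depth map from nearby measured
    pixels (a stand-in for the sampling unit's up-sampling/densification;
    the patent does not specify the interpolation method)."""
    h, w = sparse_depth.shape
    pad = kernel // 2
    padded = np.pad(sparse_depth, pad, mode="constant")
    dense = sparse_depth.copy()
    for dy in range(kernel):
        for dx in range(kernel):
            window = padded[dy:dy + h, dx:dx + w]
            dense = np.where(dense == 0, window, dense)  # keep measured values
    return dense

def make_rgb_lidar(rgb, dense_depth):
    """Combine an RGB image (H, W, 3) and a dense depth map (H, W) into one
    4-channel RGB-LIDAR picture."""
    return np.dstack([rgb, dense_depth])

def superpose_frames(rgb_lidar_frames):
    """Superpose N >= 1 consecutively collected RGB-LIDAR pictures; channel-wise
    stacking is assumed here, since the abstract does not fix the operation."""
    return np.concatenate(rgb_lidar_frames, axis=-1)
```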

Description

Technical field

[0001] The invention relates to a target detection method and system based on the fusion of vision and radar information. A depth information picture is fused from the radar information and the visual picture information, and a convolutional neural network is used to learn from the depth picture and establish a classification model.

Background technique

[0002] At present, some unmanned vehicles have appeared which can drive automatically without a driver and can replace manual work such as delivery, pick-up, cleaning or measurement. Unmanned vehicles detect roads and obstacles with on-board sensors, but they cannot identify the various traffic participants on the road, such as vehicles and pedestrians, which may easily cause traffic accidents.

[0003] In order to solve the above problems, Chinese patent document CN105975915A discloses a vehicle multi-parameter recognition method based on a multi-task convolutional neural network, the input...


Application Information

IPC(8): G01S13/86; G06N3/04
Inventor: 赵建辉, 张新钰, 郭世纯
Owner: 苏州驾驶宝智能科技有限公司