
Vehicle target joint cognition method and system based on point cloud and image data

A technology combining image data and point cloud data, applied in the field of vehicle target joint cognition methods and systems, which addresses the low precision of existing methods for unmanned driving and achieves high flexibility and strong generalization ability.

Active Publication Date: 2019-07-12
武汉环宇智行科技有限公司
Cites: 6 · Cited by: 17

AI Technical Summary

Problems solved by technology

[0004] Existing technology can combine optical image detection with three-dimensional point cloud data, matching image features and point cloud features through a convolutional neural network to obtain the target's bounding rectangle for position detection. However, existing methods are suited only to recognizing and detecting car targets, not pedestrians or bicycles, or they do not fully exploit the 3D point cloud data; their accuracy is therefore low, and they cannot be applied directly to unmanned driving.



Examples


Embodiment Construction

[0028] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.

[0029] As shown in Figure 1, the vehicle target joint cognition method based on point cloud and image data of the present invention comprises the following steps:

[0030] Step 1. Obtain the 3D point cloud from the lidar and the planar image from the image sensor; divide the 3D point cloud into a grid to obtain multiple voxels of the same size, and calculate the centroid (mass point) of the 3D points in each voxel;

[0031] Step 2. According to...
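The voxelization of Step 1 can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes axis-aligned voxels of equal size, and the function and parameter names (`voxel_centroids`, `voxel_size`) are hypothetical.

```python
import numpy as np

def voxel_centroids(points, voxel_size=0.2):
    """Assign each 3D point to a fixed-size voxel and return the
    centroid ("mass point") of the points in every occupied voxel."""
    # Integer voxel index for each point along x, y, z.
    idx = np.floor(points / voxel_size).astype(np.int64)
    # Group points that share a voxel index.
    uniq, inverse = np.unique(idx, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    centroids = np.zeros((len(uniq), 3))
    counts = np.zeros(len(uniq))
    np.add.at(centroids, inverse, points)   # sum points per voxel
    np.add.at(counts, inverse, 1)           # count points per voxel
    return centroids / counts[:, None]

# Two points fall in voxel (0,0,0), one in voxel (2,2,2).
pts = np.array([[0.05, 0.05, 0.05],
                [0.15, 0.15, 0.15],
                [0.45, 0.45, 0.45]])
print(voxel_centroids(pts, voxel_size=0.2))
```

Each centroid then stands in for all raw points of its voxel, which is what makes the per-voxel features of later steps tractable.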



Abstract

The invention provides a vehicle target joint cognition method and system based on point cloud and image data. The system comprises a data-level combination module, a deep learning target detection module, and a joint cognition module. The data-level combination module obtains three-dimensional point cloud data and image data and fuses them. The fused data is passed to the deep learning target detection module for feature-level detection and recognition, and a detection result is output. The joint cognition module evaluates the feature-level fusion detection result and the data-level fusion detection result with an evidence theory method, and the resulting reliability distribution serves as the output, improving detection stability and robustness.
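The evidence theory method mentioned in the abstract generally refers to Dempster-Shafer combination. The sketch below shows the standard Dempster rule applied to two detectors' outputs; the hypothesis names and mass values are illustrative assumptions, not taken from the patent.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments given as {frozenset(hypotheses): mass} dicts."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible evidence: mass goes to the intersection
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:      # incompatible evidence accumulates as conflict
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    k = 1.0 - conflict  # normalization factor
    return {s: m / k for s, m in combined.items()}

# Hypothetical masses from the two detection branches.
CAR, PED = frozenset({"car"}), frozenset({"pedestrian"})
THETA = CAR | PED  # full frame of discernment (ignorance)
feature_level = {CAR: 0.7, PED: 0.1, THETA: 0.2}
data_level = {CAR: 0.6, PED: 0.2, THETA: 0.2}
fused = dempster_combine(feature_level, data_level)
print(fused[CAR])  # belief in "car" after fusion
```

When both branches lean toward "car", the combined belief in "car" exceeds either input mass, which is the stability gain the abstract describes.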

Description

technical field

[0001] The invention relates to the field of unmanned driving technology, in particular to a method and system for joint recognition of vehicle targets based on point cloud and image data.

Background technique

[0002] An unmanned vehicle is a vehicle with autonomous driving behavior. On the basis of a traditional vehicle, artificial intelligence modules such as environmental perception, intelligent decision-making, path planning, and behavior control are added, so that it can interact with the surrounding environment and realize the corresponding functions; in essence, it is a mobile wheeled robot capable of decision-making and action.

[0003] Thanks to the rapid development of new sensor technology and deep learning technology, a variety of sensors are used in unmanned driving to perform complete, accurate, robust and real-time perception of the surrounding environment. The process of environment perception mainly includes sensor calibration, structured road detection, unstructured roa...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04
CPC: G06V20/56, G06N3/045, G06F18/24, G06F18/253
Inventors: 李明, 曹晶, 石强, 谢兴
Owner: 武汉环宇智行科技有限公司