
Three-dimensional target sensing method in vehicle-mounted edge scene

A three-dimensional target and scene technology, applied in the field of three-dimensional target perception, which can solve problems such as poor generalization, easily lost target information, and long processing time, and achieve the effects of reduced point cloud processing time and improved real-time performance.

Active Publication Date: 2021-10-15
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to provide a three-dimensional target perception method for the vehicle-mounted edge scene that solves the problems of poor generalization and easy loss of target information in existing point cloud perception methods, and improves real-time detection and tracking accuracy.




Embodiment Construction

[0043] The present invention provides a three-dimensional target perception method in a vehicle-mounted edge scene, which realizes three-dimensional target perception and tracking in a vehicle-mounted system by fusing a point cloud projection with a two-dimensional image. Under parallel-computing optimization of the algorithm, the method first performs filtering and segmentation on the point cloud data, then performs point cloud classification and feature value extraction, then projects the point cloud onto the two-dimensional image for clustering, and finally combines the data of the preceding and following frames to match information points and associate targets, thereby achieving matching and tracking. The method also addresses how to combine the point cloud returned by the laser radar with the image returned by the camera for target recognition, and how to deploy the system on a smaller terminal device. Applied to a vehicle-mounted device, the method achieves accurate recognition and tracking with good generalization and real-time performance.
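To make the point-cloud-to-image projection step concrete, the sketch below shows how lidar points can be projected into pixel coordinates using a calibrated extrinsic transform and camera intrinsics. This is a minimal illustration assuming standard pinhole-camera geometry; the names (`project_lidar_to_image`, `T_cam_lidar`, `K`) and the use of NumPy are assumptions for illustration, not the patent's actual implementation.

```python
# Hypothetical sketch of the lidar-to-image projection step; names and
# calibration values are placeholders, not the patent's parameters.
import numpy as np

def project_lidar_to_image(points_xyz, T_cam_lidar, K):
    """Project (N, 3) lidar points into pixel coordinates.

    points_xyz  : (N, 3) points in the lidar frame
    T_cam_lidar : (4, 4) extrinsic transform from lidar frame to camera frame
    K           : (3, 3) camera intrinsic matrix
    Returns (M, 2) pixel coordinates and the mask of points in front of the camera.
    """
    n = points_xyz.shape[0]
    homo = np.hstack([points_xyz, np.ones((n, 1))])   # homogeneous coordinates
    cam = (T_cam_lidar @ homo.T).T[:, :3]              # points in the camera frame
    in_front = cam[:, 2] > 0.0                         # keep points with positive depth
    cam = cam[in_front]
    pix = (K @ cam.T).T                                # pinhole projection
    pix = pix[:, :2] / pix[:, 2:3]                     # divide by depth
    return pix, in_front

if __name__ == "__main__":
    pts = np.array([[0.5, 0.2, 5.0], [-1.0, 0.1, 10.0]])  # toy points
    T = np.eye(4)                                          # placeholder extrinsics
    K = np.array([[700.0, 0.0, 640.0],
                  [0.0, 700.0, 360.0],
                  [0.0, 0.0, 1.0]])                        # placeholder intrinsics
    uv, mask = project_lidar_to_image(pts, T, K)
    print(uv)
```

Once each point has a pixel coordinate, it can be grouped with the image detections for the clustering step described above.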



Abstract

The invention discloses a three-dimensional target sensing method in a vehicle-mounted edge scene, which realizes three-dimensional target sensing and tracking in a vehicle-mounted system by fusing a point cloud projection with a two-dimensional image. Under parallel-computing optimization of the algorithm, the method performs filtering and segmentation on point cloud data, then performs point cloud classification and feature value extraction, projects the point cloud onto the two-dimensional image for clustering, and finally matches information points and associates targets by combining the data of the preceding and following frames, thereby achieving matching and tracking. The method simultaneously solves the problems of how to combine the point cloud returned by the laser radar with the image returned by the camera for target identification and how to deploy the system on a small terminal device. After the method is applied to a vehicle-mounted device, accurate identification and tracking are achieved, with good generalization and real-time performance.
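As a minimal illustration of the frame-to-frame association the abstract describes, the sketch below matches clustered targets between consecutive frames by centroid distance using Hungarian assignment. The function name, the distance gate, and the use of SciPy are assumptions made for illustration, not necessarily the patent's exact matching rule.

```python
# Hypothetical sketch of matching targets between the previous and current
# frames by centroid distance; the gating threshold is an assumed value.
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate_targets(prev_centroids, curr_centroids, max_dist=2.0):
    """Match previous-frame targets to current-frame targets.

    prev_centroids, curr_centroids : (P, 3) and (C, 3) cluster centres in metres
    Returns a list of (prev_idx, curr_idx) pairs whose distance is below max_dist.
    """
    if len(prev_centroids) == 0 or len(curr_centroids) == 0:
        return []
    # Pairwise Euclidean distances between old and new cluster centres
    cost = np.linalg.norm(prev_centroids[:, None, :] - curr_centroids[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)              # Hungarian assignment
    return [(int(r), int(c)) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
```

Matched pairs carry a target's identity forward from frame to frame, which is what allows the method to track targets rather than merely detect them per frame.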

Description

Technical Field

[0001] The invention relates to the field of vehicle-mounted intelligent identification and multi-sensor fusion, and in particular to a three-dimensional target perception method in a vehicle-mounted edge scene.

Background Technique

[0002] In recent years, with the continuous growth of car ownership, the road carrying capacity of many cities has reached full load, and traffic safety, travel efficiency, energy saving and emission reduction have become increasingly prominent issues. Intelligent and networked vehicles are generally regarded as an important way to solve these traffic problems.

[0003] As artificial intelligence and computer vision have matured, the demand for visual tasks such as object detection and object tracking has grown dramatically in many practical applications at the perception layer and in the sensors of the Internet of Vehicles architecture. At the same time, research on target detection technology of mult...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/246, G06T7/292, G06T7/143, G06T7/80, G06T1/20, G06K9/00, G06K9/32, G06K9/62
CPC: G06T7/248, G06T7/292, G06T7/143, G06T7/80, G06T1/20, G06T2207/10016, G06T2207/10028, G06T2207/20084, G06T2207/30252, G06F18/23, G06F18/24
Inventor: 黄泽茵, 钟卓柔, 余荣, 谭北海, 黄梓欣, 李贺, 全芷莹
Owner: GUANGDONG UNIV OF TECH