
Target tracking method, computing device and medium for image and laser point cloud fusion

A target tracking technology fusing image and laser point cloud data, applied in the field of object tracking. It addresses the problems that depth estimation from images has low accuracy and high computational cost, and that 3D spatial position information cannot be generated directly from 2D images, achieving the effect of accurately tracking objects.

Active Publication Date: 2022-03-11
CHANGCHUN YIHANG INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

In 2D space, the image captured by the camera provides dense RGB information, but a tracked object in the 2D image lacks depth information, so its 3D position cannot be generated directly.
In addition, deep learning methods can be used to estimate depth from the image, but the estimates have low accuracy, consume considerable computing resources, and cannot meet the real-time requirements of autonomous driving.
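The depth ambiguity described above can be illustrated with a minimal pinhole-projection sketch. The intrinsics `fx, fy, cx, cy` below are assumed illustrative values, not parameters from the patent:

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics (assumed values, not from the patent).
fx, fy, cx, cy = 800.0, 800.0, 640.0, 360.0

def project_to_image(point_3d):
    """Project a 3D point in camera coordinates onto the 2D image plane.

    The depth Z is consumed by the projection: many 3D points map to the
    same pixel, which is why a pixel alone cannot recover a 3D position.
    """
    X, Y, Z = point_3d
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v

# Two points at different depths along the same viewing ray...
near = np.array([1.0, 0.5, 10.0])
far = near * 3.0  # same direction, three times the depth

# ...land on the same pixel, so depth information is lost.
print(project_to_image(near))  # (720.0, 400.0)
print(project_to_image(far))   # (720.0, 400.0)
```

This is why the method falls back on lidar for 3D observations and uses the image only to construct pseudo-observations when the point cloud is too sparse.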

Method used




Embodiment Construction

[0045] To enable those skilled in the art to better understand the present invention, it is described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0046] Figure 1 shows the overall flow of a target tracking method according to an embodiment of the present invention, which tracks targets around a vehicle based on 2D image data obtained by an on-vehicle image sensor and 3D laser point cloud data obtained by a laser sensor, and generates 3D observations from the 2D images during the tracking process.

[0047] The specific scenario is a vehicle equipped with a lidar and a camera that acquire 3D laser point cloud and 2D image data. Ideally, at any moment the 3D laser point cloud data are sufficient for the target to be detected and tracked: the observed value can be obtained from the point cloud, the predicted value can be obtained with the tracking technique, and the optimal estimate c...
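The predict step that produces the value at time K+1 from the optimal estimate at time K can be sketched with a standard Kalman-style predict under a constant-velocity motion model. The patent does not specify the motion model or noise parameters; constant velocity, `dt`, and `q` below are assumptions for illustration:

```python
import numpy as np

def predict(x_opt, P_opt, dt=0.1, q=0.01):
    """One Kalman predict step under an assumed constant-velocity model.

    x_opt: optimal estimate at time K, state [px, py, pz, vx, vy, vz]
    P_opt: its 6x6 covariance matrix
    Returns the predicted state and covariance at time K+1.
    """
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)  # position += velocity * dt
    Q = q * np.eye(6)           # simple isotropic process noise (assumption)
    x_pred = F @ x_opt
    P_pred = F @ P_opt @ F.T + Q
    return x_pred, P_pred

# Target 10 m ahead, moving at 5 m/s along x.
x_k = np.array([10.0, 2.0, 0.0, 5.0, 0.0, 0.0])
x_k1, P_k1 = predict(x_k, np.eye(6))
print(x_k1[:3])  # position advances to [10.5, 2.0, 0.0]
```

When a lidar observation is available at K+1, the usual update step would fuse it with this prediction to form the next optimal estimate.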



Abstract

A target tracking method, a computing device, and a computer-readable medium for tracking targets around a vehicle based on 2D image data and 3D laser point cloud data. The target tracking method includes: obtaining an optimal estimate of the tracked target in 3D space at time K; using the optimal estimate at time K to obtain the predicted value at time K+1 in 3D space; judging whether the observed value at time K+1 can be detected from the 3D laser point cloud data at time K+1; when it is judged that it cannot, using the optimal estimate at time K to project the tracked target into the 2D image space and computing the time-K image features of the corresponding target projection area; and determining a pseudo-observed value at time K+1 from the optimal estimate at time K, the predicted value at time K+1, and the time-K image features. This solves the problem of losing the tracked object when the laser point cloud becomes sparse or disappears at medium and long distances during tracking, enables accurate real-time tracking, and plays a prominent role in accurate object tracking in the field of autonomous driving.
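The branching logic of the abstract, a real lidar observation when the point cloud supports detection and a pseudo-observation otherwise, can be sketched as follows. The function names, inputs, and the 0.5/0.5 blending weights are illustrative assumptions, not the patent's actual implementation:

```python
def next_observation(cloud_detection, predicted_3d, image_match_3d):
    """Choose the K+1 observation for the tracker.

    cloud_detection: 3D position from the lidar detector at K+1, or None when
        the point cloud is too sparse (the failure case the patent addresses).
    predicted_3d:    predicted target position at K+1 from the motion model.
    image_match_3d:  3D position recovered by projecting the time-K estimate
        into the image and matching time-K image features (placeholder input).
    Returns (observation, is_pseudo).
    """
    if cloud_detection is not None:
        return cloud_detection, False  # normal lidar observation
    # Pseudo-observation: blend the prediction with the image-feature match.
    # Equal weights are an illustrative assumption, not from the patent.
    pseudo = [0.5 * p + 0.5 * m for p, m in zip(predicted_3d, image_match_3d)]
    return pseudo, True

obs, is_pseudo = next_observation(None, [10.5, 2.0, 0.0], [10.3, 2.1, 0.0])
print(obs, is_pseudo)  # pseudo-observation midway between prediction and match
```

The pseudo-observation keeps the filter updated through sparse-point-cloud frames, so the track is not lost while the object is at medium or long range.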

Description

Technical Field [0001] The present invention relates generally to the technical field of autonomous-driving environment perception, and in particular to an object tracking method fusing images and laser point clouds. Background Technique [0002] Object tracking is an important research topic in automatic driving, video surveillance, and human-computer interaction. In the field of autonomous driving, object tracking continuously obtains information about the various objects (vehicles, pedestrians, bicycles, motorcycles, animals, etc.) around the ego vehicle and associates the same object across different frames. Because the driving environment is complex and changeable, the tracking process usually requires fusing data from multiple sensors; using lidar and cameras is currently the more practical choice. However, for lidar, as distance increases the laser points reflected by an object become very sparse, and there may even be no laser point cl...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Patent (China)
IPC(8): G06V20/56; G06V20/40; G06V10/80; G06K9/62
CPC: G06V20/48; G06V20/42; G06V20/56; G06F18/25
Inventor: 董铮, 李雪, 范圣印
Owner: CHANGCHUN YIHANG INTELLIGENT TECH CO LTD