
Image and laser point cloud fused target tracking method, computing device and medium

A target tracking technology that fuses images and laser point cloud data, applied in the field of object tracking. It addresses the problems that 2D images cannot directly yield 3D spatial position information, that depth estimation from images has low accuracy, and that such estimation consumes substantial computing performance, achieving the effect of accurately tracking objects.

Active Publication Date: 2019-11-19
CHANGCHUN YIHANG INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

In 2D space, the image captured by the camera contains dense RGB information, but a tracked object in 2D space lacks depth information, so 3D spatial position information cannot be generated directly.
Deep learning methods can be used to estimate image depth, but the accuracy of such depth estimation is low, it consumes substantial computing performance, and it cannot meet the real-time requirements of autonomous driving.




Detailed Description of the Embodiments

[0045] In order to enable those skilled in the art to better understand the present invention, the present invention is further described in detail below with reference to the accompanying drawings and specific embodiments.

[0046] Figure 1 shows the overall flow of a target tracking method according to an embodiment of the present invention, which tracks targets around a vehicle based on 2D image data obtained by an on-vehicle image sensor and 3D laser point cloud data obtained by an on-vehicle laser sensor, and which generates 3D observations from 2D images during the tracking process.

[0047] The specific scenario is a vehicle equipped with a lidar and a camera that acquire 3D laser point cloud data and 2D image data. Ideally, at any moment the 3D laser point cloud data is sufficient, so the target can be detected and tracked from it; that is, an observed value can be obtained, the tracking technique can produce a predicted value, and the optimal estimate c...
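The predict/observe/estimate cycle described above can be sketched with a standard constant-velocity Kalman filter. This is a minimal illustration, not the patent's disclosed filter: the state layout, time step, and noise matrices below are all assumptions.

```python
import numpy as np

# Hypothetical constant-velocity Kalman tracker. State: [x, y, z, vx, vy, vz].
dt = 0.1                                       # assumed frame interval (s)
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)                     # position += velocity * dt
H = np.hstack([np.eye(3), np.zeros((3, 3))])   # lidar observes position only
Q = 0.01 * np.eye(6)                           # process noise (assumed)
R = 0.1 * np.eye(3)                            # observation noise (assumed)

def predict(x, P):
    """Predicted value at moment K+1 from the optimal estimate at moment K."""
    return F @ x, F @ P @ F.T + Q

def update(x_pred, P_pred, z):
    """Optimal estimate at K+1 once a 3D observation z is available."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(6) - K @ H) @ P_pred
    return x, P

x = np.array([10.0, 2.0, 0.5, 1.0, 0.0, 0.0])  # estimate at moment K
P = np.eye(6)
x_pred, P_pred = predict(x, P)                 # prediction for K+1
x_new, P_new = update(x_pred, P_pred, np.array([10.12, 2.01, 0.5]))
```

The optimal estimate lands between the prediction and the observation, weighted by their relative uncertainties; when the lidar observation is missing, the pseudo-observation step described in the abstract supplies `z` instead.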



Abstract

The invention discloses a target tracking method, a computing device, and a computer-readable medium for tracking a target around a vehicle based on 2D image data and 3D laser point cloud data. The target tracking method comprises the steps of: obtaining an optimal estimate of the tracked target at moment K in 3D space; obtaining a predicted value for moment K+1 in 3D space using the optimal estimate at moment K; judging whether an observation value for moment K+1 can be detected from the 3D laser point cloud data at moment K+1; if not, projecting the tracked target into 2D image space using the optimal estimate at moment K, and computing the moment-K image features of the corresponding target projection area in 2D image space; and determining a pseudo observation value for moment K+1 using the optimal estimate at moment K, the predicted value for moment K+1, and the image features at moment K. This solves the problem of losing the tracked object when laser point clouds become sparse or disappear at medium and long distances during tracking, allows the object to be tracked accurately in real time, and is particularly effective for accurate object tracking in the field of autonomous driving.
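The pseudo-observation fallback in the abstract can be sketched as an image-feature search around the predicted position. This is a hedged illustration only: the patent does not disclose its matching strategy, and every name below (`project`, `extract`, the grid search, the feature distance) is an assumption; the patented step also feeds the moment-K estimate into this computation, which this simplification omits.

```python
import numpy as np

def pseudo_observation(pred_k1, feat_k, image_k1, project, extract):
    """Build a pseudo 3D observation for moment K+1 when lidar detects nothing.

    pred_k1  : predicted 3D position at moment K+1 (length-3 array)
    feat_k   : image features of the target's projection saved at moment K
    image_k1 : 2D image at moment K+1
    project  : callback, 3D point -> 2D image region (camera model, assumed)
    extract  : callback, (image, region) -> feature vector (assumed)
    """
    # Search small lateral offsets around the predicted position; keep the
    # candidate whose projected image features best match those from moment K.
    best, best_dist = pred_k1, np.inf
    for dx in np.linspace(-0.5, 0.5, 5):
        for dy in np.linspace(-0.5, 0.5, 5):
            cand = pred_k1 + np.array([dx, dy, 0.0])
            dist = np.linalg.norm(extract(image_k1, project(cand)) - feat_k)
            if dist < best_dist:
                best, best_dist = cand, dist
    return best  # pseudo observation fed back into the filter update
```

In use, the returned pseudo observation would replace the missing lidar observation in the filter's update step, keeping the track alive until real laser points reappear.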

Description

Technical Field

[0001] The present invention relates generally to the technical field of autonomous driving environment perception, and in particular to an object tracking method fusing images and laser point clouds.

Background Technique

[0002] Object tracking is an important research topic in autonomous driving, video surveillance, and human-computer interaction. In autonomous driving, object tracking continuously obtains information about the various objects around the vehicle (vehicles, pedestrians, bicycles, motorcycles, animals, etc.) and associates the same object across different frames. Because the driving environment is complex and changeable, the tracking process usually requires fusing data from multiple sensors, and using lidar together with cameras is currently the more practical choice. However, for lidar, as distance increases, the laser points reflected by an object become very sparse, and there may even be no laser point cl...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V20/48, G06V20/42, G06V20/56, G06F18/25
Inventor: 董铮, 李雪, 范圣印
Owner CHANGCHUN YIHANG INTELLIGENT TECH CO LTD