
Monocular perception correction method, device and storage medium based on sparse point cloud

A monocular perception correction method, device and storage medium based on sparse point clouds, in the field of autonomous driving perception. The invention solves the problems of high algorithm complexity and large target-depth estimation deviation, achieving low algorithm complexity, improved performance, and reduced estimation bias.

Active Publication Date: 2022-07-12
NINGBO GEELY AUTOMOBILE RES & DEV CO LTD +1

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a monocular perception correction method based on sparse point clouds, which solves the problems of high algorithm complexity and large target-depth estimation deviation that arise when correcting the depth information of a monocular camera in the prior art.




Embodiment Construction

[0030] In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the present invention.

[0031] Intelligent perception is an important part of automatic driving, and measuring obstacle distance is one of the basic tasks of 3D perception. 3D target detection methods based on sparse point clouds lose much outline and detail information because of the sparseness of the point cloud, and the detection effect is relatively ...



Abstract

The invention discloses a monocular perception correction method, device and storage medium based on sparse point clouds. The method includes: collecting original camera data from a monocular camera and original sparse point cloud data from a radar sensor; processing the original camera data to obtain 3D detection results for multiple targets in the image plane, each 3D detection result including a target depth value and a 2D bounding box; obtaining a transformation matrix; mapping, based on the transformation matrix, the original sparse point cloud data to the corresponding positions of the image plane to obtain a point cloud projection depth map, which contains multiple projection points corresponding to the original sparse point cloud data, each projection point carrying a point cloud depth value; setting a point cloud frame for each two-dimensional bounding box in the point cloud projection depth map; and correcting the target depth values of the multiple targets based on the point cloud depth values of the projection points contained in all point cloud frames. By designing the point cloud frame characteristics, the invention improves the accuracy of target depth value correction.
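The project-then-correct pipeline in the abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the patented algorithm: the function names, the 4x4 LiDAR-to-camera transformation plus intrinsic matrix, and the use of a median over in-box projected depths are all hypothetical choices for demonstration; the patent's specific "point cloud frame" design is not reproduced here.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Map sparse 3-D points (N, 3) onto the image plane.

    T_cam_lidar: assumed 4x4 sensor-to-camera transformation matrix.
    K: assumed 3x3 camera intrinsic matrix.
    Returns pixel coordinates (M, 2) and camera-frame depths (M,)
    for the points lying in front of the camera.
    """
    homo = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    cam = (T_cam_lidar @ homo.T).T[:, :3]   # points in the camera frame
    cam = cam[cam[:, 2] > 0]                # keep points ahead of the camera
    pix = (K @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]          # perspective division
    return pix, cam[:, 2]

def correct_depths(detections, pix, depths):
    """For each detection {'bbox': (x1, y1, x2, y2), 'depth': d}, replace
    the monocular depth estimate with a robust statistic (here the median,
    an illustrative choice) of the projected point-cloud depths that fall
    inside the bounding box; keep the estimate if no points land in it."""
    corrected = []
    for det in detections:
        x1, y1, x2, y2 = det['bbox']
        inside = ((pix[:, 0] >= x1) & (pix[:, 0] <= x2) &
                  (pix[:, 1] >= y1) & (pix[:, 1] <= y2))
        if inside.any():
            corrected.append(float(np.median(depths[inside])))
        else:
            corrected.append(det['depth'])
    return corrected
```

With an identity extrinsic and a toy intrinsic matrix, a point at (0, 0, 10) projects to the principal point, and a detection whose box contains that projection has its depth snapped from the monocular estimate to the point cloud value, while a box containing no points keeps its original estimate.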

Description

Technical field

[0001] The invention relates to the field of automatic driving, and in particular to a method, device and storage medium for monocular perception correction based on sparse point clouds.

Background technique

[0002] Intelligent perception is an important part of autonomous driving and the link between the vehicle and its environment. The mainstream perception sensors currently include cameras, millimeter-wave radars, and lidars. However, multi-beam lidars are very expensive and unsuitable for mass production, while the point clouds obtained by low-beam lidars and millimeter-wave radars are too sparse to be used directly for 3D obstacle perception. Compared with binocular cameras, monocular cameras are relatively cheap sensors and have unique advantages in obstacle detection and tracking, but they are limited in depth information perception.

[0003] In the prior art, the scheme of fusing monocular camera data and sparse...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): G08G1/04; G06T7/50
CPC: G08G1/04; G06T7/50; G06T2207/10028
Inventor: 严鑫
Owner: NINGBO GEELY AUTOMOBILE RES & DEV CO LTD