Monocular perception correction method and device based on sparse point cloud and storage medium

A sparse point cloud and monocular camera technology, applied in the field of monocular perception correction methods, devices and storage media based on sparse point clouds. It addresses the problems of high algorithm complexity and large deviation in target depth estimation, achieving low algorithm complexity, improved accuracy, and a better detection effect.

Active Publication Date: 2020-08-25
NINGBO GEELY AUTOMOBILE RES & DEV CO LTD +1
Cites: 9 · Cited by: 19

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a monocular perception correction method based on a sparse point cloud, which solves the problems of high algorithm complexity and large target depth estimation deviation that arise in the prior art when correcting the depth information of a monocular camera.




Embodiment Construction

[0030] In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0031] Intelligent perception is an important part of automatic driving, and measuring the distance to obstacles is one of the basic tasks of 3D perception. 3D object detection methods based on sparse point clouds lose a great deal of contour and detail information because of the sparsity of the point cloud, so their detection performance is poor, and the...



Abstract

The invention discloses a monocular perception correction method and device based on sparse point cloud, and a storage medium. The method comprises the steps of: collecting the original camera data of a monocular camera and the original sparse point cloud data of a radar sensor; processing the original camera data to obtain a three-dimensional detection result for a plurality of targets in an image plane, the three-dimensional detection result comprising a target depth value and a two-dimensional bounding box; obtaining a conversion matrix; mapping the original sparse point cloud data to the corresponding position of the image plane based on the conversion matrix to obtain a point cloud projection depth map, and setting a point cloud frame for each two-dimensional bounding box in the point cloud projection depth map, the point cloud projection depth map comprising a plurality of projection points corresponding to the original sparse point cloud data, each projection point comprising a point cloud depth value; and correcting the target depth values of the plurality of targets based on the point cloud depth values of the projection points contained in all the point cloud frames. By designing the point cloud frame features, the method improves the accuracy of correcting the target depth value.
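The pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch only: the function names, the intrinsic matrix `K`, the extrinsic conversion matrix `T_cam_from_lidar`, and the median-based correction rule are assumptions for illustration, not the patent's exact method.

```python
import numpy as np

def project_points(points_lidar, T_cam_from_lidar, K):
    """Map sparse radar/lidar points into the image plane via the conversion
    matrix, keeping each projected point's depth (the point cloud depth value)."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])  # N x 4
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]                     # N x 3
    in_front = pts_cam[:, 2] > 0          # keep only points in front of the camera
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]           # pixel coordinates of projection points
    depth = pts_cam[:, 2]                 # point cloud depth values
    return uv, depth

def correct_depth(bbox, est_depth, uv, depth):
    """Correct the monocular target depth estimate using the projection points
    that fall inside the target's 2D box; the median is an assumed fusion rule."""
    x1, y1, x2, y2 = bbox
    mask = ((uv[:, 0] >= x1) & (uv[:, 0] <= x2) &
            (uv[:, 1] >= y1) & (uv[:, 1] <= y2))
    if mask.sum() == 0:
        return est_depth                  # no points in the box: keep the estimate
    return float(np.median(depth[mask]))
```

A detector's depth estimate for a box would then be replaced by the median depth of the sparse points projected into that box, which is cheap (one matrix multiply plus a masked median per target) and consistent with the abstract's claim of low algorithm complexity.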

Description

Technical field

[0001] The present invention relates to the field of automatic driving, and in particular to a monocular perception correction method, device and storage medium based on sparse point clouds.

Background technique

[0002] Intelligent perception is an important part of autonomous driving and the link between the vehicle and its environment. At present, mainstream sensing sensors include cameras, millimeter-wave radar and lidar, but multi-beam lidar is very expensive and unsuitable for mass production, while the point clouds obtained by low-beam lidar and millimeter-wave radar are too sparse to be used directly for three-dimensional obstacle perception. Compared with binocular cameras, monocular cameras are relatively cheap sensors and have unique advantages in obstacle detection and tracking; however, monocular cameras are limited in depth information perception.

[0003] In the prior art, the scheme of fusion of monocular camera ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G08G1/04, G06T7/50
CPC: G08G1/04, G06T7/50, G06T2207/10028
Inventor: 严鑫
Owner: NINGBO GEELY AUTOMOBILE RES & DEV CO LTD