
3D target detection method and device, electronic equipment and medium

A 3D target detection technology, applied in the field of target detection, which solves the problem of low 3D target detection accuracy and achieves the effects of improving loss-calculation accuracy, detection accuracy, and robustness.

Pending Publication Date: 2022-05-06
INSPUR SUZHOU INTELLIGENT TECH CO LTD
Cites: 0 · Cited by: 8

AI Technical Summary

Problems solved by technology

However, the 3D target detection accuracy of current algorithms is low.

Method used



Examples


Embodiment 1

[0049] This application provides a 3D object detection method which, referring to Figure 1, includes:

[0050] S1. Acquire image data and laser point cloud data in the scene.

[0051] S2. Perform feature processing on the image data to generate bird's-eye view features.

[0052] S3. Perform feature processing on the laser point cloud data to generate laser point cloud features.

[0053] S4. Fuse the bird's-eye view features and the laser point cloud features to obtain fused features.

[0054] S5. From the fused features, extract time-series features through a time-series neural network and perform feature decoding to obtain a 3D target frame.

[0055] S6. Perform loss calculation on the 3D target frame, where the loss calculation at least includes overlap loss calculation.
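The patent states only that the loss "at least includes overlap loss calculation" without giving a formula. A common realization of an overlap loss is 1 − IoU between the predicted and ground-truth boxes; the sketch below illustrates this for axis-aligned 3D boxes. The box parameterization (min/max corners) is an assumption for illustration, not the patent's method (real 3D detectors typically handle rotated boxes).

```python
# Illustrative sketch, NOT the patent's exact loss: overlap loss as
# 1 - IoU for axis-aligned 3D boxes (x_min, y_min, z_min, x_max, y_max, z_max).

def iou_3d(box_a, box_b):
    """Intersection-over-union of two axis-aligned 3D boxes."""
    # Intersection extent along each axis, clamped at zero when disjoint.
    dx = max(0.0, min(box_a[3], box_b[3]) - max(box_a[0], box_b[0]))
    dy = max(0.0, min(box_a[4], box_b[4]) - max(box_a[1], box_b[1]))
    dz = max(0.0, min(box_a[5], box_b[5]) - max(box_a[2], box_b[2]))
    inter = dx * dy * dz

    def vol(b):
        return (b[3] - b[0]) * (b[4] - b[1]) * (b[5] - b[2])

    union = vol(box_a) + vol(box_b) - inter
    return inter / union if union > 0 else 0.0

def overlap_loss(pred_box, gt_box):
    """Overlap loss = 1 - IoU; zero for a perfect prediction."""
    return 1.0 - iou_3d(pred_box, gt_box)

unit = (0, 0, 0, 1, 1, 1)
print(overlap_loss(unit, unit))                # 0.0 (identical boxes)
print(overlap_loss(unit, (2, 2, 2, 3, 3, 3)))  # 1.0 (disjoint boxes)
```

In practice such a loss is combined with regression and classification terms, which is consistent with the patent's "at least includes" wording.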

[0056] Specifically, the above steps are mainly based on the overall architecture of the 3D target detection method for autonomous driv...

Embodiment 2

[0080] Corresponding to the above-mentioned embodiments, the present application also provides a 3D object detection device; referring to Figure 4, the device includes: a data acquisition module, an image processing module, a laser point cloud processing module, a feature fusion module, a time-series module, and a loss calculation module.

[0081] Among them, the data acquisition module is used to acquire image data and laser point cloud data in the scene; the image processing module is used to perform feature processing on the image data to generate bird's-eye view features; the laser point cloud processing module is used to perform feature processing on the laser point cloud data to generate laser point cloud features; the feature fusion module is used to fuse the bird's-eye view features and the laser point cloud features to obtain fused features; and the time-series module is used to extract time-series features through a time-series neural...
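The module decomposition above can be sketched as plain classes wired together in the order of steps S1-S5. All class and method names here are illustrative assumptions, and the feature/fusion bodies are placeholders standing in for the patent's neural networks.

```python
# Hedged sketch of the device's module decomposition; names are assumed,
# and each module body is a placeholder for the real processing network.

class DataAcquisitionModule:
    def acquire(self, scene):
        # S1: return (image_data, point_cloud_data) for the scene.
        return scene["image"], scene["point_cloud"]

class ImageProcessingModule:
    def to_bev_features(self, image_data):
        # S2 placeholder: project image features into bird's-eye view.
        return {"bev": image_data}

class PointCloudModule:
    def to_point_features(self, point_cloud_data):
        # S3 placeholder: encode the laser point cloud.
        return {"pc": point_cloud_data}

class FeatureFusionModule:
    def fuse(self, bev_feat, pc_feat):
        # S4 placeholder: merge the two feature sets.
        return {**bev_feat, **pc_feat}

class TimeSeriesModule:
    def decode(self, fused_feat):
        # S5 placeholder: temporal extraction + decoding to a 3D box.
        return ("3d_box", fused_feat)

class Detector3D:
    """Wires the modules in the order of steps S1-S5 of Embodiment 1."""

    def __init__(self):
        self.acq = DataAcquisitionModule()
        self.img = ImageProcessingModule()
        self.pc = PointCloudModule()
        self.fusion = FeatureFusionModule()
        self.temporal = TimeSeriesModule()

    def detect(self, scene):
        image, cloud = self.acq.acquire(scene)
        fused = self.fusion.fuse(self.img.to_bev_features(image),
                                 self.pc.to_point_features(cloud))
        return self.temporal.decode(fused)

scene = {"image": "img0", "point_cloud": "pc0"}
print(Detector3D().detect(scene))  # ('3d_box', {'bev': 'img0', 'pc': 'pc0'})
```

The loss calculation module (step S6) is used only during training, so it is omitted from this inference-path sketch.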

Embodiment 3

[0089] Corresponding to the above-mentioned embodiments, the present application also provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the program, the above-mentioned 3D object detection method is implemented.

[0090] As shown in Figure 5, in some embodiments the system can serve as the electronic device for the 3D object detection method in any of the above-mentioned embodiments. In some embodiments, the system may include one or more computer-readable media (e.g., system memory or NVM / storage devices) holding instructions, and one or more processors coupled to the one or more computer-readable media and configured to execute the instructions so as to implement a module that performs the actions described in this application.

[0091] For one embodiment, the system control module may include any suitable ...



Abstract

The invention discloses a 3D target detection method and device, electronic equipment, and a medium, and relates to the technical field of target detection. The method comprises the following steps: acquiring image data and laser point cloud data in a scene; performing feature processing on the image data to generate bird's-eye view features; performing feature processing on the laser point cloud data to generate laser point cloud features; fusing the bird's-eye view features and the laser point cloud features to obtain fused features; extracting time-series features from the fused features through a time-series neural network and performing feature decoding to obtain a 3D target frame; and performing loss calculation on the 3D target frame, wherein the loss calculation at least comprises overlap loss calculation. According to the invention, a multi-modal 3D target detection algorithm for automatic driving can be improved, and 3D target detection precision is greatly increased.

Description

Technical field

[0001] The present application relates to the technical field of target detection, and in particular to a 3D target detection method, device, electronic equipment, and medium.

Background technique

[0002] With the development of technology, autonomous driving is advancing rapidly. However, achieving fully autonomous driving remains a daunting task due to the complex and dynamic driving environment. To understand the driving environment around the vehicle, self-driving cars need to be equipped with a set of sensors for powerful and accurate environment perception. The set of sensor devices and their associated processing algorithms is called a perception system. The perception system takes data from a set of sensors as input and, after a series of processing, outputs information about the environment, other surrounding objects (such as cars and pedestrians), and the self-driving car itself.

[0003] Sensors on self-driving cars usually include...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50, G06T9/00, G06N3/08, G06T3/40
CPC: G06T5/50, G06T9/002, G06T3/4038, G06N3/08, G06T2207/20221, G06T2207/20081, G06T2207/20084, G06T2207/10028, Y02T10/40
Inventor: 龚湛
Owner: INSPUR SUZHOU INTELLIGENT TECH CO LTD