
3D target bounding box estimation system based on GIoU

A GIoU-based 3D bounding box technology, applied in computing, computer components, image data processing, etc.; it addresses the low accuracy of existing 3D target bounding box estimation and achieves improved calibration accuracy and high-accuracy bounding box estimation.

Publication Date: 2020-12-08 (Inactive)
HARBIN ENG UNIV

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the problem of low estimation accuracy of existing 3D target bounding boxes by providing a GIoU-based 3D target bounding box estimation system.



Examples


Embodiment 1

[0048] The technical scheme of the present invention comprises the following steps:

[0049] With reference to Figure 3, the overall process of the present invention is as follows:

[0050] The first step is to build a GIoU-based 3D target bounding box estimation device. It consists of a radar point cloud preprocessing module, a 2D image preprocessing module, and a GIoU-based multi-source fusion module. The radar point cloud preprocessing module is a PointNet neural network model, the 2D image preprocessing module is a ResNet50 neural network model, and the GIoU-based multi-source fusion module is a dense (fully connected) neural network model.
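
The following is a minimal PyTorch sketch of how the three modules described above could be composed; the class name GIoUBoxEstimator, the feature dimensions, and the 7-parameter box output (center, size, yaw) are illustrative assumptions rather than details given in the patent. The two backbones are passed in as arguments (candidate sketches for them follow the next two paragraphs).

```python
# Hedged sketch: composition of the three modules, assuming PyTorch.
# The class name, feature dimensions, and 7-number box encoding are
# assumptions for illustration only.
import torch
import torch.nn as nn

class GIoUBoxEstimator(nn.Module):
    def __init__(self, point_backbone: nn.Module, image_backbone: nn.Module,
                 point_dim: int = 1024, image_dim: int = 2048):
        super().__init__()
        self.point_backbone = point_backbone  # radar point cloud preprocessing module
        self.image_backbone = image_backbone  # 2D image preprocessing module
        # GIoU-based multi-source fusion module: a dense (fully connected) head.
        self.fusion_head = nn.Sequential(
            nn.Linear(point_dim + image_dim, 512), nn.ReLU(inplace=True),
            nn.Linear(512, 128), nn.ReLU(inplace=True),
            nn.Linear(128, 7),  # assumed box encoding: (x, y, z, w, l, h, yaw)
        )

    def forward(self, points: torch.Tensor, image: torch.Tensor) -> torch.Tensor:
        point_feat = self.point_backbone(points)  # (B, point_dim)
        image_feat = self.image_backbone(image)   # (B, image_dim)
        fused = torch.cat([point_feat, image_feat], dim=1)
        return self.fusion_head(fused)            # estimated 3D bounding box parameters
```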

[0051] The radar point cloud preprocessing module can convert point cloud data into a fixed-dimensional digital feature representation.
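
As a hedged illustration of how such a fixed-dimensional point cloud feature can be obtained in the PointNet style, the sketch below applies a shared per-point MLP followed by max pooling; the layer sizes and the input shape (B, N, 3) are assumptions, not values from the patent.

```python
# Hedged PointNet-style feature extractor: shared per-point MLP + max pooling.
import torch
import torch.nn as nn

class PointNetFeature(nn.Module):
    def __init__(self, out_dim: int = 1024):
        super().__init__()
        # Shared per-point MLP implemented with 1x1 1D convolutions.
        self.mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(inplace=True),
            nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU(inplace=True),
            nn.Conv1d(128, out_dim, 1), nn.BatchNorm1d(out_dim), nn.ReLU(inplace=True),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (B, N, 3) -> (B, 3, N) so Conv1d processes each point independently.
        x = self.mlp(points.transpose(1, 2))
        # Symmetric max pooling over the N points yields a fixed-dimensional
        # feature regardless of how many points the cloud contains.
        return torch.max(x, dim=2).values  # (B, out_dim)
```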

[0052] The 2D image preprocessing module converts 2D image data into a fixed-dimensional digital feature representation.
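
Similarly, a hedged sketch of a ResNet50-based image feature extractor is given below, using torchvision and dropping the final classification layer; whether pretrained weights are used is an assumption left open here.

```python
# Hedged ResNet50 feature extractor: keep the backbone up to global average
# pooling, drop the classifier, and obtain a fixed 2048-dimensional feature.
import torch
import torch.nn as nn
from torchvision.models import resnet50

class ResNet50Feature(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = resnet50(weights=None)  # pretrained weights could be used instead
        self.features = nn.Sequential(*list(backbone.children())[:-1])

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        # image: (B, 3, H, W) -> (B, 2048)
        x = self.features(image)
        return torch.flatten(x, 1)
```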

[0053] The GIoU-based multi-source fusion module can fuse the digital features of the point cloud data and the 2D image data and output the estimation result of the 3D target bounding box.
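
Since the fusion module is built around GIoU, the sketch below shows one common way GIoU can be computed for 3D boxes, assuming axis-aligned boxes given as (xmin, ymin, zmin, xmax, ymax, zmax); this parameterization is an illustrative assumption, as the excerpt does not spell it out, and rotated boxes would need a more elaborate overlap computation. The training loss is then typically 1 - GIoU.

```python
# Hedged GIoU for axis-aligned 3D boxes: GIoU = IoU - |C \ (A ∪ B)| / |C|,
# where C is the smallest axis-aligned box enclosing both A and B.
import torch

def giou_3d(box_a: torch.Tensor, box_b: torch.Tensor) -> torch.Tensor:
    """box_a, box_b: (..., 6) tensors of (xmin, ymin, zmin, xmax, ymax, zmax)."""
    min_a, max_a = box_a[..., :3], box_a[..., 3:]
    min_b, max_b = box_b[..., :3], box_b[..., 3:]

    # Intersection volume.
    inter_min = torch.max(min_a, min_b)
    inter_max = torch.min(max_a, max_b)
    inter = (inter_max - inter_min).clamp(min=0).prod(dim=-1)

    # Union volume.
    vol_a = (max_a - min_a).clamp(min=0).prod(dim=-1)
    vol_b = (max_b - min_b).clamp(min=0).prod(dim=-1)
    union = vol_a + vol_b - inter
    iou = inter / union.clamp(min=1e-7)

    # Volume of the smallest enclosing axis-aligned box C.
    enc = (torch.max(max_a, max_b) - torch.min(min_a, min_b)).clamp(min=0).prod(dim=-1)

    return iou - (enc - union) / enc.clamp(min=1e-7)
```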



Abstract

The invention belongs to the technical field of computer vision and pattern recognition, and particularly relates to a 3D target bounding box estimation system based on GIoU, which is composed of a radar point cloud preprocessing module, a 2D image preprocessing module and a GIoU-based multi-source fusion module. Point cloud features are acquired by the radar point cloud preprocessing module, image features are acquired through the 2D image preprocessing module, then the point cloud features and image features are fused through the GIoU-based multi-source fusion module, and finally an estimation result of the 3D target bounding box is output. The invention solves the problem of low estimation accuracy of existing 3D target bounding boxes: calibration accuracy of a 3D target can be obviously improved, and a high-accuracy 3D target bounding box estimation effect is achieved.

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and pattern recognition, and in particular relates to a GIoU-based 3D object bounding box estimation system.

Background technique

[0002] In recent years, unmanned driving has gradually attracted the attention of major companies, scholars and even the general public. At present, there are two completely different routes to realizing unmanned driving. One is the gradual approach adopted by traditional enterprises: starting from existing assisted driving systems, functions such as automatic steering and active collision avoidance are added step by step to achieve conditional unmanned driving, and full unmanned driving is finally reached once cost and the related technologies meet certain requirements. The other is the "one-step" approach represented by high-tech IT companies, which aims directly at the ultimate goal of unmanned driving, that is, without the so-called human-machine cooperati...


Application Information

Patent Type & Authority: Application (China)
IPC (IPC-8): G06T7/13; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/13; G06N3/08; G06T2207/10044; G06T2207/10028; G06T2207/20081; G06T2207/20084; G06N3/045; G06F18/253
Inventor: 杨武孟涟肖唐盖盖苘大鹏吕继光
Owner: HARBIN ENG UNIV