Multi-scale three-dimensional target detection method based on feature pyramid network

A feature-pyramid-based detection technology, applied in the field of computer vision

Active Publication Date: 2020-10-27
SICHUAN UNIV

Problems solved by technology

[0004] The technical problem to be solved by the present invention is, on the basis of the feature pyramid network, to fuse radar point-cloud and RGB image features so as to improve the detection rate and accuracy of partially occluded targets and long-distance small targets in complex scenes.



Embodiment Construction

[0015] The implementation of the method is described in further detail below:

[0016] 1. Convert the point cloud from a voxel grid with 0.1 m resolution to a six-channel bird's-eye view. First, filter the point cloud: following the definition of the point-cloud coordinate system on the KITTI benchmark, only points within [0, 70] × [-40, 40] × [0, 2.5] on the three axes are considered. The grid is then evenly divided into 5 slices along the Z axis, corresponding to the first five channels of the bird's-eye view, and each cell is encoded with the maximum height of all points falling in that cell on its slice. The sixth channel represents the point-density information of each cell in the XY plane over the whole point cloud, computed from N, the number of points in the cell. A bird's-eye view with dimensions (800, 700, 6) is thus obtained. By representing the 3D point cloud as a regular bird's-eye view, mature image feature extractors can be used directly to obtain effective...
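The encoding step above can be sketched as follows. Grid resolution, axis ranges, slice count, and output dimensions come from the text; since the patent text omits the density formula, the log-normalized density used by MV3D/AVOD-style BEV encoders is assumed here, and is labeled as such in the code.

```python
import numpy as np

X_RANGE = (0.0, 70.0)    # forward axis, metres (from the text)
Y_RANGE = (-40.0, 40.0)  # lateral axis
Z_RANGE = (0.0, 2.5)     # height axis
RES = 0.1                # 0.1 m per grid cell
N_SLICES = 5             # five height slices -> five height channels

def point_cloud_to_bev(points: np.ndarray) -> np.ndarray:
    """points: (N, 3) array of XYZ lidar points. Returns an (800, 700, 6) BEV."""
    # 1. Filter points to the region of interest on all three axes.
    m = ((points[:, 0] >= X_RANGE[0]) & (points[:, 0] < X_RANGE[1]) &
         (points[:, 1] >= Y_RANGE[0]) & (points[:, 1] < Y_RANGE[1]) &
         (points[:, 2] >= Z_RANGE[0]) & (points[:, 2] < Z_RANGE[1]))
    pts = points[m]

    h = int((Y_RANGE[1] - Y_RANGE[0]) / RES)  # 800 cells along Y
    w = int((X_RANGE[1] - X_RANGE[0]) / RES)  # 700 cells along X
    bev = np.zeros((h, w, N_SLICES + 1), dtype=np.float32)

    # 2. Discretize each point to a grid cell and a height slice.
    yi = ((pts[:, 1] - Y_RANGE[0]) / RES).astype(int)
    xi = ((pts[:, 0] - X_RANGE[0]) / RES).astype(int)
    zi = (((pts[:, 2] - Z_RANGE[0]) / (Z_RANGE[1] - Z_RANGE[0])) * N_SLICES).astype(int)
    zi = np.clip(zi, 0, N_SLICES - 1)

    # 3. Height channels: maximum point height per cell, per slice.
    for y, x, z, height in zip(yi, xi, zi, pts[:, 2]):
        bev[y, x, z] = max(bev[y, x, z], height)

    # 4. Density channel over the XY plane. The log-normalized form below is
    #    an assumption; the patent only says it is computed from the point
    #    count N in each cell.
    counts = np.zeros((h, w), dtype=np.float32)
    np.add.at(counts, (yi, xi), 1.0)
    bev[:, :, N_SLICES] = np.minimum(1.0, np.log(counts + 1.0) / np.log(16.0))
    return bev
```

The per-point Python loop in step 3 is for clarity; a production encoder would vectorize it (e.g. with `np.maximum.at`).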


Abstract

The invention discloses a multi-scale three-dimensional target detection method based on a feature pyramid network. The objective of the invention is to solve the problem that a target is easily lost under complex conditions, such as partial occlusion of the target and low imaging resolution at long distance. The method combines the advantages of RGB images and radar point clouds in the detection task: the 3D point cloud is expressed as a multi-channel bird's-eye view, a pyramid feature extractor is designed, and multi-scale, strongly semantic feature expressions of the bird's-eye view and the RGB image are constructed respectively. Prior anchor boxes are applied to each feature output layer, regional fusion features are obtained through a feature-cropping operation, and these fused features are fed directly into a shared classifier and regressor to complete cross-scale detection and obtain the optimal target classification and positioning result. The method is a single-stage detector: the proposal-generation step is omitted and the network structure is simple, which guarantees the real-time performance, accuracy, and robustness of detection and effectively improves the detection rate and positioning accuracy of partially occluded and long-distance targets.
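The region-fusion step described in the abstract can be sketched as follows: for one prior anchor, the matching regions are cropped from a BEV feature map and an RGB feature map, resized to a fixed grid, concatenated along the channel axis, and passed to a shared classification/regression head. The feature-map shapes, the 7×7 crop size, and the linear head are illustrative assumptions, not the patent's exact configuration.

```python
import numpy as np

CROP = 7  # fixed output grid per cropped region (assumed size)

def crop_resize(fmap, box):
    """fmap: (H, W, C); box: (y0, x0, y1, x1) in feature-map coordinates.
    Nearest-neighbor crop-and-resize to a CROP x CROP grid."""
    y0, x0, y1, x1 = box
    ys = np.clip(np.linspace(y0, y1 - 1, CROP).round().astype(int), 0, fmap.shape[0] - 1)
    xs = np.clip(np.linspace(x0, x1 - 1, CROP).round().astype(int), 0, fmap.shape[1] - 1)
    return fmap[np.ix_(ys, xs)]  # shape (CROP, CROP, C)

def fused_region_features(bev_fmap, rgb_fmap, bev_box, rgb_box):
    """Concatenate the two resized crops along the channel axis:
    the 'regional fusion features' for one anchor."""
    return np.concatenate([crop_resize(bev_fmap, bev_box),
                           crop_resize(rgb_fmap, rgb_box)], axis=-1)

def shared_head(region, w_cls, w_reg):
    """Shared classifier/regressor applied to the flattened fused region.
    A single linear layer stands in for the real head here."""
    v = region.reshape(-1)
    return v @ w_cls, v @ w_reg  # class scores, box regression deltas
```

Because the same `shared_head` weights are applied to crops from every pyramid level, detection is performed across scales without a separate proposal-generation stage, matching the single-stage design described above.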

Description

Technical field

[0001] The invention relates to the field of computer vision, and in particular to a three-dimensional target detection algorithm used to improve the detection rate and accuracy of partially occluded targets and long-distance small targets in complex environments, thereby helping machines to better perceive the three-dimensional environment and determine the position of targets of interest.

Background technique

[0002] 3D object detection aims to study how to effectively perceive environmental information and accurately classify and locate objects of interest, and it plays an important role in automatic driving systems. The development of deep learning has brought great breakthroughs in two-dimensional vision tasks such as image recognition and semantic segmentation. However, the real world is a three-dimensional space, and research based on two-dimensional images has certain limitations in real-world applications. Compared with 2D detection, 3D target detection adds the e...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
CPC: G06V2201/07; G06F18/24; G06F18/254; G06F18/253
Inventor 刘怡光赵杨玉杨艳陈杰唐天航朱先震
Owner SICHUAN UNIV