Method for detecting three-dimensional object based on point fusion network

A technology integrating neural networks and 3D object detection, applied in 3D object recognition, instruments, and character and pattern recognition; it addresses problems such as the lack of general applicability in existing methods.

Inactive Publication Date: 2018-06-15
SHENZHEN WEITESHI TECH

AI Technical Summary

Problems solved by technology

[0004] To address the lack of general applicability of existing methods, the purpose of the present invention is to provide a 3D object detection method based on a point fusion network. The point cloud network model ingests the raw point cloud, learns a spatial encoding of each point, and aggregates global point cloud features; these features are used for classification and semantic segmentation. The fusion network takes as input the image features extracted by a convolutional neural network and the corresponding point cloud features generated by the sub-network of the point fusion network; it combines these features and outputs a 3D bounding box for the target object. A supervised scoring function is used to train the network directly to predict whether a point lies within the target bounding box, while an unsupervised scoring function helps the network select the best predicted point.
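The two scoring functions described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the axis-aligned box test and all names (`inside_box_labels`, `pick_best_prediction`) are assumptions made for the sketch; the supervised target is a 0/1 label per point (inside the ground-truth box or not), and the unsupervised path simply keeps the highest-scoring candidate box at inference time.

```python
import numpy as np

def inside_box_labels(points, box_min, box_max):
    """Supervised scoring target: 1.0 if a point lies inside the
    axis-aligned ground-truth box, else 0.0 (axis alignment is a
    simplification for this sketch)."""
    inside = np.all((points >= box_min) & (points <= box_max), axis=1)
    return inside.astype(np.float32)

def pick_best_prediction(pred_boxes, scores):
    """Unsupervised selection: keep the candidate box whose
    predicted score is highest."""
    return pred_boxes[np.argmax(scores)]

# Toy data: two points inside the unit cube, one outside.
points = np.array([[0.1, 0.2, 0.3], [2.0, 2.0, 2.0], [0.5, 0.5, 0.5]])
labels = inside_box_labels(points, np.zeros(3), np.ones(3))
print(labels)  # [1. 0. 1.]

# Three candidate boxes (8 corners x 3 coords each) with scores.
candidates = np.stack([np.full((8, 3), float(i)) for i in range(3)])
best = pick_best_prediction(candidates, np.array([0.2, 0.9, 0.4]))
print(best[0])  # corners of the box with the highest score (0.9)
```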

Method used




Embodiment Construction

[0033] It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other. The present invention will be further described in detail below in conjunction with the drawings and specific embodiments.

[0034] Figure 1 is a system framework diagram of the 3D object detection method based on a point fusion network of the present invention. It mainly comprises a point cloud network, a fusion network, and a dense fusion prediction scoring function.

[0035] Point fusion has three main components: a point fusion network variant that extracts point cloud features, a convolutional neural network (CNN) that extracts image appearance features, and a fusion network that combines the two features and outputs a 3D bounding box.
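The three components above can be sketched structurally with numpy. This is an illustrative skeleton under assumed dimensions, not the patent's network: the weight matrices stand in for learned layers, the CNN image feature is a placeholder vector, and the dense head regresses 8 corner coordinates of a 3D box from every point, with the per-point, global, and image features concatenated before the final layers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions (not taken from the patent text).
N_POINTS, IMG_FEAT, PT_FEAT, FUSED = 128, 32, 16, 24

def point_branch(points, W):
    """Stand-in for the point-cloud sub-network: a per-point encoding
    plus a max-pooled global feature."""
    per_point = np.maximum(points @ W, 0.0)   # (N, PT_FEAT)
    return per_point, per_point.max(axis=0)   # global feature (PT_FEAT,)

def fusion_head(img_feat, per_point, global_feat, Wf, Wb):
    """Dense fusion: concatenate the image feature and global point
    feature onto every per-point feature, then regress 8 box corners
    (x, y, z) per point."""
    n = per_point.shape[0]
    ctx = np.concatenate([img_feat, global_feat])          # shared context
    fused = np.concatenate([per_point, np.tile(ctx, (n, 1))], axis=1)
    hidden = np.maximum(fused @ Wf, 0.0)                   # (N, FUSED)
    return (hidden @ Wb).reshape(n, 8, 3)                  # per-point boxes

points = rng.normal(size=(N_POINTS, 3))
img_feat = rng.normal(size=IMG_FEAT)  # would come from the CNN branch
Wp = rng.normal(size=(3, PT_FEAT))
Wf = rng.normal(size=(PT_FEAT + IMG_FEAT + PT_FEAT, FUSED))
Wb = rng.normal(size=(FUSED, 24))

per_point, global_feat = point_branch(points, Wp)
boxes = fusion_head(img_feat, per_point, global_feat, Wf, Wb)
print(boxes.shape)  # (128, 8, 3): one candidate 3D box per input point
```

The design point the sketch shows is that the image and point-cloud features are fused per point, so every input point produces its own box hypothesis, which the scoring function then ranks.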

[0036] The point fusion network first uses a symmetric function (max pooling) to achieve invariance to the ordering of the unordered 3D point cloud; the model ingests the original p...
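The order-invariance property of the symmetric max-pooling function can be demonstrated in a few lines. This is a toy sketch, assuming a single random linear layer with ReLU in place of the learned per-point encoder: because the maximum over the point axis does not depend on the order of its inputs, shuffling the point cloud leaves the pooled global feature unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "point cloud": 128 unordered points with 3 coordinates each.
points = rng.normal(size=(128, 3))

# A fixed random linear layer + ReLU stands in for the per-point
# encoder (the weights here are illustrative, not learned).
W = rng.normal(size=(3, 16))

def global_feature(pts):
    """Encode each point independently, then max-pool over the point
    axis. Max pooling is symmetric, so the result is independent of
    the order in which the points are listed."""
    encoded = np.maximum(pts @ W, 0.0)   # (N, 16) per-point features
    return encoded.max(axis=0)           # (16,) global feature

f1 = global_feature(points)
f2 = global_feature(rng.permutation(points))  # same points, shuffled
assert np.allclose(f1, f2)  # identical global feature either way
```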



Abstract

The invention puts forward a method for detecting a three-dimensional object based on a point fusion network. The method comprises a point cloud network, a fusion network, and dense fusion prediction scoring functions, and includes the following steps: a point cloud network model takes in the original point cloud, learns a spatial encoding of each point, and aggregates global point cloud features; these features are used for classification and semantic segmentation; in the fusion network, image features extracted by a convolutional neural network and the corresponding point cloud features generated by a sub-network of the point fusion network are used as input; these features are combined, and a three-dimensional bounding box is output for the target object; a supervised scoring function is used to train the network directly to predict whether points lie within the target bounding box, and an unsupervised scoring function helps the network select the best prediction point. Via the method, how to combine image and depth information optimally can be learned directly; lossy input preprocessing such as quantization or projection can be avoided; the method is generally applicable, and its accuracy is greatly improved.

Description

Technical field

[0001] The invention relates to the field of object detection, in particular to a three-dimensional object detection method based on a point fusion network.

Background technique

[0002] The detection and recognition of 3D objects is an important research direction in computer vision, and point fusion is a general 3D object detection method that can use both image and 3D point cloud information. The application of 3D point cloud data in industry, especially in reverse engineering, has become increasingly popular. The main application scenario of 3D object detection and recognition on point cloud data is identifying an existing 3D object model in the acquired point cloud data. 3D object detection technology can be applied to a variety of urban planning, construction, and management projects to detect pedestrians, vehicles, shops, etc. in urban scenes; it can also assist through automatic detection of pedestrians, vehicles, traffic signs and ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V20/64; G06F18/241; G06F18/253
Inventor: 夏春秋
Owner: SHENZHEN WEITESHI TECH