
Stacked object 6D pose estimation method and device based on deep learning

A deep learning and pose estimation technology, applied in computing, computer components, and character and pattern recognition. It addresses the problems of high computing time and memory consumption and of reliance on manual parameter tuning, achieving accurate pose estimation, fast training and running speed, and strong algorithm generalization ability.

Active Publication Date: 2020-06-09
SHENZHEN GRADUATE SCHOOL TSINGHUA UNIV
Cites 5 · Cited by 11

AI Technical Summary

Problems solved by technology

The disadvantage is that PPR-Net requires clustering after the network output, the clustering requires hyperparameters to be set, and different objects require different parameter settings, so the algorithm framework still relies on manual tuning as a post-processing stage.
At the same time, point-by-point regression consumes substantial computing time and memory when the scene contains a large number of points.



Examples


Embodiment Construction

[0038] Embodiments of the present invention will be described in detail below. It should be emphasized that the following description is only exemplary and not intended to limit the scope of the invention and its application.

[0039] Figure 3 is a flowchart of a deep learning based 6D pose estimation method for stacked objects according to an embodiment of the present invention. Referring to Figure 3, the embodiment of the present invention proposes a method for estimating the 6D pose of stacked objects based on deep learning, including the following steps:

[0040] S1. Input the point cloud of scene depth information obtained by the depth camera into a point cloud deep learning network to extract point cloud features (a sketch of steps S1 and S2 follows below);

[0041] S2. Through a multi-layer perceptron (MLP), regress from the extracted point cloud features the semantic information of the object each point belongs to, the foreground and background of the scene, and the 3D translation information of that object. The points obt...
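To make steps S1 and S2 concrete, the following is a minimal sketch in PyTorch. It assumes the backbone (for example a PointNet++-style point cloud network, which the patent does not name here) has already produced per-point features; the class name SeedPointHeads, the layer widths, and the class count are hypothetical choices for illustration only, not details fixed by the patent.

import torch
import torch.nn as nn

class SeedPointHeads(nn.Module):
    """Per-point MLP heads that take backbone features and regress
    semantic class logits, foreground/background logits, and a 3D
    offset from each point toward its object center (hypothetical sizes)."""
    def __init__(self, feat_dim=128, num_classes=10):
        super().__init__()
        def mlp(out_dim):
            return nn.Sequential(nn.Linear(feat_dim, 128), nn.ReLU(),
                                 nn.Linear(128, out_dim))
        self.semantic_head = mlp(num_classes)  # object category per point
        self.fg_head = mlp(2)                  # foreground / background
        self.offset_head = mlp(3)              # 3D translation offset

    def forward(self, points, feats):
        # points: (B, N, 3) scene point cloud; feats: (B, N, feat_dim)
        sem_logits = self.semantic_head(feats)
        fg_logits = self.fg_head(feats)
        offsets = self.offset_head(feats)
        # Candidate seed points: each point shifted by its predicted offset;
        # downstream steps would keep only points classified as foreground.
        seeds = points + offsets
        return sem_logits, fg_logits, seeds

if __name__ == "__main__":
    B, N, F = 1, 2048, 128
    pts = torch.randn(B, N, 3)
    feats = torch.randn(B, N, F)          # stand-in for backbone features
    heads = SeedPointHeads(feat_dim=F, num_classes=5)
    sem, fg, seeds = heads(pts, feats)
    print(sem.shape, fg.shape, seeds.shape)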



Abstract

The invention discloses a stacked object 6D pose estimation method and device based on deep learning. The method comprises the steps of: inputting a point cloud of scene depth information into a point cloud deep learning network and extracting point cloud features; learning, through a multi-layer perceptron, the semantic information of the object to which each point belongs, the foreground and background of the scene, and the 3D translation information of that object, and performing regression to obtain seed points; randomly sampling K points among the seed points, K being greater than the number of objects to be estimated, and clustering the seed points by taking the K points as center points; predicting, through a multi-layer perceptron, the 3D translation, 3D rotation and 6D pose confidence of the object from the features of each cluster of points; and, according to the predicted 6D poses and 6D pose confidences, using a non-maximum suppression (NMS) method to obtain the final scene object poses. The method achieves accurate end-to-end pose estimation for stacked scenes: the input is the scene point cloud, the pose of each object in the scene is output directly, and the occlusion problem of stacked objects is handled well.
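As one way to picture the back end of the pipeline summarized above (randomly sampling K seed points as centers, grouping the remaining seeds around them, and filtering pose hypotheses with NMS), the following is a rough NumPy sketch. The pose head here is only a placeholder that averages each cluster into a translation and draws a random confidence; in the described method these quantities come from a learned multi-layer perceptron, and the function names cluster_seeds and pose_nms as well as the distance threshold are hypothetical.

import numpy as np

def cluster_seeds(seeds, K, rng):
    """seeds: (M, 3) regressed seed points; K is larger than the number
    of objects to be estimated. Assign every seed to its nearest center."""
    center_idx = rng.choice(len(seeds), size=K, replace=False)
    centers = seeds[center_idx]                               # (K, 3)
    dist = np.linalg.norm(seeds[:, None] - centers[None], axis=-1)
    labels = dist.argmin(axis=1)                              # nearest center
    return [seeds[labels == k] for k in range(K)]

def pose_nms(translations, confidences, dist_thresh=0.03):
    """Greedy NMS: keep the most confident pose first and drop any pose
    whose translation lies within dist_thresh of an already kept one."""
    order = np.argsort(-confidences)
    keep = []
    for i in order:
        if all(np.linalg.norm(translations[i] - translations[j]) > dist_thresh
               for j in keep):
            keep.append(i)
    return keep

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seeds = rng.normal(size=(500, 3))                      # stand-in seed points
    clusters = cluster_seeds(seeds, K=8, rng=rng)
    trans = np.stack([c.mean(axis=0) for c in clusters])   # placeholder pose head
    conf = rng.random(len(clusters))                        # placeholder confidences
    kept = pose_nms(trans, conf)
    print("kept pose indices:", kept)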

Description

Technical field

[0001] The invention relates to a method and device for estimating the 6D pose of stacked objects based on deep learning.

Background technique

[0002] Bin picking is a core problem in robotics and computer vision. The goal is to allow a robot equipped with a visual sensor to grasp randomly placed objects with an end effector (suction cup or two-finger gripper). In order to accurately grasp and place objects, 6D pose estimation of the objects is required, that is, their rotation and translation. The translation is the XYZ coordinates of the object, and the rotation is the attitude of the object, which can be represented by a rotation matrix (or Euler angles, or a unit quaternion).

[0003] In the case of heavy stacking, it is difficult to estimate the poses of all stacked objects. Traditional methods use local feature matching in RGB images, such as SSD-6D ([1] W. Kehl, F. Manhardt, F. Tombari, S. Ilic, and N. Navab, "Ssd-6d: Making rgb-based 3d detection and 6d pose e...
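As a small aside on the pose representation mentioned in [0002], the snippet below illustrates with SciPy that a rotation can be expressed equivalently as Euler angles, a rotation matrix, or a unit quaternion, and assembles a 6D pose as a 4x4 homogeneous transform; the particular angle and translation values are arbitrary examples, not values taken from the patent.

import numpy as np
from scipy.spatial.transform import Rotation as R

rot = R.from_euler("xyz", [30, 45, 60], degrees=True)   # Euler angles (example values)
matrix = rot.as_matrix()                                 # 3x3 rotation matrix
quat = rot.as_quat()                                     # unit quaternion (x, y, z, w)

translation = np.array([0.10, -0.05, 0.42])              # object XYZ (example, meters)
pose = np.eye(4)                                         # 6D pose as a homogeneous transform
pose[:3, :3] = matrix
pose[:3, 3] = translation

print(np.isclose(np.linalg.norm(quat), 1.0))             # quaternion has unit norm
print(pose)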

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/62; G06T 7/50; G06T 7/73
CPC: G06T 7/50; G06T 7/73; G06F 18/23213; G06F 18/214
Inventor: 刘厚德, 刘思成, 朱晓俊, 梁斌, 王学谦, 高学海
Owner: SHENZHEN GRADUATE SCHOOL TSINGHUA UNIV