Workpiece 6D pose estimation method based on deep learning

A pose estimation method based on deep learning, applied in computing, image data processing, and instruments. It addresses problems such as limited real-time performance, accuracy that is difficult to guarantee, and limited pose-estimation accuracy, achieving fast and accurate real-time estimation, robust handling of occlusion, and improved accuracy and efficiency.

Pending Publication Date: 2020-11-06
GUANGZHOU INST OF ADVANCED TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

The correspondence-based method finds feature-point correspondences between the input data and the 3D point cloud of a known object, usually by extracting features such as SIFT and SURF from the RGB-D data and matching them. This approach is fast, but its accuracy is difficult to guarantee.
In complex environments with occlusion between workpieces, sensor noise, and lighting changes, relying on manual feature extraction and fixed feature-matching pipelines severely limits pose-estimation accuracy, making it difficult to estimate workpiece poses quickly and accurately.
The template-based method mainly selects a sim...




Embodiment Construction

[0049] In order to make the above objects, features, and advantages of the present invention more comprehensible, the technical solutions of the present invention are described in detail below in conjunction with the accompanying drawings and specific embodiments. It should be noted that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from the embodiments of the present invention without creative work fall within the protection scope of the present invention.

[0050] As shown in Figure 1 and Figure 2, the present invention provides a deep-learning-based method for estimating the 6D pose of a workpiece, comprising the following steps:

[0051] Step S1: image acquisition and preprocessing;

[0052] Specifically include the following steps:

[0053] Step S101: Use the depth camera to collect images of different workpieces und...
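Step S101 collects RGB-D images with a depth camera; the abstract further notes that 3D point-cloud coordinates are later converted into pixel coordinates. The underlying operation is the pinhole camera model. A minimal sketch of that projection and its inverse is shown below; the function names and the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are illustrative assumptions, not values from the patent:

```python
def project_point(X, Y, Z, fx, fy, cx, cy):
    """Pinhole projection: a 3D point in the camera frame -> pixel coordinates.
    fx, fy are focal lengths in pixels; (cx, cy) is the principal point."""
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v

def back_project(u, v, depth, fx, fy, cx, cy):
    """Inverse mapping: a pixel plus its depth value -> 3D point in the camera frame.
    This is how a depth image is turned into a point cloud."""
    X = (u - cx) * depth / fx
    Y = (v - cy) * depth / fy
    return X, Y, depth

# Round trip with assumed intrinsics (fx = fy = 600, principal point 320x240):
u, v = project_point(0.1, -0.2, 1.5, 600.0, 600.0, 320.0, 240.0)  # → (360.0, 160.0)
```

Applying `back_project` to every pixel of a depth image with these formulas yields the organized point cloud consumed by the later fusion steps.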



Abstract

The invention discloses a deep-learning-based method for estimating the 6D pose of a workpiece, in the technical field of robot environment perception. The method comprises the following steps: collecting workpiece images under different backgrounds and illumination conditions; constructing a semantic segmentation model to segment the target object; converting three-dimensional point-cloud coordinates into pixel coordinates through a spatial transformation network; fusing the three-dimensional point-cloud data with the RGB information and constructing a dense fusion network to estimate the 3D position and 3D orientation of the object; and iteratively matching and fine-tuning the pose with the ICP algorithm to obtain accurate 6D pose information. Compared with traditional schemes, the method can quickly and accurately estimate the end-to-end 6D pose of a target object in real time in complex environments such as occlusion and clutter, effectively addressing the poor adaptability, low accuracy, and limited real-time performance of traditional pose estimation methods in real complex environments.
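The final refinement stage uses ICP (iterative closest point) to fine-tune the pose. The patent does not give its implementation; the sketch below is a generic point-to-point ICP built from brute-force nearest-neighbour matching plus the SVD-based Kabsch solve for the best rigid transform, suitable only when the initial pose estimate is already close:

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B
    (Kabsch algorithm): B ≈ A @ R.T + t. Points are rows of shape (N, 3)."""
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # correct an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(src, dst, iters=50):
    """Point-to-point ICP: alternate nearest-neighbour matching with a
    Kabsch solve, accumulating the total transform from src to dst."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbour in dst for every current point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = dst[d2.argmin(1)]
        R, t = best_fit_transform(cur, nn)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

In the patent's pipeline this refinement would start from the pose predicted by the dense fusion network, so the small-misalignment assumption holds; production systems would use a k-d tree rather than the O(N²) distance matrix shown here.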

Description

technical field

[0001] The invention relates to the technical field of robot environment perception, and in particular to a deep-learning-based method for estimating the 6D pose of a workpiece.

Background technique

[0002] The 6D pose (i.e., 3D position and 3D orientation) of an object plays a key role in applications such as industrial robots, virtual reality, and automatic navigation systems. In robotics, accurately estimating the 6D pose of an object is the premise and basis for precise grasping: it provides the position and attitude information needed for tasks such as robot grasping and motion planning.

[0003] At present, when a robot performs grasping operations, 6D pose estimation methods mainly include correspondence-based methods, template-based methods, and voting-based methods. The correspondence-based method mainly finds feature-point correspondences between the i...
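The correspondence-based baseline criticized in the background matches hand-crafted descriptors (SIFT, SURF) between the input and a known model. Its core matching step is nearest-neighbour search with Lowe's ratio test. A minimal sketch follows; the toy 2-D vectors stand in for real 128-dimensional SIFT descriptors, and the function name is illustrative:

```python
import math

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Nearest-neighbour descriptor matching with Lowe's ratio test:
    a match is kept only when the best distance is clearly smaller than
    the second-best, which rejects ambiguous correspondences."""
    matches = []
    for i, d in enumerate(desc_a):
        ranked = sorted((math.dist(d, e), j) for j, e in enumerate(desc_b))
        if len(ranked) > 1 and ranked[0][0] < ratio * ranked[1][0]:
            matches.append((i, ranked[0][1]))  # (index in a, index in b)
    return matches
```

Because the descriptors and the ratio threshold are fixed by hand, this pipeline cannot adapt to occlusion, noise, or lighting changes, which is exactly the limitation the learned segmentation-plus-fusion approach of the invention is meant to overcome.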

Claims


Application Information

IPC(8): G06T 7/73; G06T 7/10
CPC: G06T 2207/10028; G06T 2207/20081; G06T 2207/20084; G06T 2207/20221; G06T 7/10; G06T 7/73
Inventor 雷渠江李秀昊潘艺芃徐杰桂光超梁波刘纪王卫军韩彰秀
Owner GUANGZHOU INST OF ADVANCED TECH CHINESE ACAD OF SCI