
Depth 6D pose estimation network model and workpiece pose estimation method

A pose estimation network model technology, applied to biological neural network models, neural learning methods, and computation, which addresses the problems of unreliable RGB color information, weak textures, and the resulting difficulty of robot grasping; its effects include guaranteed recognition accuracy and real-time performance, good generalization ability, and reduced labor costs.

Pending Publication Date: 2022-04-08
HEBEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] Most industrial parts are rigid objects manufactured to standard production models, with almost no intra-class variation, which reduces the difficulty of identification. However, most parts in industrial scenes have weak textures and similar or identical colors, so RGB color information is no longer reliable. To improve recognition reliability, this invention therefore starts from a point cloud containing only 3D geometric information and mines the 6D pose of the object from its edge information and geometric relationships. On the other hand, the object to be grasped often sits in a cluttered scene where mutual occlusion and stacking between objects are unavoidable, which still poses great difficulties for tasks such as vision-guided robot grasping.
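
As an illustration of the pure-geometric input this approach relies on, the sketch below back-projects a depth map to an N x 3 point cloud and discards color entirely; the function name and the pinhole intrinsics fx, fy, cx, cy are illustrative assumptions, not taken from the patent.

    import numpy as np

    def depth_to_points(depth, fx, fy, cx, cy):
        # Back-project a depth map (H x W, in metres) through a pinhole
        # camera model; only xyz geometry is kept, no RGB channels.
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels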




Detailed Description of the Embodiments

[0037] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention, without creative effort, fall within the protection scope of the present invention.

[0038] The present invention provides a deep 6D pose estimation network model comprising three sub-modules: a point cloud segmenter, a feature clustering sampler, and a pose estimator:
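
Read as an architecture, the three sub-modules form a sequential pipeline. A minimal PyTorch skeleton is sketched below; the class name, module interfaces, and tensor shapes are assumptions for illustration, since the patent text defines the modules only at this level of granularity.

    import torch.nn as nn

    class Deep6DPoseNet(nn.Module):
        # Hypothetical composition of the three sub-modules named above.
        def __init__(self, segmenter, sampler, estimator):
            super().__init__()
            self.segmenter = segmenter  # semantic/instance point cloud segmentation
            self.sampler = sampler      # feature clustering sampler
            self.estimator = estimator  # outputs a 6D pose (rotation, translation)

        def forward(self, points):      # points: (B, N, 3) pure xyz coordinates
            feats, labels = self.segmenter(points)
            clusters = self.sampler(feats, labels)
            return self.estimator(clusters)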

[0039] The point cloud segmenter (see figure 2) includes a feature extractor, a feature generator, and a feature discriminator. The feature extractor is composed of multiple SA (Set Abstraction, a feature-point sampling and feature-extraction module; the same below) layers, and the feature generator is composed of multiple FP (Feature Propagation, a f...
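
Although the paragraph is truncated here, SA and FP are the standard PointNet++-style building blocks: an SA layer samples seed points (typically by farthest point sampling), groups their neighborhoods, and extracts local features, while FP layers propagate features back to denser point sets by interpolation. The sketch below shows only the sampling step, as a generic PointNet++-style routine rather than code from the patent.

    import torch

    def farthest_point_sample(xyz, m):
        # Pick m well-spread seed indices from xyz (B, N, 3): repeatedly
        # take the point farthest from everything selected so far.
        B, N, _ = xyz.shape
        idx = torch.zeros(B, m, dtype=torch.long, device=xyz.device)
        dist = torch.full((B, N), float('inf'), device=xyz.device)
        farthest = torch.zeros(B, dtype=torch.long, device=xyz.device)
        batch = torch.arange(B, device=xyz.device)
        for i in range(m):
            idx[:, i] = farthest
            centroid = xyz[batch, farthest].unsqueeze(1)  # (B, 1, 3)
            dist = torch.minimum(dist, ((xyz - centroid) ** 2).sum(-1))
            farthest = dist.argmax(-1)
        return idx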



Abstract

The invention discloses a deep 6D pose estimation network model and a workpiece pose estimation method. The workpiece pose estimation method overcomes the difficulty of acquiring large-volume point cloud datasets by generating a simulation dataset with a physics engine, and takes a complete scene point cloud, as pure geometric point coordinates, directly as input. The semantic and instance segmentation part of the network extracts local and global features of the input point cloud, improving the network's ability to understand the scene; a multi-layer feature-fusion pose estimation network then outputs an accurate pose. The method alleviates, to a certain extent, the problems of object stacking and self-occlusion, is robust to various symmetric objects, and is suitable for large-scale popularization and application. Experimental verification on a dataset simulating a real scene shows that the proposed method has obvious advantages in overall precision and stability, and higher robustness.
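
The physics-engine step mentioned in the abstract can be illustrated with a short script that drops copies of a CAD part into a scene and reads back ground-truth poses; pybullet and all names below are assumptions for illustration, as the patent does not name a specific engine.

    import pybullet as p

    def drop_parts(urdf_path, n_parts=10, settle_steps=500):
        # Drop n copies of a part, let the pile settle under gravity,
        # then record each part's ground-truth pose (position, quaternion).
        p.connect(p.DIRECT)  # headless simulation
        p.setGravity(0, 0, -9.81)
        ids = [p.loadURDF(urdf_path, basePosition=[0, 0, 0.2 + 0.05 * i])
               for i in range(n_parts)]
        for _ in range(settle_steps):
            p.stepSimulation()
        poses = [p.getBasePositionAndOrientation(i) for i in ids]
        p.disconnect()
        return poses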

Description

Technical Field

[0001] The invention relates to the technical field of machine vision, and specifically provides a deep 6D pose estimation network model and a workpiece pose estimation method.

Background Technique

[0002] With the advancement of technology, robots are increasingly widely used in grasping, assembly, packaging, processing, logistics sorting, and similar tasks. Among these, grasping and assembly are the most common robot application scenarios, and a major challenge is the accurate grasping of objects in complex environments. In a traditional structured environment, an operator manually finds teaching points and programs them into a routine, which the robot then runs as a fixed program; although high accuracy and success rates can be achieved this way, the lack of environmental awareness and interaction makes it difficult to complete precision grasping and assembly operations in complex scenarios such as unstructured or semi-structured environments.

[0003] A reliable r...


Application Information

IPC(8): G06T 7/73; G06N 3/04; G06N 3/08
Inventors: 陈海永, 李龙腾
Owner: HEBEI UNIV OF TECH