
6D pose estimation method based on an instance segmentation network and iterative optimization

A pose estimation and iterative optimization technology, applied to biological neural network models, computing, image analysis, etc. It addresses problems such as high consumption of computing resources, sparse texture features, and low time efficiency, and achieves strong adaptability and robustness, improved detection performance, and faster running speed.

Active Publication Date: 2019-05-24
TONGJI UNIV

AI Technical Summary

Problems solved by technology

The industrial application prospects of such solutions are limited. With cluttered backgrounds or stacked workpieces, it is difficult to identify the workpieces in the field of view using traditional image processing techniques such as edge detection and template detection; moreover, these algorithms contain parameters that must be tuned manually, so they are often difficult to adapt to different grasping objects, i.e., workpieces of different shapes, different sizes, or even mixed types.
[0006] From the perspective of pose estimation algorithms, existing patents are still limited to traditional methods that manually design and extract features, and then match the features extracted from objects in the actual scene against the features of a model or template.
The hand-crafted feature extraction and template matching scheme has low time efficiency on the one hand, since the search algorithm in the template matching process consumes a large amount of computing resources. On the other hand, the traditional feature extraction strategy requires the target to have distinguishable texture features, which is often difficult to satisfy in industrial applications: most workpieces in the industrial field are weakly textured, so traditional feature extraction methods cannot meet this requirement, and when faced with industrial workpieces of many different types and shapes, traditional solutions also fail to adapt.
[0007] To sum up, traditional solutions can only obtain the two-dimensional in-plane attitude of the workpiece, or their accuracy and robustness are insufficient; they cannot adapt to complex living and production scenarios, cannot guarantee accurate recognition and pose calculation for objects of various shapes and types, and therefore cannot meet the requirements of complex life services and industrial production.



Examples


Embodiment

[0117] To verify the effectiveness of this scheme, the present invention carried out object recognition experiments and pose estimation experiments, used respectively to evaluate the recognition performance of the instance segmentation network and the accuracy of the final output pose.

[0118] To verify the object recognition effect, we conducted experiments on the existing "Shelf&Tote" benchmark dataset and on a dataset collected by ourselves. The objects in the "Shelf&Tote" benchmark dataset are rich in texture features, whereas the objects in our self-collected dataset lack texture information and contain a large number of similar objects stacked and mixed together.

[0119] Good recognition results were obtained on both the "Shelf&Tote" benchmark dataset and the self-collected dataset.

[0120] In order to evaluate the performance of this method, the pose error is defined as follows:

[0121]

[0122] Based on the pose error, we define the accuracy rate ...
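The pose error formula in paragraph [0121] is not reproduced in this excerpt, so the sketch below should be read as an illustrative assumption rather than the patent's own definition. It implements the ADD metric (average distance between corresponding model points under the estimated and ground-truth poses), a common choice for evaluating 6D pose estimation, together with an accuracy rate defined as the fraction of estimates whose error falls below a threshold; the function names and the 10%-of-model-diameter threshold are assumptions.

```python
import numpy as np

def add_pose_error(R_est, t_est, R_gt, t_gt, model_points):
    """ADD metric: mean distance between model points transformed by the
    estimated pose and by the ground-truth pose.

    NOTE: an assumed stand-in for the error defined in paragraph [0121],
    which is not visible in this excerpt.
    """
    pred = model_points @ R_est.T + t_est  # (N, 3) points under estimated pose
    gt = model_points @ R_gt.T + t_gt      # (N, 3) points under ground-truth pose
    return float(np.mean(np.linalg.norm(pred - gt, axis=1)))

def accuracy_rate(errors, model_diameter, ratio=0.1):
    """Fraction of test samples whose pose error is below ratio * diameter
    (the 10% threshold is a common convention, assumed here)."""
    errors = np.asarray(errors)
    return float(np.mean(errors < ratio * model_diameter))
```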



Abstract

The invention relates to a 6D pose estimation method based on an instance segmentation network and iterative optimization. The 6D pose estimation method comprises the following steps: 1) converting a depth image into an HHA feature map and a scene point cloud; 2) inputting the HHA feature map and the color image into an instance segmentation network augmented with a spatial transformer network, and performing instance segmentation of objects in arbitrary poses to obtain an object category recognition result and a mask segmentation result; 3) segmenting the target point cloud from the scene point cloud according to the instance segmentation result; and 4) matching and refining the pose of the segmented target point cloud against the model point cloud of the target CAD model using an improved 4PCS algorithm and the ICP algorithm, thereby obtaining an accurate pose estimation result. Compared with the prior art, the method has the advantages of accurate recognition, multi-type object recognition, high detection performance, high pose matching precision and the like.
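As an illustration of step 4) of the abstract (coarse point cloud matching followed by iterative refinement), the following minimal sketch uses the Open3D library. Open3D does not ship a 4PCS implementation, so FPFH feature matching with RANSAC stands in for the patent's improved 4PCS coarse registration; the voxel size, thresholds, and function structure are assumptions, not the patent's implementation.

```python
import open3d as o3d

def estimate_pose(scene_cloud, model_cloud, voxel=0.005):
    """Coarse-to-fine registration of a CAD model point cloud to the segmented
    target point cloud; returns the 4x4 pose of the model in the scene.

    Sketch only: RANSAC over FPFH features replaces the patent's improved 4PCS
    coarse step (Open3D has no 4PCS), and point-to-plane ICP does the refinement.
    """
    def preprocess(pcd):
        down = pcd.voxel_down_sample(voxel)
        down.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
        fpfh = o3d.pipelines.registration.compute_fpfh_feature(
            down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))
        return down, fpfh

    src, src_fpfh = preprocess(model_cloud)   # source: model point cloud
    tgt, tgt_fpfh = preprocess(scene_cloud)   # target: segmented scene points

    # Coarse global alignment (stand-in for the improved 4PCS step).
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, src_fpfh, tgt_fpfh, mutual_filter=True,
        max_correspondence_distance=voxel * 1.5,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        ransac_n=4,
        checkers=[o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(voxel * 1.5)],
        criteria=o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # Iterative refinement with ICP, initialized from the coarse transform.
    fine = o3d.pipelines.registration.registration_icp(
        src, tgt, max_correspondence_distance=voxel,
        init=coarse.transformation,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return fine.transformation
```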

Description

Technical field

[0001] The invention relates to the technical field of robot environment perception, in particular to a 6D pose estimation method based on an instance segmentation network and iterative optimization.

Background art

[0002] The environment perception technology of robots is an important scientific issue in the field of robotics research. In recent years, with the development of computer vision and deep learning technology, vision-based environment perception has become a hot spot in academia and industry. Robots perceive their environment and operating objects through the input of visual information, including recognition and state estimation, and then realize interaction and complete tasks. Among them, the introduction of 3D vision technology enables robots to obtain richer information, which plays an important role in promoting the solution of robot environment perception problems.

[0003] The present invention foc...


Application Information

IPC(8): G06T7/73; G06T7/77; G06N3/04
Inventors: 陈启军, 周光亮, 王德明, 颜熠, 李勇奇, 刘成菊
Owner TONGJI UNIV