Workpiece pose rapid high-precision estimation method and apparatus based on point cloud data

A point cloud and workpiece technology, applied in the field of fast, high-precision workpiece pose estimation based on point cloud data, which addresses the problems of low pose estimation accuracy, inability to meet robot grasping and assembly accuracy requirements, and expensive computing equipment, achieving the effects of fast estimation, wide applicability and improved accuracy.

Active Publication Date: 2019-12-31
HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL

AI Technical Summary

Problems solved by technology

However, deep-learning-based methods require large amounts of training data. Current deep learning approaches to point-cloud-based pose estimation rely mainly on public datasets and target everyday objects in life scenes; their pose estimation accuracy is too low to meet the robot grasping and assembly accuracy requirements of industrial scenarios.
At the same time, deep learning methods require considerable training time, and the computing equipment they need is extremely expensive, so they are not yet widely adopted in industry.




Embodiment Construction

[0062] In the following, the concept, specific structure and technical effects of the present invention are described clearly and completely in conjunction with the embodiments and the drawings, so that the objectives, solutions and effects of the present invention can be fully understood.

[0063] This document mainly discloses a method for estimating the pose of a workpiece based on point clouds, which is mainly used to solve the problem of workpiece pose estimation in the field of automated assembly, especially small-part assembly. As shown in Figure 1, the method is divided into three steps: point cloud data preprocessing, point cloud virtual view extraction, and pose estimation. Point cloud data preprocessing mainly segments the object point cloud acquired by the sensor to obtain the target point cloud, and converts the CAD model of the object into point cloud data, which is used to generate the point cloud views of the virtual 3D camera. Point cloud virtual ...
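The preprocessing and virtual-view steps named above can be illustrated with a minimal sketch, assuming the Open3D library; the file names, voxel size, plane threshold and virtual camera location are illustrative placeholders, not values from the patent.

```python
# Minimal sketch of point cloud preprocessing and virtual view extraction,
# assuming Open3D; file names and parameters are illustrative.
import numpy as np
import open3d as o3d

# --- Point cloud data preprocessing ---
# Segment the object points out of the sensor scan: here the dominant plane
# (e.g. the work table) is removed and the remaining points form the target cloud.
scan = o3d.io.read_point_cloud("sensor_scan.ply")            # hypothetical file
scan = scan.voxel_down_sample(voxel_size=0.002)               # 2 mm voxel grid
plane_model, inliers = scan.segment_plane(distance_threshold=0.003,
                                          ransac_n=3, num_iterations=500)
target_cloud = scan.select_by_index(inliers, invert=True)

# Convert the CAD model of the object into point cloud data by sampling its
# surface; this model cloud is later viewed by a virtual 3D camera.
mesh = o3d.io.read_triangle_mesh("workpiece_cad.stl")         # hypothetical file
model_cloud = mesh.sample_points_uniformly(number_of_points=20000)

# --- Point cloud virtual view extraction ---
# Keep only the points visible from a virtual camera pose, approximating the
# single-sided view a real depth sensor would capture.
camera_location = np.array([0.0, 0.0, 0.5])                   # 0.5 m in front
radius = 100.0 * np.linalg.norm(model_cloud.get_max_bound()
                                - model_cloud.get_min_bound())
_, visible_idx = model_cloud.hidden_point_removal(camera_location, radius)
virtual_view = model_cloud.select_by_index(visible_idx)
```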



Abstract

The invention discloses a workpiece pose estimation method based on point clouds, which is mainly used to solve the problem of workpiece pose estimation in the field of automatic assembly, especially small-part assembly. The method comprises the steps of point cloud data preprocessing, point cloud virtual view extraction and pose estimation. The pose estimation step further comprises performing an iterative operation using a particle filter algorithm whose dynamic model is based on the iterative closest point algorithm, outputting the pose as the weighted mean of the effective particles once an iteration stop condition is met, and calculating the pose of the target workpiece relative to the camera coordinate system through an inversion operation. The invention also relates to an apparatus comprising a memory and a processor, the processor implementing the above method steps when executing a program stored in the memory.
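The pose estimation step summarized above can be sketched as follows, assuming Open3D for the ICP step and NumPy for the pose algebra. The fitness-based weighting, the resampling rule and the stop test are illustrative assumptions, not the patent's exact formulas, and the names (icp_step, estimate_pose, virtual_view, target_cloud) are hypothetical.

```python
# Hedged sketch: particle filter whose dynamic model is a few ICP iterations,
# ending with a weighted-mean pose and an inversion to the camera frame.
import numpy as np
import open3d as o3d


def icp_step(source, target, pose, max_dist=0.01, iters=3):
    """Dynamic model: advance one particle (candidate pose) by a few ICP iterations."""
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_dist, pose,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(),
        o3d.pipelines.registration.ICPConvergenceCriteria(max_iteration=iters))
    return result.transformation, result.fitness


def weighted_mean_pose(poses, weights):
    """Weighted mean of 4x4 poses; the averaged rotation is re-projected onto SO(3) via SVD."""
    w = np.asarray(weights, dtype=float)
    w = w / (w.sum() + 1e-12)
    t = sum(wi * p[:3, 3] for wi, p in zip(w, poses))
    R = sum(wi * p[:3, :3] for wi, p in zip(w, poses))
    U, _, Vt = np.linalg.svd(R)
    mean = np.eye(4)
    mean[:3, :3] = U @ Vt
    mean[:3, 3] = t
    return mean


def estimate_pose(virtual_view, target_cloud, init_poses, rounds=30, tol=1e-4):
    """Particle filter whose particles are candidate model-to-scan poses."""
    poses = [np.asarray(p, dtype=float) for p in init_poses]
    prev = None
    for _ in range(rounds):
        stepped = [icp_step(virtual_view, target_cloud, p) for p in poses]
        poses = [np.asarray(t) for t, _ in stepped]
        weights = [f for _, f in stepped]
        mean = weighted_mean_pose(poses, weights)
        # Illustrative stop condition: the weighted-mean pose has converged.
        if prev is not None and np.linalg.norm(mean - prev) < tol:
            break
        prev = mean
        # Illustrative resampling: duplicate the highest-weight half of the particles.
        order = np.argsort(weights)[::-1][: max(1, len(poses) // 2)]
        poses = [np.asarray(poses[i]).copy() for i in order for _ in (0, 1)][: len(poses)]
    # The converged transform aligns the model view with the scan; its inverse is taken
    # here as the pose of the target workpiece relative to the camera coordinate system.
    return np.linalg.inv(mean)
```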

Description

Technical field

[0001] The invention relates to the technical field of machine vision object pose estimation, and in particular to a method and device for fast, high-precision estimation of the pose of a workpiece based on point cloud data.

Background technique

[0002] With the development of industrial robot technology, more and more automated production lines adopt robot operations. In the fields of automatic robot sorting and automatic assembly, vision-guided robot operations are becoming more and more important. Traditional vision guidance is based on 2D images. In recent years, three-dimensional scanning equipment (such as three-dimensional laser scanners, structured light cameras, etc.) has been used to obtain three-dimensional point cloud data of the workpiece surface (i.e., the workpiece point cloud); the workpiece point cloud is registered with a template point cloud sampled from the workpiece CAD model to obtain the rigid transformation between the two, so a...
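The registration described in this background (aligning the scanned workpiece cloud with a template sampled from the CAD model to obtain a rigid transformation) can be illustrated with a minimal sketch, assuming Open3D; the file names, correspondence distance and identity initial guess are placeholders.

```python
# Minimal illustration of classical template-to-scan registration, assuming Open3D.
import numpy as np
import open3d as o3d

# Hypothetical inputs: a scanned workpiece cloud and a template sampled from the CAD model.
workpiece_cloud = o3d.io.read_point_cloud("workpiece_scan.ply")
template_cloud = o3d.io.read_point_cloud("cad_template.ply")

# Point-to-point ICP: estimate the rigid transform that aligns the template
# point cloud onto the measured workpiece point cloud.
result = o3d.pipelines.registration.registration_icp(
    template_cloud, workpiece_cloud,
    max_correspondence_distance=0.01,   # 10 mm correspondence search radius
    init=np.eye(4),                     # identity initial guess
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

print(result.transformation)            # 4x4 rigid transform: template -> workpiece
print(result.fitness, result.inlier_rmse)
```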


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73
CPC: G06T7/73; G06T2207/10028; Y02P90/30
Inventor: 楼云江, 王瑶, 杨先声, 古维昆, 董邓伟
Owner: HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL