A robot automatic workpiece grabbing method based on an RGB-D image and a CAD model

The technology relates to RGB images and robots, and is applied in the field of robot automatic grabbing of workpieces based on RGB-D images and CAD models. It solves the problem of low grabbing accuracy and achieves fast computation, high accuracy, and a good matching effect.

Active Publication Date: 2021-08-06
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

Existing methods obtain the pose of the workpiece relative to the robot by matrix transformation, from the relative pose between the workpiece and the camera and between the camera and the robot, and thereby realize automatic grasping of the workpiece; however, the grasping accuracy is not high.
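The matrix transformation referred to above is a composition of 4x4 homogeneous transforms. A minimal sketch of that pose chaining in Python with NumPy (the variable names and numeric values are illustrative only, not taken from the patent):

```python
import numpy as np

def chain_poses(T_robot_camera: np.ndarray, T_camera_workpiece: np.ndarray) -> np.ndarray:
    """Compose two 4x4 homogeneous transforms to get the workpiece pose in the robot base frame."""
    return T_robot_camera @ T_camera_workpiece

# Illustrative values: camera pose from hand-eye calibration, workpiece pose from point cloud matching.
T_robot_camera = np.eye(4)
T_robot_camera[:3, 3] = [0.5, 0.0, 0.8]       # camera relative to robot base (assumed)
T_camera_workpiece = np.eye(4)
T_camera_workpiece[:3, 3] = [0.0, 0.0, 0.3]   # workpiece relative to camera (assumed)

T_robot_workpiece = chain_poses(T_robot_camera, T_camera_workpiece)
print(T_robot_workpiece[:3, 3])               # grasp target position in the robot base frame
```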




Embodiment Construction

[0014] The method of the present invention is divided into three stages: workpiece image instance segmentation, point cloud matching, and robot grasping. The specific steps are as follows.

[0015] 1. Data collection:

[0016] 1-1. Collect workpiece data with the color camera and the depth camera, acquiring multiple images to build a data set, and obtain RGB image I and depth image I of the workpiece;

[0017] 1-2. According to the relative position of the two cameras, compute the homography matrix of the perspective transformation between them. Warp RGB image I and depth image I through the homography matrix to obtain the aligned RGB image I and depth image I (see the sketch after step 1-3).

[0018] 1-3. Adjust the aligned RGB image I and depth image I to the same size to obtain RGB image II and depth image II.
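Steps 1-2 and 1-3 can be sketched with OpenCV. The homography H below stands in for the one computed from the two cameras' relative position; the file names and image sizes are placeholders, not values from the patent:

```python
import cv2
import numpy as np

# H: 3x3 homography mapping depth-camera pixels onto the color image plane,
# computed beforehand from the relative pose and intrinsics of the two cameras (placeholder here).
H = np.eye(3, dtype=np.float64)

rgb_i = cv2.imread("workpiece_rgb.png")                              # RGB image I (placeholder path)
depth_i = cv2.imread("workpiece_depth.png", cv2.IMREAD_UNCHANGED)    # depth image I (placeholder path)

# Step 1-2: warp the depth image into the color camera's view so the two images are aligned.
h, w = rgb_i.shape[:2]
depth_aligned = cv2.warpPerspective(depth_i, H, (w, h), flags=cv2.INTER_NEAREST)

# Step 1-3: resize both images to a common working size (RGB image II and depth image II).
target_size = (640, 480)                                             # illustrative size
rgb_ii = cv2.resize(rgb_i, target_size, interpolation=cv2.INTER_LINEAR)
depth_ii = cv2.resize(depth_aligned, target_size, interpolation=cv2.INTER_NEAREST)
```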

[0019] 2. Send the obtained RGB im...
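Paragraph [0019] is cut off here, but the abstract states that the RGB image is segmented by a neural network to isolate the target workpiece. A minimal instance-segmentation sketch using a pretrained Mask R-CNN from torchvision; the patent does not name the network, so this is only an assumed stand-in for a model trained on the workpiece data set from step 1:

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# Generic pretrained instance-segmentation model (stand-in for the patent's network).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def segment_workpiece(rgb_ii, score_threshold=0.7):
    """Return binary masks for detections above the score threshold."""
    with torch.no_grad():
        pred = model([to_tensor(rgb_ii)])[0]
    keep = pred["scores"] > score_threshold
    # pred["masks"] has shape (N, 1, H, W) with soft values; threshold to binary workpiece masks.
    return (pred["masks"][keep, 0] > 0.5).cpu().numpy()
```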



Abstract

The invention relates to robot automatic grabbing, in particular to a robot automatic workpiece grabbing method based on an RGB-D image and a CAD model. The method comprises the following steps: establishing a virtual environment of a virtual camera and a workpiece CAD model based on VTK; segmenting the image with a neural network to obtain a 3D point cloud of the target workpiece; and matching it against the virtual CAD model point cloud. The method has a good matching effect and high speed, and solves problems such as inaccurate and ineffective robot workpiece grabbing.
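The abstract describes matching the segmented workpiece point cloud against a point cloud rendered from the CAD model in the VTK virtual camera environment, but it does not name the matching algorithm. A minimal registration sketch using Open3D's ICP as an assumed stand-in; the file names and threshold are placeholders:

```python
import numpy as np
import open3d as o3d

# target: point cloud sampled from the workpiece CAD model (e.g. rendered by the virtual camera);
# source: 3D point cloud of the segmented workpiece back-projected from the depth image.
target = o3d.io.read_point_cloud("cad_model_view.pcd")        # placeholder file
source = o3d.io.read_point_cloud("segmented_workpiece.pcd")   # placeholder file

result = o3d.pipelines.registration.registration_icp(
    source, target,
    max_correspondence_distance=0.01,   # 1 cm, illustrative
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
# result.transformation aligns the measured cloud onto the CAD cloud; its inverse gives the
# workpiece pose in the camera frame, which is then chained with hand-eye calibration for grasping.
print(result.transformation)
```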

Description

Technical field

[0001] The invention relates to robot automatic grabbing, more specifically to a method for robot automatic grabbing of workpieces based on RGB-D images and CAD models.

Background technique

[0002] Neural networks are the key technology for realizing artificial intelligence, and building an artificial neural network with a hierarchical structure can realize artificial intelligence in a computing system. A neural network is an algorithm designed to mimic the structure of the human brain and is used to recognize things. Neural networks interpret sensory data through machine perception, enabling operations such as labeling or clustering raw inputs. The patterns that neural networks can recognize are numerical, so all real-world data such as images, sounds, text, and time series must be converted into numerical form.

[0003] Traditional pose estimation methods used before workpiece grasping mainly include point matching and template matching. P...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00, G06T7/11, G06T7/50, G06K9/38, G06K9/62, G06N3/04, G06N3/08
CPC: G06T7/0004, G06T7/50, G06T7/11, G06N3/08, G06T2207/10004, G06T2207/10024, G06T2207/10028, G06T2207/20081, G06T2207/20084, G06T2207/30164, G06V10/28, G06N3/045, G06F18/241
Inventor: 陈晓峰, 王崇, 张旭堂, 孔民秀
Owner: HARBIN INST OF TECH