Object grabbing method and device based on 3D matching and computing equipment

The disclosed 3D-matching technology, applied in the computer field, addresses problems such as inaccurate pose determination and robot grasping errors that hinder industrial automation, and achieves the effect of improving object grasping accuracy and optimizing the object grasping method.

Pending Publication Date: 2021-05-25
MECH MIND ROBOTICS TECH LTD

AI Technical Summary

Problems solved by technology

However, in the prior art, the pose information of the object to be grasped is not determined accurately enough, which can easily cause the robot to make grasping errors. For example, the robot may fail to grasp the object, the object may fall after being grasped, or an underlying pressed object may be grasped so that the objects stacked on top of it fall. Such errors hinder the realization of industrial automation.


Image

  • Object grabbing method and device based on 3D matching and computing equipment

Examples


Embodiment Construction

[0026] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be implemented in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope can be fully conveyed to those skilled in the art.

[0027] Figure 1 shows a schematic flowchart of an object grabbing method based on 3D matching in accordance with an embodiment of the present invention. As shown in Figure 1, the method includes the following steps:

[0028] Step S101: acquire a scene image of the current scene and the point cloud corresponding to the scene image, and input the scene image into a trained deep learning segmentation model for instance segmentation processing to obtain the segmentation result of each object in the scene image.
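A minimal sketch of step S101 is given below, assuming a pretrained torchvision Mask R-CNN stands in for the patent's trained deep learning segmentation model; the patent does not specify the network architecture, weights, camera interface, or thresholds, so the segment_scene helper, the image path, and the score threshold are illustrative assumptions.

import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

def segment_scene(image_path, score_threshold=0.7):
    """Instance segmentation of a scene image; returns one binary mask per detected object."""
    # Pretrained Mask R-CNN used here only as a placeholder for the patent's trained model.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        prediction = model([image])[0]

    # Keep confident detections; each soft mask is binarized for later point-cloud lookup.
    keep = prediction["scores"] > score_threshold
    masks = prediction["masks"][keep]      # (N, 1, H, W) soft masks in [0, 1]
    return masks.squeeze(1) > 0.5          # (N, H, W) boolean masks, one per object

Each returned mask selects the pixels of one object instance; because the scene image and the point cloud are registered, the same mask can be used to extract the point cloud corresponding to that object, as described in the abstract.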



Abstract

The invention discloses an object grabbing method and device based on 3D matching and computing equipment, and the method comprises the steps: obtaining a scene image and a point cloud corresponding to the scene image, inputting the scene image into a deep learning segmentation model, and carrying out the instance segmentation processing to obtain the segmentation result of each object in the scene image; according to the point cloud corresponding to the scene image and the segmentation result of each object, determining the point cloud corresponding to each object; for each object, matching the point cloud corresponding to the object with a preset template point cloud, and determining pose information of the object; and according to the point clouds corresponding to all the objects, determining the stacking relation between the objects, determining a target object from all the objects according to the stacking relation, converting the pose information of the target object into a robot coordinate system, and transmitting the converted pose information of the target object to the robot. According to the scheme, the pose information of each object is accurately determined, and the object grabbing accuracy is effectively improved.
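As a hedged illustration of the matching and coordinate-conversion steps described in the abstract, the sketch below uses Open3D's point-to-point ICP as a stand-in for matching an object's point cloud against the preset template point cloud (the patent does not name a specific matching algorithm), and a camera_to_robot hand-eye calibration matrix, a hypothetical input not specified in the source, to convert the resulting pose into the robot coordinate system.

import numpy as np
import open3d as o3d

def estimate_object_pose(object_points, template_points, camera_to_robot):
    """Match an object's points against a template and return a 4x4 pose in the robot frame."""
    obj = o3d.geometry.PointCloud()
    obj.points = o3d.utility.Vector3dVector(np.asarray(object_points))
    tpl = o3d.geometry.PointCloud()
    tpl.points = o3d.utility.Vector3dVector(np.asarray(template_points))

    # Register the template onto the observed object points; the resulting
    # transformation is the object's pose in the camera frame.
    result = o3d.pipelines.registration.registration_icp(
        tpl, obj, 0.01,  # 0.01 m correspondence distance is an illustrative assumption
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
    pose_in_camera = result.transformation

    # Convert the pose into the robot coordinate system before sending it to the robot.
    # camera_to_robot is a hypothetical 4x4 hand-eye calibration matrix.
    return camera_to_robot @ pose_in_camera

The determination of the stacking relation between objects, which the abstract uses to select the target object, is not shown here; the source only states that it is derived from the point clouds of all the objects.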

Description

Technical field
[0001] The present invention relates to the field of computer technology, and more particularly to an object grabbing method, apparatus, and computing device based on 3D matching.
Background technique
[0002] With the development of industrial intelligence, it has become increasingly popular to use robots to operate on objects such as industrial parts and boxes. When the robot operates, it generally needs to grasp an object and move it from one position to another, for example moving an object from a conveyor belt onto a tray or into a container, or grasping an object from a tray and placing it on a conveyor or another tray as required. However, in the prior art, the pose information of the object to be grasped is not determined accurately, which easily causes the robot to make grasping errors: the robot may fail to grasp the object, the object may be dropped after being grasped, or an underlying pressed object may be grasped, causing the objects stacked above it to fall, which affects the realization of industrial automation.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73; G06T7/10; G06T1/00
CPC: G06T7/74; G06T7/10; G06T1/0014; G06T2207/10028; G06T2207/20081; G06T2207/20084; G06T2207/30164
Inventor: 刘迪一, 魏海永, 盛文波, 李辉, 段文杰, 丁有爽, 邵天兰
Owner: MECH MIND ROBOTICS TECH LTD