Robot automatic grabbing system and method based on 3D vision

A robot and 3D vision technology, applied to manipulators, program-controlled manipulators, chucks, etc.; it addresses the problems of large 3D point cloud data volume, low efficiency and low robustness, and achieves a good grasping effect, high grasping efficiency and strong robustness.

Pending Publication Date: 2022-03-18
CSIC PRIDE NANJING INTELLIGENT EQUIP SYST CO LTD

AI Technical Summary

Problems solved by technology

However, traditional two-dimensional vision can only be applied to the recognition and positioning of planar objects, and cannot cope with three-dimensional scenes such as tilting and flipping of parts.
[0004] 3D cameras can obtain object space point cloud coordinates to make up for the shortcomings of traditional 2D vision, but the large amount of 3D point cloud data, complex processing, and complex communication logic with robots greatly limit the application of 3D vision.
However, this method uses binocular depth cameras and RGB cameras inst

Examples

Embodiment Construction

[0060] The technical solution of the present invention will be further described below in conjunction with the accompanying drawings.

[0061] As shown in Figure 1, the 3D vision-based robot automatic grasping method of the present invention comprises the following steps:

[0062] The grabbing robot sends a request-data instruction to the industrial computer. On receiving it, the industrial computer judges whether a processing result is held in the cache: if so, it returns the processing result to the grabbing robot; if not, it controls the 3D camera to take a picture;
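
The cache-first handling in [0062] can be illustrated with a short sketch. Everything below is an assumption made for illustration: the names ResultCache, on_request_data, send_to_robot and trigger_capture do not appear in the patent, which only specifies the decision logic (return the cached result if one exists, otherwise command the 3D camera to take a picture).

```python
from typing import Callable, Optional


class ResultCache:
    """Holds at most one processing result (part pose + state) for the robot."""

    def __init__(self) -> None:
        self._result: Optional[dict] = None

    def put(self, result: dict) -> None:
        self._result = result

    def pop(self) -> Optional[dict]:
        result, self._result = self._result, None
        return result


def on_request_data(cache: ResultCache,
                    send_to_robot: Callable[[dict], None],
                    trigger_capture: Callable[[], None]) -> None:
    """Handle the grabbing robot's 'request data' instruction ([0062])."""
    cached = cache.pop()
    if cached is not None:
        # A processing result is cached: return it to the grabbing robot.
        send_to_robot(cached)
    else:
        # No cached result: command the 3D camera to take a picture; the new
        # processing result is cached once point cloud processing completes.
        trigger_capture()
```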

[0063] After receiving the photographing command from the grabbing robot, the industrial computer controls the 3D camera to take a picture; it then receives the original 3D point cloud data returned by the 3D camera and, based on that data, judges whether a target part has been photographed, ...
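
One plausible way to judge "was a target part photographed?" from the raw point cloud is to count the points that fall inside the material frame's region of interest. The visible patent text does not state its actual criterion, so the ROI bounds, the threshold min_points and the function name below are purely illustrative assumptions.

```python
import numpy as np


def target_part_visible(cloud_xyz: np.ndarray,
                        roi_min: np.ndarray,
                        roi_max: np.ndarray,
                        min_points: int = 500) -> bool:
    """Return True if enough raw 3D points fall inside the material-frame ROI.

    cloud_xyz        -- (N, 3) array of camera-frame points
    roi_min, roi_max -- (3,) arrays bounding the region where parts may lie
    min_points       -- assumed threshold separating "part present" from noise
    """
    inside = np.all((cloud_xyz >= roi_min) & (cloud_xyz <= roi_max), axis=1)
    return int(np.count_nonzero(inside)) >= min_points
```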

Abstract

The method comprises the following steps: the grabbing robot sends a request-data instruction to an industrial personal computer; the industrial personal computer receives the instruction and judges whether a processing result is cached, returning the cached result to the grabbing robot if so, and controlling a 3D camera to take a picture if not; after receiving a photographing instruction from the grabbing robot, the industrial personal computer controls the 3D camera to photograph; the industrial personal computer receives the original three-dimensional point cloud data and judges whether a target part has been captured; if a target part has been captured, point cloud processing is carried out on the original three-dimensional point cloud data to obtain the part's spatial pose data and state information, which are cached as the processing result; if no target part has been captured, it is judged whether all parts have been photographed, the corresponding part spatial pose data and state information are obtained, and these are cached as the processing result; and the grabbing robot grabs the parts according to the processing result and outputs photographing and material-frame-replacing instructions. The method achieves a high grabbing success rate and a fast takt time.
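
The robot-side half of this cycle, as summarised above, can be sketched as follows. The message strings ("REQUEST_DATA", "TAKE_PHOTO", "CHANGE_FRAME"), the result keys and the send/receive/grab_at callables are illustrative assumptions, not the protocol defined in the patent.

```python
from typing import Callable


def robot_cycle(send: Callable[[str], None],
                receive: Callable[[], dict],
                grab_at: Callable[[tuple], None]) -> None:
    """One pick cycle: request data, grab if a pose was returned, else react to the state."""
    send("REQUEST_DATA")
    result = receive()                 # processing result from the industrial computer

    if result.get("pose") is not None:
        grab_at(result["pose"])        # move to the returned spatial pose and grip the part
        send("TAKE_PHOTO")             # ask for the next capture while the part is placed
    elif result.get("state") == "FRAME_EMPTY":
        send("CHANGE_FRAME")           # all parts grabbed: request a material-frame change
```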

Description

Technical field

[0001] The invention relates to the technical fields of industrial robots and machine vision, in particular to a 3D vision-based robot automatic grasping system and method.

Background technique

[0002] The traditional robot application is to control the robot to complete the predetermined command action through the teaching pendant. However, when the working environment and tasks of the robot change, the robot often cannot adapt to these changes quickly, which greatly limits the application scenarios and application efficiency of industrial robots.

[0003] With the development of machine vision technology, machine vision technology is gradually applied to industrial scenarios. However, traditional two-dimensional vision can only be applied to the recognition and positioning of planar objects, and cannot cope with three-dimensional scenes such as tilting and flipping of parts.

[0004] 3D cameras can obtain object space point cloud coordinates to make up fo...

Application Information

IPC(8): B25J9/08; B25J9/16; B25J15/08; B25J19/02
CPC: B25J9/08; B25J9/1679; B25J15/08; B25J19/023; Y02P90/02
Inventors: 陆坤, 葛楼云, 鲁小翔, 张瑞
Owner: CSIC PRIDE NANJING INTELLIGENT EQUIP SYST CO LTD