
Point cloud collision detection method applied to robot grabbing scene

A collision detection and robotics technology, applied in the field of robot vision, which can solve problems such as unstable grasping

Active Publication Date: 2020-12-11
FOSHAN LONGSHEN ROBOT

AI Technical Summary

Problems solved by technology

[0004] The present invention provides a point cloud collision detection method for a robot grabbing scene in order to overcome the current technical defect that the robot's gripper collides with other workpieces during the grabbing process, resulting in unstable grabbing.



Examples


Embodiment 1

[0071] As shown in Figure 1, a point cloud collision detection method applied to a robot grabbing scene includes the following steps:

[0072] S1: Construct the bounding box model of the robot gripper, and collect the point cloud data of the workpiece area through the robot camera;

[0073] S2: Establish the robot coordinate system, the vertex coordinate system of each vertex of the bounding box model and the point coordinate system of each point in the point cloud, and obtain the homogeneous transformation matrix of each vertex coordinate system and each point coordinate system in the robot coordinate system;

[0074] S3: Obtain the coordinates of each vertex of the bounding box model and each point of the point cloud in the robot coordinate system according to the homogeneous transformation matrix;

[0075] S4: According to the coordinates of each vertex of the bounding box model and each point of the point cloud in the robot coordinate system, judge the relationship between each point of the point cloud and the bounding box model, and obtain the number of point cloud points located inside the bounding box model;

S5: Compare the number of point cloud points inside the bounding box model with a preset threshold so as to detect in advance whether the gripper will collide with an actual object; a minimal code sketch of these steps follows.
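The Python below is a minimal sketch of steps S1 to S5 under simplifying assumptions, not the patent's implementation: the gripper bounding box is taken to be an axis-aligned box described in its own frame, the 4x4 homogeneous transforms for the gripper box and the camera cloud are assumed to be already known (for example from hand-eye calibration and the robot controller), and NumPy is the only dependency. All function names, box dimensions, and the threshold value are illustrative.

```python
# Minimal sketch of the pipeline (assumptions: axis-aligned gripper box,
# known 4x4 homogeneous transforms; names and values are illustrative).
import numpy as np


def gripper_bounding_box(length: float, width: float, height: float) -> np.ndarray:
    """S1: eight vertices of a box around the gripper, in the gripper's own frame."""
    half = np.array([length, width, height]) / 2.0
    signs = np.array([[sx, sy, sz] for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)])
    return signs * half                      # shape (8, 3)


def transform_to_robot_frame(points: np.ndarray, T: np.ndarray) -> np.ndarray:
    """S2-S3: express local points (N x 3) in the robot base frame via a 4x4 transform."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])   # (N, 4)
    return (T @ homogeneous.T).T[:, :3]


def count_points_in_box(cloud_robot: np.ndarray, box_vertices_robot: np.ndarray) -> int:
    """S4: count cloud points inside the axis-aligned hull of the transformed box vertices."""
    lo = box_vertices_robot.min(axis=0)
    hi = box_vertices_robot.max(axis=0)
    inside = np.all((cloud_robot >= lo) & (cloud_robot <= hi), axis=1)
    return int(inside.sum())


def gripper_would_collide(box_local, cloud_local, T_box, T_cloud, threshold=50) -> bool:
    """S5: flag a collision when the in-box point count exceeds a preset threshold."""
    box_robot = transform_to_robot_frame(box_local, T_box)
    cloud_robot = transform_to_robot_frame(cloud_local, T_cloud)
    return count_points_in_box(cloud_robot, box_robot) > threshold


if __name__ == "__main__":
    box = gripper_bounding_box(0.10, 0.04, 0.06)      # 10 cm x 4 cm x 6 cm jaw envelope
    cloud = np.random.rand(5000, 3) * 0.5             # stand-in for camera point cloud data
    identity = np.eye(4)                              # placeholder transforms
    print(gripper_would_collide(box, cloud, identity, identity, threshold=50))
```

With an axis-aligned box the in-box test reduces to a per-axis range check; a tighter oriented bounding box would instead test each point against the box's face planes, at slightly higher cost.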



Abstract

The invention provides a point cloud collision detection method applied to a robot grabbing scene. The method comprises the following steps: S1, constructing a bounding box model of a robot clamping jaw, and collecting point cloud data in a workpiece region; S2, establishing a robot coordinate system, a vertex coordinate system for each vertex of the bounding box model, and a point coordinate system for each point of the point cloud, and obtaining the corresponding homogeneous transformation matrices; S3, obtaining the coordinates of each vertex of the bounding box model and of each point of the point cloud in the robot coordinate system according to the homogeneous transformation matrices; S4, judging the relationship between each point of the point cloud and the bounding box model, and obtaining the number of point cloud points located inside the bounding box model; and S5, comparing that number with a preset threshold so as to detect in advance whether the clamping jaw will collide with an actual object. The method solves the problem of unstable grabbing caused by the clamping jaw colliding with other workpieces while the robot grabs a workpiece.
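Step S3 maps local coordinates into the robot base frame through a 4x4 homogeneous transformation matrix. The display below is a conventional textbook form of that relation, with notation assumed rather than taken from the patent; R denotes the robot base frame and L a local frame (a bounding-box vertex frame or a point frame).

```latex
% Conventional homogeneous-transformation relation (notation assumed, not from the patent):
% a point p expressed in a local frame L is mapped into the robot base frame R.
{}^{R}\tilde{p} \;=\; {}^{R}T_{L}\,{}^{L}\tilde{p},
\qquad
{}^{R}T_{L} \;=\;
\begin{bmatrix}
R_{3\times 3} & t_{3\times 1}\\
0_{1\times 3} & 1
\end{bmatrix},
\qquad
{}^{L}\tilde{p} \;=\;
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}
```

Here the 3x3 block R is the rotation and t the translation of frame L relative to the robot base; applying the same kind of transform to every bounding-box vertex and every cloud point places both sets in the common robot frame needed by steps S4 and S5.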

Description

technical field

[0001] The present invention relates to the technical field of robot vision and, more specifically, to a point cloud collision detection method applied to a robot grabbing scene.

Background technique

[0002] Driven by the need to reduce labor costs and by the development of robotics and computer vision technology, the proportion of robots used in production processes keeps increasing. 3D-vision-guided robot grasping is a key technology for realizing intelligent robotic production. At present, owing to the complexity of the production environment and the instability of 3D visual recognition, the 3D vision system may select an underlying workpiece, or a clamping position where workpieces interfere with one another, so that the jaws hit other workpieces during grasping and cause collisions, resulting in unstable gripping.

[0003] In the prior art, such as the Chinese patent disclosed on July 12, 2019, a robot moti...


Application Information

IPC (8): B25J9/16; G06T1/00; G06T7/70; G06T17/00
CPC: B25J9/163; B25J9/1676; G06T1/0014; G06T17/00; G06T2207/10028; G06T2207/30164; G06T2210/12; G06T2210/21; G06T7/70
Inventor: 汪良红, 王辉, 陈新, 许藤
Owner: FOSHAN LONGSHEN ROBOT