
Robot positioning and grabbing method and system based on laser visual guidance

A robot positioning and visual guidance technology, applied in the field of robotics, which solves the problem of a gripper being unable to support the bottom of an object and achieves the effects of a reasonable structure that is easy to promote and use.

Active Publication Date: 2021-11-12
广州市斯睿特智能科技有限公司

AI Technical Summary

Problems solved by technology

[0004] The invention proposes a robot positioning and grasping method and system based on laser vision guidance, which solves the problem that a robot in the prior art cannot effectively support the bottom of an object while grasping it.




Embodiment Construction

[0032] The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention.

[0033] Referring to Figures 1-7, a robot positioning system based on laser vision guidance includes a base 1, a host computer 37 installed on top of the base 1, a robot arm 2, and a grabbing mechanism installed on the robot arm 2. The output end of the host computer 37 is connected to the input end of the robot arm 2 and is electrically connected with the input end of the grabbing mechanism;

[0034] The grabbing mechanism includes a top plate 3 installed on the robot arm 2, a visual positioning device 5 installed on the bottom of the top plate 3, a fixed plate 4, and a mounting plate 10 movably installed on the fixed plate 4. The positioning device 5 ...
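For readers who prefer a software view of this embodiment, the control topology described above can be sketched as a small set of interfaces: the host computer 37 drives both the robot arm 2 and the grabbing mechanism, while the visual positioning device 5 under the top plate supplies target measurements. The Python sketch below is purely illustrative; none of the class or method names come from the patent, and real devices would expose vendor-specific interfaces.

    from dataclasses import dataclass
    from typing import Protocol, Tuple

    # A target pose, simplified to an (x, y, z) position in the robot base frame.
    Pose = Tuple[float, float, float]


    class VisualPositioningDevice(Protocol):
        """Vision/laser sensor under the top plate 3 (hypothetical interface)."""
        def measure_target(self) -> Pose: ...


    class RobotArm(Protocol):
        """Robot arm 2; driven by commands from the host computer 37."""
        def move_to(self, pose: Pose) -> None: ...
        def lift(self) -> None: ...


    class GrabbingMechanism(Protocol):
        """Grabbing mechanism with a clamp and a movable support plate (hypothetical interface)."""
        def close_clamp(self) -> None: ...
        def open_clamp(self) -> None: ...
        def extend_support_plate(self) -> None: ...
        def retract_support_plate(self) -> None: ...


    @dataclass
    class HostComputer:
        """Host computer 37: its output end drives the robot arm and the grabbing mechanism."""
        vision: VisualPositioningDevice
        arm: RobotArm
        gripper: GrabbingMechanism

        def locate_target(self) -> Pose:
            # The visual positioning device measures the target and reports its position.
            return self.vision.measure_target()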


Abstract

The invention relates to the technical field of robots, and in particular to a robot positioning and grabbing method and system based on laser visual guidance. It provides the following scheme for the prior-art problem that, when a robot grabs an object, the bottom of the object cannot be effectively supported. The robot positioning and grabbing method comprises the following steps: S1, a target object is measured by a visual positioning device and its position is determined; S2, after the position of the target object is determined, the robot arm is started by the host computer and the target object is grabbed; and S3, after the object is grabbed, the target object is lifted and, at the same time, its bottom is supported. The method and system have a reasonable structure: the target object can be grabbed accurately and conveniently, its bottom can be supported by a supporting plate after it is grabbed, and the supporting plate can be moved away from under the target object before it is put down, so that the object can be set down smoothly. The method and system are easy to promote and apply.
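Read as a control sequence, steps S1 to S3 amount to: measure the target, move in and grab it, lift it while extending the support plate under it, and withdraw the plate only just before the object is set down. The Python sketch below restates that sequence under the assumption that the vision device, arm, and gripper expose the placeholder methods shown; these names are illustrative and are not defined by the patent.

    def grab_and_transfer(vision, arm, gripper, place_pose):
        """Hypothetical restatement of steps S1-S3 from the abstract above.

        The vision, arm, and gripper objects are assumed to expose the
        placeholder methods called below; none of these names are defined
        by the patent itself.
        """
        # S1: measure the target with the visual positioning device and determine its position.
        target_pose = vision.measure_target()

        # S2: the host computer starts the robot arm and the target is grabbed.
        arm.move_to(target_pose)
        gripper.close_clamp()

        # S3: lift the target and, at the same time, support its bottom with the support plate.
        arm.lift()
        gripper.extend_support_plate()

        # Before the object is put down, move the support plate away from under it
        # so that the object can be set down smoothly.
        arm.move_to(place_pose)
        gripper.retract_support_plate()
        gripper.open_clamp()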

Description

Technical Field

[0001] The invention relates to the field of robots, and in particular to a method and system for robot positioning and grasping based on laser vision guidance.

Background Technique

[0002] At present, industrial robots are widely used, and it is common to use robots to grab products in industrial production. Applying a vision system to a robot can effectively improve the robot's grasping accuracy during use. The common visual measurement technology is stereo vision, including binocular vision and laser line scanning. Both of these technical solutions can effectively position the robot so as to ensure grasping accuracy.

[0003] However, when a robot in the prior art grabs an object, it often uses a clamp to grip the object and cannot at the same time support the bottom of the object, so the object is likely to slip and fall during transfer. Therefore, this proposal proposes...
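Whichever measurement technique is used (binocular stereo or laser line scanning), the sensor reports the target position in its own coordinate frame, and the robot needs that position in its base frame before it can plan a grasp; this is normally done with a hand-eye calibration transform. The snippet below is a generic illustration of that coordinate transformation, assuming a known 4x4 homogeneous transform; it is not taken from the patent.

    import numpy as np

    def camera_point_to_base(point_cam, T_base_cam):
        """Map a 3D point measured in the camera/sensor frame into the robot base frame.

        point_cam:  (3,) point from the vision measurement, in metres.
        T_base_cam: (4, 4) homogeneous transform from camera frame to base frame,
                    obtained from a hand-eye calibration (values below are assumptions).
        """
        p = np.append(np.asarray(point_cam, dtype=float), 1.0)  # homogeneous coordinates
        return (T_base_cam @ p)[:3]

    # Example with an assumed calibration: camera 0.5 m above the base, axes aligned.
    T_base_cam = np.array([
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.5],
        [0.0, 0.0, 0.0, 1.0],
    ])
    print(camera_point_to_base([0.1, -0.2, 0.3], T_base_cam))  # -> [ 0.1 -0.2  0.8]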

Claims


Application Information

IPC (8): B25J9/16, B25J18/00, B25J19/04
CPC: B25J9/1697, B25J18/00, B25J19/04
Inventors: 贾春英, 粟子谷, 彭坤旺
Owner: 广州市斯睿特智能科技有限公司