
Robot visual guide positioning algorithm

A robot vision-guided positioning technology, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It solves problems such as the inability to correct the pose of offset material, achieving effects such as improved material-grabbing efficiency, a smoother production rhythm, and cost savings.

Inactive Publication Date: 2019-06-07
浙江启成智能科技有限公司
Cites: 9 · Cited by: 8

AI Technical Summary

Problems solved by technology

[0002] In the current robot vision-guided positioning method, a smart camera first photographs the grasped material and obtains the material's coordinates. These coordinates are imported into the robot's workpiece coordinate system, and the robot's grasping pose is corrected so that the fixture's gripping position remains unchanged relative to the material, thereby guiding the robot to locate, grab, and accurately place the material into the precise position of the mold. When the material's position at this step cannot be photographed by the camera, the existing technical solution cannot correct the robot's pose for the offset material.
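As a rough illustration of this conventional flow (a sketch of my own, not the patent's implementation), the snippet below imports a material pose measured in the camera frame into the robot's workpiece coordinate system through a hand-eye calibration transform, then re-derives the grasp pose so the fixture's gripping position stays fixed relative to the material. Planar (2D) poses are assumed for brevity; pose_to_matrix, T_work_cam, and all numbers are hypothetical.

    import numpy as np

    def pose_to_matrix(x, y, theta):
        # Planar pose (x, y, rotation theta in radians) as a 3x3 homogeneous matrix.
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, x],
                         [s,  c, y],
                         [0.0, 0.0, 1.0]])

    # Hypothetical hand-eye calibration: camera frame -> robot workpiece frame,
    # obtained offline (e.g. by photographing a calibration target).
    T_work_cam = pose_to_matrix(200.0, 50.0, np.deg2rad(90.0))

    # Material pose measured by the smart camera, in the camera frame.
    T_cam_mat = pose_to_matrix(12.5, -3.0, np.deg2rad(4.2))

    # Import the measurement into the workpiece coordinate system ...
    T_work_mat = T_work_cam @ T_cam_mat

    # ... and re-derive the grasp target so the fixture's gripping position
    # stays unchanged relative to the (possibly shifted) material.
    T_mat_grasp = pose_to_matrix(0.0, -25.0, 0.0)  # taught grasp, relative to material
    T_work_grasp = T_work_mat @ T_mat_grasp        # corrected grasp pose for the robot
    print(np.round(T_work_grasp, 3))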


Image

Three patent drawings, each captioned "Robot visual guide positioning algorithm".

Examples


Embodiment Construction

[0028] Referring to Figures 1, 2, 3, 4 and 5, a robot visual guidance positioning algorithm of the present invention comprises the following steps:

[0029] Step s1: The robot grabs the material; at this point the pose of the material relative to the fixture deviates from the original material's pose relative to the fixture;

[0030] Step s2: The robot places the grabbed material under the smart camera, which photographs it;

[0031] Step s3: The camera calculates the pose of the current material in the camera coordinate system, and transmits the pose data of the original material and the current material to the robot;

[0032] Step s4: The robot converts the poses between coordinate systems and corrects the robot TCP (see the sketch after step s5);

[0033] Step s5: The robot uses the corrected TCP to accurately place the material into the precise position of the mold.
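The coordinate-system conversion of step s4 can be made concrete with a minimal planar sketch, assuming homogeneous transforms and one common composition convention; the correct_tcp helper, the frame names, and all numbers are illustrative assumptions, not the patent's actual controller logic.

    import numpy as np

    def pose_to_matrix(x, y, theta):
        # Planar pose (x, y, rotation theta in radians) as a 3x3 homogeneous matrix.
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, x],
                         [s,  c, y],
                         [0.0, 0.0, 1.0]])

    def correct_tcp(T_flange_tcp, T_cam_orig, T_cam_curr):
        # Steps s3/s4: from the original and current material poses measured in
        # the camera frame, recover the material's offset in the gripper and
        # fold it into the TCP. The camera frame cancels in the product, so the
        # offset is expressed relative to the original material pose.
        T_offset = np.linalg.inv(T_cam_orig) @ T_cam_curr
        return T_flange_tcp @ T_offset

    # Illustrative numbers only:
    T_flange_tcp = pose_to_matrix(0.0, 100.0, 0.0)            # nominal TCP in flange frame
    T_cam_orig   = pose_to_matrix(10.0, 20.0, 0.0)            # taught material pose (camera frame)
    T_cam_curr   = pose_to_matrix(13.0, 18.5, np.deg2rad(3))  # measured pose after offset grasp

    T_tcp_new = correct_tcp(T_flange_tcp, T_cam_orig, T_cam_curr)
    print(np.round(T_tcp_new, 3))  # step s5 commands the taught mold pose with this corrected TCP

With the corrected TCP, moving to the originally taught mold pose compensates for the material's offset in the gripper, which is why no conveyor belt or support re-registration is needed.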

[0034] Preferably, the step s5 includes the following steps: step s51: ob...


PUM

No PUM

Abstract

The invention discloses a robot visual guide positioning algorithm comprising the following steps: S1, a robot grabs a material, and the pose of the material relative to the clamp is offset from that of the original material relative to the clamp; S2, the robot places the grabbed material under an intelligent camera for photographing; S3, the camera calculates the pose of the current material in the camera coordinate system and transmits the pose data of the original material and the current material to the robot; S4, the robot converts the poses between coordinate systems and corrects its TCP; and S5, using the corrected TCP, the robot accurately places the material into the precise position of a mold. The algorithm effectively completes the visual guide positioning solution for industrial robots: the robot grabs and positions the material without a conveyor belt or support, which saves cost and improves both material-grabbing efficiency and accuracy.

Description

【Technical field】 [0001] The invention relates to the technical field of industrial robots, in particular to robot vision-guided positioning algorithms. 【Background technique】 [0002] In the current robot vision-guided positioning method, the smart camera first photographs the grasped material and obtains the material coordinates, which are imported into the robot's workpiece coordinate system; the robot's grasping pose is then corrected so that the fixture's gripping position remains unchanged relative to the material, thereby guiding the robot to locate, grab, and accurately place the material into the precise position of the mold. When the position of the material at this step cannot be photographed by the camera, the existing technical solution cannot correct the robot's pose for the offset material. 【Content of invention】 [0003] The purpose of the present invention is to solve the problems in the prior ar...


Application Information

Patent Type & Authority: Application (China)
IPC (8): B25J9/16
Inventors: 王川, 杨凡, 苑诗宾, 曾自立, 刘道德, 敬强, 牟威
Owner: 浙江启成智能科技有限公司