
Online hand-eye calibration and grabbing pose calculation method for three-dimensional vision hand-eye system of four-degree-of-freedom parallel robot

A stereo vision and hand-eye calibration technology, applied in computing, manipulators, program-controlled manipulators, etc., which addresses the problem that the vertical component of a parallel robot's hand-eye transformation cannot be obtained accurately.

Active Publication Date: 2019-09-24
JIANGSU UNIV
Cites: 3 · Cited by: 21

AI Technical Summary

Problems solved by technology

Based on the vertical constraint between the calibration plate and the end clamping mechanism, the vertical component of the hand-eye calibration is corrected, which solves the problem that existing calculation methods based on the traditional hand-eye model cannot accurately obtain the vertical component for the 4-R(2-SS) parallel robot.
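As a rough illustration only (the patent text here does not give the formula), one way such a vertical constraint could be expressed, assuming the calibration plate is mounted on the end clamping mechanism with its normal along the vertical axis of the robot base frame, is:

$$
R^{b}_{c}\,\mathbf{n}^{c}_{\text{plate}} \approx \begin{bmatrix}0\\0\\1\end{bmatrix},
$$

where $\mathbf{n}^{c}_{\text{plate}}$ is the plate normal measured in the camera frame and $R^{b}_{c}$ is the camera-to-base rotation from hand-eye calibration; the residual of this constraint can then be used to correct the vertical (z) component of the calibrated transform. The symbols here are illustrative and are not taken from the patent.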



Examples


Embodiment

[0139] A specific embodiment is described by taking the new 4-R(2-SS) parallel robot fruit sorting system developed by our research group as an example, with white Rosa grape bunches as the grasping objects. The specific implementation is as follows:

[0140] 1. Improved stereo vision Eye-to-hand model with motion error compensation. The specific steps are as follows:

[0141] (1) Construction and improvement of the stereo vision Eye-to-hand model group. The present invention improves the basic Eye-to-hand model AX = XB based on the results of binocular calibration, combined with the relative pose of the color camera and the infrared camera. First, the color camera and the infrared camera are modeled separately, which gives:

[0142]
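The original formula image is not reproduced in this extract. A plausible shape for formula (1), assuming the standard Eye-to-hand relation AX = XB is written once per camera (the subscripts c for the color camera and d for the infrared camera are illustrative, not from the source), is:

$$
A_{c}X_{c}=X_{c}B,\qquad A_{d}X_{d}=X_{d}B \tag{1}
$$

where $A_{c}$ and $A_{d}$ are the inter-pose motions of the calibration plate as seen by the color and infrared cameras, $B$ is the corresponding motion of the end clamping mechanism obtained from the robot kinematics, and $X_{c}$, $X_{d}$ are the unknown hand-eye transforms of the two cameras.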

[0143] Based on the stereo vision model, formula (1) is transformed, and the improved stereo vision Eye-to-hand model group is obtained:

[0144]
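The formula image is again unavailable here. Under the same illustrative notation, and assuming the binocular calibration provides the color-to-infrared extrinsic $T_{dc}$ so that $X_{d}=T_{dc}X_{c}$, the improved model group could take the form:

$$
\begin{cases}
A_{c}X_{c}=X_{c}B,\\[2pt]
A_{d}\,T_{dc}X_{c}=T_{dc}X_{c}\,B,
\end{cases}
$$

which leaves a single unknown $X_{c}$ constrained by both cameras. This is a sketch of the idea, not the patent's exact formulation.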

[0145] (2) Improved Eye-to-hand model with motion error com...



Abstract

The invention discloses an online hand-eye calibration and grabbing pose calculation method for the three-dimensional vision hand-eye system of a four-degree-of-freedom parallel robot. The method comprises the following steps: first, an Eye-to-hand basic model, in which the cameras are fixed outside the robot body, and a three-dimensional vision model accounting for nonlinear distortion are constructed for the hand-eye system; next, the pose relationship between calibration motions of the end clamping mechanism is calibrated, and a non-trivial-solution constraint of the Eye-to-hand model is constructed to remove invalid poses from the calibration motion, so as to plan the hand-eye calibration motion of the end clamping mechanism of the parallel robot; finally, a parallel robot grabbing module with error compensation is constructed by using the robot motion errors obtained from hand-eye calibration, and the grabbing pose of the end clamping mechanism is calculated based on three-dimensional vision and the 4-R(2-SS) kinematics. The method effectively improves the precision and efficiency of online hand-eye calibration for the three-dimensional vision hand-eye system of the four-degree-of-freedom 4-R(2-SS) parallel robot, and thus facilitates accurate and rapid grabbing by the parallel robot.
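For orientation only, the following minimal Python sketch shows a generic Eye-to-hand AX = XB calibration using OpenCV's `cv2.calibrateHandEye` solver. It is not the patent's improved method (it has no stereo model group, no motion error compensation, and no vertical-constraint correction), and all variable names are illustrative.

```python
import numpy as np
import cv2


def invert_rt(R, t):
    """Invert a rigid transform given as a rotation matrix and translation vector."""
    R_inv = R.T
    t_inv = -R_inv @ t
    return R_inv, t_inv


def eye_to_hand_calibration(R_gripper2base, t_gripper2base,
                            R_target2cam, t_target2cam):
    """Generic AX = XB Eye-to-hand calibration (camera fixed outside the robot).

    R_gripper2base / t_gripper2base : per-pose rotation (3x3) and translation (3,)
        of the end clamping mechanism in the robot base frame (from kinematics).
    R_target2cam / t_target2cam : per-pose rotation and translation of the
        calibration plate in the camera frame (e.g. from solvePnP).
    Returns the camera pose expressed in the robot base frame.
    """
    # For the Eye-to-hand configuration, OpenCV's solver expects the
    # base-to-gripper transforms; the result is then the camera-to-base transform.
    R_base2gripper, t_base2gripper = [], []
    for R, t in zip(R_gripper2base, t_gripper2base):
        R_inv, t_inv = invert_rt(R, t)
        R_base2gripper.append(R_inv)
        t_base2gripper.append(t_inv)

    R_cam2base, t_cam2base = cv2.calibrateHandEye(
        R_base2gripper, t_base2gripper,
        R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    return R_cam2base, t_cam2base
```

In practice, one such solve would be run per camera (color and infrared), after which the binocular extrinsics and any additional constraints could be applied; those steps are outside this sketch.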

Description

Technical field

[0001] The present invention relates to the field of machine vision, and in particular to an online hand-eye calibration and grabbing pose calculation method, based on machine vision and image processing, for the stereo vision hand-eye system of a four-degree-of-freedom 4-R(2-SS) parallel robot (where R denotes a revolute joint, S denotes a spherical pair, and 4-R(2-SS) denotes a parallel robot consisting of four R(2-SS) branches with the same kinematic structure). The method is used by the parallel robot to realize automatic sorting of stacked cluster fruits based on stereo vision.

Background technique

[0002] In recent years, China's fruit production has grown rapidly, and traditional manual sorting methods have struggled to meet the needs of modern agricultural production. Automatic fruit sorting based on robot technology is of great significance to the automation, scaling-up, and precision of agricultural production and agricultural product processing. In the robot-based automatic fruit...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/80, G06T7/70, G06T1/00, B25J9/16
CPC: G06T7/85, G06T7/70, G06T1/0014, B25J9/1697, B25J9/1612, B25J9/1679, G06T2207/30164, G06T2207/10012, G06T2207/10024, Y02T10/40
Inventor: 高国琴, 张千
Owner: JIANGSU UNIV