Depth camera hand-eye calibration method based on CALTag and point cloud information

A hand-eye calibration and depth camera technology, applied in image data processing, instruments, and computation, which addresses the problems of low grasping accuracy and deviation in the coordinate transformation.

Active Publication Date: 2019-12-10
XI AN JIAOTONG UNIV


Problems solved by technology

Since the point cloud coordinate system of the depth camera does not exactly coincide with the two-dimensional image coordinate system, there will be a certain deviation when directly applying the traditional monocular industrial camera hand-eye calibration method to the depth camera.
In scenes where objects with arbitrary poses must be grasped, the robotic arm needs to be able to accurately locate any position in the workspace. Existing hand-eye calibration methods cannot accurately determine the coordinate transformation between the point cloud coordinate system and the robot coordinate system, so grasping accuracy is low and the grasping requirements cannot be met.
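The stakes of this coordinate transformation can be illustrated with a minimal numpy sketch (all numbers hypothetical): a grasp point measured in the depth camera's point-cloud frame is mapped into the robot base frame by the hand-eye matrix X, and even a small rotational error in X displaces the commanded grasp point by an amount that grows with the point's distance from the rotation axis.

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical hand-eye matrix X: point-cloud (camera) frame -> robot base frame.
X = make_transform(rot_z(np.pi / 2), np.array([0.5, 0.0, 0.8]))

# A grasp point measured in the point-cloud frame, in metres (homogeneous coords).
p_cam = np.array([0.2, 0.1, 0.6, 1.0])
p_base = X @ p_cam                       # the point the robot must reach

# A 1-degree rotational error in the calibration shifts the commanded grasp
# point by an amount proportional to the point's distance from the axis.
X_bad = make_transform(rot_z(np.pi / 2 + np.deg2rad(1.0)), X[:3, 3])
err = np.linalg.norm(X_bad @ p_cam - p_base)
print(f"grasp deviation from a 1-degree calibration error: {err * 1000:.1f} mm")
```

Here the millimetre-scale deviation from a single degree of angular error motivates refining the calibration against the point cloud itself rather than only the 2-D image.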




Embodiment Construction

[0060] The present invention will be described in detail below in conjunction with the accompanying drawings and embodiments.

[0061] Referring to Figure 1, a hand-eye calibration method for depth cameras based on CALTag and point cloud information comprises the following steps:

[0062] Step 1), establish the mathematical model of the hand-eye calibration system:

[0063] In the hand-eye calibration system, the hand-eye relationship can be divided into the Eye-in-Hand and Eye-to-Hand models according to the relative arrangement of the camera and the robot. In the Eye-in-Hand model, the camera is mounted on the end of the robot, so it moves with the end-effector as the robot works. In the Eye-to-Hand model, the camera is fixed and remains stationary relative to the target while the robot moves.
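The hand-eye equation AX = XB can be derived from any two robot stations. The sketch below uses synthetic (hypothetical) poses for the Eye-in-Hand case, where X is the unknown gripper-to-camera transform; it illustrates the standard derivation, not the patent's specific procedure. Gripper poses would normally come from the controller's forward kinematics, and the board observations are fabricated here so the geometry is exact.

```python
import numpy as np

def rand_transform(rng):
    """Random proper rigid transform via a QR-based random rotation (sketch only)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    Q *= np.sign(np.linalg.det(Q))           # force det = +1
    T = np.eye(4)
    T[:3, :3] = Q
    T[:3, 3] = rng.normal(size=3)
    return T

rng = np.random.default_rng(0)
X = rand_transform(rng)                      # unknown gripper -> camera transform
T_base_target = rand_transform(rng)          # fixed calibration-board pose

# Two robot stations: gripper poses in the base frame (from forward kinematics).
T_bg = [rand_transform(rng), rand_transform(rng)]

# What the camera would observe at each station: board pose in the camera frame.
T_ct = [np.linalg.inv(T_bg[i] @ X) @ T_base_target for i in range(2)]

# One motion pair gives one hand-eye equation A X = X B.
A = np.linalg.inv(T_bg[1]) @ T_bg[0]         # relative gripper motion
B = T_ct[1] @ np.linalg.inv(T_ct[0])         # relative board motion in camera frame
print(np.allclose(A @ X, X @ B))             # the model holds: True
```

Because the board is stationary, the camera pose chain T_bg · X · T_ct is the same at both stations; eliminating the fixed board pose yields AX = XB.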

[0064] For the hand-eye system in which the "eye" is fixed externally, that is, the Eye-to-Hand model, a calibration bo...



Abstract

The invention discloses a depth camera hand-eye calibration method based on CALTag and point cloud information. The method comprises the steps of: building a mathematical model of hand-eye calibration and obtaining the hand-eye calibration equation AX = XB; using a CALTag calibration board in place of a traditional checkerboard calibration board, so that the recognition precision of the calibration board's pose is improved, and meanwhile calculating the matrix A in the hand-eye calibration equation; solving the matrix B by combining the obtained matrix A with the forward kinematics of the mechanical arm, and solving the hand-eye calibration equation AX = XB based on Lie group theory; and finally, using the obtained point cloud depth information, obtaining a calibration matrix better suited to a three-dimensional visual scene through a trust-region-reflective iterative optimization algorithm. The method can accurately determine the coordinate transformation between the point cloud coordinate system and the robot coordinate system, achieves high grasping precision, and is suitable for application scenes in which a mechanical arm grasps an object under three-dimensional vision.
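The Lie-group solution of AX = XB mentioned in the abstract can be sketched with the closed form of Park and Martin, one standard Lie-group method; the patent's exact algorithm may differ, and the synthetic data below is fabricated purely to check the solver. The rotation of X comes from aligning the axis-angle (Lie-algebra) vectors of the A and B rotations, and the translation from a linear least squares.

```python
import numpy as np

def rot_log(R):
    """Axis-angle vector (Lie-algebra element) of a rotation matrix.

    Assumes the rotation angle is away from pi, where this form is unstable.
    """
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * w / (2.0 * np.sin(theta))

def solve_ax_xb(As, Bs):
    """Closed-form solve of A_i X = X B_i in the style of Park & Martin."""
    # Rotation: R_X maps each beta_i = log(R_Bi) onto alpha_i = log(R_Ai);
    # solve the orthogonal Procrustes problem min sum ||R beta_i - alpha_i||^2.
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        M += np.outer(rot_log(A[:3, :3]), rot_log(B[:3, :3]))
    U, _, Vt = np.linalg.svd(M)
    R_X = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt  # keep det = +1
    # Translation: stack (R_Ai - I) t_X = R_X t_Bi - t_Ai, then least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t_X = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X

def rand_transform(rng):
    """Random proper rigid transform (for the synthetic check only)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    Q *= np.sign(np.linalg.det(Q))
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = Q, rng.normal(size=3)
    return T

# Synthetic check: pick a ground-truth X, fabricate motion pairs, recover X.
rng = np.random.default_rng(42)
X_true = rand_transform(rng)
As = [rand_transform(rng) for _ in range(3)]
Bs = [np.linalg.inv(X_true) @ A @ X_true for A in As]
X_est = solve_ax_xb(As, Bs)
print(np.allclose(X_est, X_true, atol=1e-5))
```

A point-cloud-based refinement stage, such as the trust-region iteration the abstract describes, would then start from this closed-form estimate.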

Description

Technical Field

[0001] The invention relates to the technical field of machine vision, and in particular to a hand-eye calibration method for a depth camera based on CALTag and point cloud information.

Background Technique

[0002] In industrial robot applications, the identification and grasping of target objects is the most common use of industrial robots in production. A robot system based on machine vision includes three parts: the vision part, the manipulator part, and the working environment. The calibration problem of the robot system is to solve the coordinate transformation relationship between the vision part and the manipulator part; the problem of determining the conversion relationship between the robot base coordinate system and the camera coordinate system is called the hand-eye calibration problem. The robot needs to determine its grasping pose according to the pose of the target object obtained by the camera. Since th...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/80
CPC: G06T7/85
Inventors: 陶唐飞 (Tao Tangfei), 王伟 (Wang Wei), 徐佳宇 (Xu Jiayu), 杨兴宇 (Yang Xingyu), 徐光华 (Xu Guanghua), 郑翔 (Zheng Xiang)
Owner: XI AN JIAOTONG UNIV