Hand-eye calibration method based on deep learning

A hand-eye calibration method based on deep learning, applied in the field of deep-learning-based hand-eye calibration. It addresses the problems that corner-point extraction may fail to recover the position of the calibration object, that calibration takes a long time, and that it requires a large amount of computation, and it achieves a more robust result.

Pending Publication Date: 2021-08-10
FUJIAN QUANZHOU HIT RES INSTIUTE OF ENG & TECH


Problems solved by technology

However, both approaches have defects in actual use. The former must obtain the center position and image size of the calibration object through corner-point calculation; if the camera's field of view is too small, corner points are difficult to extract, so the position of the calibration object cannot be obtained and robustness is poor. The latter mitigates the cantilever effect of the robotic arm through three-point calibration, but three-point calibration is required before each calibration, so manual intervention by staff with some expertise is still needed.


Image

  • Hand-eye calibration method based on deep learning

Examples


Embodiment Construction

[0034] As shown in Figure 1, the hand-eye calibration method based on deep learning includes the following steps:

[0035] A. Collect training images containing the calibration object, and mark the calibration-object area in each training image. Specifically: place the calibration object in different scenes and adjust its position so that it is photographed from different viewing angles (for example, with the calibration object in the middle of the training image, or with only part of the calibration object inside the image); by adjusting the camera, photograph the calibration object under different fields of view to obtain a sufficient number of training images; then use an image-labeling tool (such as the labelImg software) to mark the calibration-object area in each training image. The calibration object can be any object with a distinctive marker, which is prior art.
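The labeling step above can be sketched in code. labelImg saves annotations in Pascal VOC XML format; the snippet below parses one such annotation to recover the labeled calibration-object box. The class name `calibration_board` and the coordinates are illustrative assumptions, not values from the patent.

```python
# Sketch: reading a labelImg-style Pascal VOC annotation for one training
# image. The class name and coordinates below are hypothetical examples.
import xml.etree.ElementTree as ET

def parse_voc_annotation(xml_text):
    """Return (class_name, (xmin, ymin, xmax, ymax)) for each labeled box."""
    root = ET.fromstring(xml_text)
    boxes = []
    for obj in root.iter("object"):
        name = obj.findtext("name")
        bb = obj.find("bndbox")
        box = tuple(int(bb.findtext(k)) for k in ("xmin", "ymin", "xmax", "ymax"))
        boxes.append((name, box))
    return boxes

sample = """<annotation>
  <object>
    <name>calibration_board</name>
    <bndbox><xmin>120</xmin><ymin>80</ymin><xmax>360</xmax><ymax>300</ymax></bndbox>
  </object>
</annotation>"""

print(parse_voc_annotation(sample))  # [('calibration_board', (120, 80, 360, 300))]
```

A collection of such (image, box) pairs is the training set fed to the detector in step B.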

[0036] B. Use the training images to train the deep learning model...



Abstract

The invention provides a hand-eye calibration method based on deep learning. The method comprises the following steps: collecting and marking training images containing a calibration object; training a yolo3 model; setting a movement step length S and a distance threshold D; having the control device randomly adjust the position and posture of the mechanical arm while the camera shoots an adjustment image; calling the trained yolo3 model to analyze the adjustment image, and, if the size of the calibration object is not smaller than half of the camera's field of view and the actual distance is not larger than the distance threshold D, shooting a calibration image and recording the position and posture data of the mechanical arm at that moment, otherwise moving the mechanical arm by the step length S and shooting a new adjustment image before attempting a calibration image again; and, once the required number of calibration images has been collected, computing the hand-eye calibration matrix. No manual participation is needed in the process, which saves manpower and material resources; the amount of calculation is small, the operation speed is high, the shooting requirements for the calibration images are low, the result accuracy is high, and robustness is better.
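The acceptance test in the abstract (object at least half the field of view, distance within D, otherwise move by step S) can be sketched as a small predicate. Assumptions: the yolo3 detector returns a pixel bounding box (x1, y1, x2, y2), "half of the field of view" is interpreted per image axis, and the distance estimate and thresholds are illustrative values, not from the patent.

```python
# Sketch of the per-detection acceptance test, assuming the detector
# returns a pixel box (x1, y1, x2, y2). Thresholds are illustrative.

def accept_view(bbox, image_size, distance, D):
    """Accept a calibration shot when the detected calibration object fills
    at least half of the image along each axis and the estimated
    camera-to-object distance does not exceed the threshold D."""
    x1, y1, x2, y2 = bbox
    w, h = image_size
    big_enough = (x2 - x1) >= w / 2 and (y2 - y1) >= h / 2
    close_enough = distance <= D
    return big_enough and close_enough

# Accepted: the box covers at least half of each image axis and distance <= D.
print(accept_view((100, 60, 500, 350), (640, 480), 0.4, 0.5))  # True
# Rejected: object too small in the field of view -> move the arm by step S
# and shoot a new adjustment image before trying again.
print(accept_view((200, 150, 400, 300), (640, 480), 0.4, 0.5))  # False
```

In the described loop, a `False` result would trigger a move of the mechanical arm by the step length S followed by a fresh adjustment shot, repeating until enough calibration images are recorded.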

Description

technical field

[0001] The invention relates to a hand-eye calibration method based on deep learning.

Background technique

[0002] With the continuous improvement of the level of industrialization, vision-guided technology is applied more and more widely in the field of industrial robots. Through vision guidance, an industrial robot can not only perform preset actions but also locate the position of a grasping target through machine vision and perform a soft grasp. Hand-eye calibration of industrial robots is the basis for realizing machine-vision guidance: the conversion matrix between the camera coordinate system and the robot coordinate system is determined through hand-eye calibration. Most existing hand-eye calibration methods rely on manual intervention for data collection, that is, the robotic arm is moved manually so that the calibration object is located in a suitable position, and the posture of the roboti...
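The conversion matrix mentioned above is the unknown X in the classical hand-eye relation: for each pair of robot motions A_i (gripper frame) and camera motions B_i (camera frame), the fixed camera-to-gripper transform X satisfies A_i X = X B_i. The sketch below does not solve for X (that is what the calibration computes); it only verifies the relation on synthetic transforms, so all the numbers are illustrative assumptions.

```python
# Verifying the hand-eye relation A @ X == X @ B on synthetic 4x4
# homogeneous transforms. All numeric values here are made up.
import numpy as np

def transform(rx, ry, rz, t):
    """Homogeneous 4x4 transform from XYZ Euler angles and a translation."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = t
    return T

X = transform(0.1, -0.2, 0.3, [0.05, 0.00, 0.10])   # camera-to-gripper (assumed)
A = transform(0.4, 0.1, -0.3, [0.20, -0.10, 0.05])  # one gripper motion
B = np.linalg.inv(X) @ A @ X                        # induced camera motion
print(np.allclose(A @ X, X @ B))  # True: the hand-eye equation holds
```

In an actual calibration, many (A_i, B_i) pairs are collected, such as the calibration images and arm poses gathered in this method, and X is solved from the resulting system.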

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/66, G06T7/80, G06K9/62, B25J9/16
CPC: G06T7/80, G06T7/66, B25J9/1697, B25J9/16, B25J9/1628, G06F18/214
Inventors: 林文伟, 李瑞峰, 罗冠泰, 张陈涛, 汤思榕, 梁培栋, 赵紫阳
Owner: FUJIAN QUANZHOU HIT RES INSTIUTE OF ENG & TECH