Robot high-precision assembling method based on visual servo

A visual-servoing-based assembly technology, applicable to manipulators, program-controlled manipulators, manufacturing tools, and related fields. It solves the problems of positioning deviation and inaccuracy, and achieves a remarkable effect by avoiding such errors.

Active Publication Date: 2021-06-04
ROKAE SHANDONG INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0005] In some assembly scenarios that require high precision, simple vision-guided positioning cannot fully solve the problem.
First, computing the initial pose of the workpiece before manipulation and its target pose after manipulation in the robot coordinate system depends entirely on the calibration between the camera and the robot; this calibration is affected by the robot's absolute positioning error and by errors in computing the visual feature points, and is therefore not accurate. Second, the workpiece may shift slightly when it is grasped by the robot gripper, causing a positioning deviation.
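
The first issue can be made concrete with a short sketch (not from the patent; the helper function and the numbers are illustrative): the workpiece pose in the robot frame is obtained by chaining the hand-eye calibration with the camera-frame measurement, so any calibration error passes straight through to the result, and nothing feeds back to correct it.

```python
# Illustrative only: open-loop, calibration-based pose estimation inherits
# hand-eye calibration error. All values below are made up for the example.
import numpy as np

def pose(rz_deg: float, t) -> np.ndarray:
    """4x4 homogeneous transform: rotation about z by rz_deg, translation t."""
    rz = np.deg2rad(rz_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(rz), -np.sin(rz)], [np.sin(rz), np.cos(rz)]]
    T[:3, 3] = t
    return T

# Ground-truth chain: robot <- camera <- workpiece.
T_robot_camera_true = pose(0.0, [0.500, 0.200, 0.800])
T_camera_workpiece  = pose(30.0, [0.050, -0.020, 0.600])

# Calibrated hand-eye transform with a small rotation/translation error.
T_robot_camera_cal = pose(0.5, [0.502, 0.199, 0.801])

# Open-loop estimate: the calibration error maps directly into the workpiece
# pose used for grasping or placing, and no feedback corrects it.
T_est  = T_robot_camera_cal  @ T_camera_workpiece
T_true = T_robot_camera_true @ T_camera_workpiece
print("position error [m]:", np.linalg.norm(T_est[:3, 3] - T_true[:3, 3]))
```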

Examples

Detailed Description of the Embodiments

[0038] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0039] The present invention proposes a high-precision robot assembly method based on visual servoing. The method uses the following hardware: a workbench, a second camera CA, a first camera CB, an industrial computer, a mechanical arm, a workpiece and an assembly groove, and visual marks. As shown in Figure 2, the workbench is a plane: one area holds the workpieces to be assembled, and the other holds the assembly grooves. The robotic arm needs to place the workpiece to be assembled into the assembly groove accurately.
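
As a reading aid only, the workcell layout in this paragraph can be written down as a small configuration record; the field names and region coordinates below are assumptions for illustration, not part of the patent.

```python
# Illustrative configuration of the workcell described in paragraph [0039].
# Field names and coordinates are assumptions made for readability only.
from dataclasses import dataclass

@dataclass
class Workcell:
    arm: str                  # mechanical arm driven by the industrial computer
    first_camera: str         # first camera CB: the arm presents the grasped workpiece above it
    second_camera: str        # second camera CA: observes the assembly-groove area
    grab_area: tuple          # workbench region holding workpieces to be assembled
    groove_area: tuple        # workbench region holding the assembly grooves

cell = Workcell(
    arm="mechanical-arm",
    first_camera="CB",
    second_camera="CA",
    grab_area=(0.0, 0.0, 0.4, 0.3),    # illustrative (x, y, width, height) in meters
    groove_area=(0.5, 0.0, 0.4, 0.3),
)
```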

[004...

Abstract

The invention provides a robot high-precision assembling method based on visual servo. The method proceeds as follows: teaching and setting work is performed first. After teaching is completed, the mechanical arm moves to the grabbing area and grabs a workpiece. The arm then moves above the first camera, and a visual servo motion is performed with the first camera and the arm so that the workpiece reaches position M_b in the image; the relative pose T_d of the arm with respect to the pose T_b is recorded. The second camera photographs the assembly groove and its pose M_c in the image is calculated; the expected pose M_e of the visual mark in the image is then computed from M_c. The arm moves into the field of view of the second camera, and a visual servo motion is performed with the second camera and the arm until the pose of the visual mark in the image equals M_e. The relative pose T_d is converted into the coordinate system of the arm's end effector at that moment and the arm executes this motion; finally, the arm executes the relative pose T_c, so that the workpiece is accurately placed into the assembly groove.
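
Read as pseudocode, the flow above might look like the following sketch. It is a minimal interpretation, not the patent's implementation: the robot and camera helpers (robot.move_to, robot.nudge, robot.end_pose, the detect_* callables, expected_mark_pose) are assumed interfaces, the servo law is a simple proportional correction in image space, and the frame compositions for T_d and T_c follow one plausible reading of the abstract.

```python
# Minimal sketch of the assembly flow summarized in the abstract; all helper
# interfaces are assumptions, not the patent's API.
import numpy as np

def visual_servo(detect, target, nudge, gain=0.5, tol=1e-3, max_iter=200):
    """Image-based servo loop: detect() returns the current feature pose in the
    image (e.g. u, v, theta); nudge() applies a small corrective arm motion
    proportional to the remaining image error."""
    target = np.asarray(target, dtype=float)
    for _ in range(max_iter):
        error = target - np.asarray(detect(), dtype=float)
        if np.linalg.norm(error) < tol:
            return
        nudge(gain * error)
    raise RuntimeError("visual servo did not converge")

def assemble(robot, cam_first, cam_second, M_b, T_b, T_c, expected_mark_pose):
    """One grab-and-place cycle following the steps of the abstract.

    robot              : assumed interface with move_to(name), move(T), nudge(e),
                         end_pose() -> 4x4 homogeneous transform, and grasp()
    M_b, T_b, T_c      : quantities taught during the setup phase (per the abstract)
    expected_mark_pose : maps the groove image pose M_c to the mark target M_e
    """
    # Grab a workpiece in the grabbing area.
    robot.move_to("grab_area")
    robot.grasp()

    # Servo above the first camera until the grasped workpiece appears at M_b
    # in the image, then record T_d: the current end pose relative to T_b.
    robot.move_to("above_first_camera")
    visual_servo(cam_first.detect_workpiece, M_b, robot.nudge)
    T_d = np.linalg.inv(T_b) @ robot.end_pose()

    # Photograph the assembly groove with the second camera, compute its image
    # pose M_c, and derive the expected image pose M_e of the visual mark.
    M_c = cam_second.detect_groove()
    M_e = expected_mark_pose(M_c)

    # Servo in the second camera's field of view until the mark reaches M_e.
    robot.move_to("second_camera_view")
    visual_servo(cam_second.detect_mark, M_e, robot.nudge)

    # Express the recorded relative pose T_d in the current end-effector frame
    # and execute it, then execute the relative pose T_c to set the workpiece
    # down into the assembly groove.
    robot.move(robot.end_pose() @ T_d)
    robot.move(robot.end_pose() @ T_c)
```

The design intent this preserves is that both alignment steps close the loop on image-space feedback, so the final placement accuracy does not rest on the absolute camera-to-robot calibration that paragraph [0005] identifies as the main error source.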

Description

Technical Field

[0001] The invention relates to the technical field of industrial robots, and in particular to a high-precision robot assembly method based on visual servoing.

Background

[0002] With rising labor costs, automation has surged in the assembly-line field. In modern automated production lines, industrial robots usually carry out the "grab-and-place" assembly action. To complete an assembly task, the robot must know the initial pose of the workpiece before manipulation and its target pose after manipulation.

[0003] In simple application scenarios, the initial pose and target pose of the workpiece are specified in advance, and the robot simply follows a fixed program.

[0004] In most practical application scenarios, the initial pose or target pose of the workpiece is not strictly fixed. In such cases, vision-guided positioning is an ideal solution. The robot perceives the cha...

Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16
CPC: B25J9/1697
Inventors: 袁顺宁, 张彪, 韩峰涛, 曹华, 庹华, 耿旭达, 李亚楠, 任赜宇
Owner: ROKAE SHANDONG INTELLIGENT TECH CO LTD