Visual positioning method and device based on two-dimensional code and computer readable storage medium

A visual positioning technology based on a two-dimensional code, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It solves problems such as large visual positioning error, and achieves the effects of improving accuracy, reducing labor cost, and speeding up the production rhythm.

Pending Publication Date: 2022-06-07
山东新松工业软件研究院股份有限公司 (Shandong SIASUN Industrial Software Research Institute Co., Ltd.)


Problems solved by technology

[0004] In view of this, in order to solve the problem of excessive visual positioning error, the present invention provides a two-dimensional code-based visual positioning method.



Examples


Embodiment 1

[0044] Figure 1 is a flowchart of the two-dimensional code-based visual positioning method in the first embodiment of the present invention. As shown in Figure 1, this embodiment provides a two-dimensional code-based visual positioning method, the steps of which include:

[0045] S1. Establish a first coordinate system with the center point 2 of the two-dimensional code 1 as the origin;

[0046] S2. Obtain the coordinates, in the first coordinate system, of the projection point of the tool center point of the robot at the grasping pose, together with the rotation angle of the gripper;

[0047] S3. In the image plane of the camera at the photographing and teaching point, establish a second coordinate system with the image center point 3 of the photographing and teaching point as the origin;

[0048] S4. Obtain the coordinates of the center point 2 of the two-dimensional code 1 in the second coordinate system and the rotation angle of the two-dimensional code;

[0049] S5. Calculate the actual pose, relative to the second coordinate system, of the projection point of the tool center point of the robot at the grasping pose.
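The pose computation in S5 can be viewed as composing two planar poses: the pose of the two-dimensional code in the second (image) coordinate system, observed at run time (S4), and the pose of the tool-center projection in the first (two-dimensional-code) coordinate system, measured at teaching time (S2). The following is a minimal illustrative sketch of such a 2D pose composition, not the patent's exact formula; the function and variable names are assumptions:

```python
import math

def compose_pose(qr_in_image, tool_in_qr):
    """Compose two planar poses (x, y, theta), theta in radians.

    qr_in_image: pose of the two-dimensional code in the second
                 (image) coordinate system, from S4.
    tool_in_qr:  pose of the tool-center projection in the first
                 (two-dimensional-code) coordinate system, from S2.
    Returns the pose of the tool-center projection in the second
    coordinate system (S5).
    """
    x1, y1, t1 = qr_in_image
    x0, y0, t0 = tool_in_qr
    # Rotate the offset (x0, y0) by the code's rotation angle t1,
    # then translate by the code's position (x1, y1).
    x = x1 + x0 * math.cos(t1) - y0 * math.sin(t1)
    y = y1 + x0 * math.sin(t1) + y0 * math.cos(t1)
    # Rotation angles of successive planar rotations simply add.
    return (x, y, t0 + t1)
```

If the code has not rotated (t1 = 0), the result reduces to a pure translation of the taught offset, which matches the intuition that the tool point moves rigidly with the code.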

Embodiment 2

[0052] Figure 2 is a schematic diagram of the positional relationship between the origin of the first coordinate system, the origin of the second coordinate system, and the projection point of the tool center point of the robot at the grasping pose in the second embodiment of the present invention. As shown in Figure 2, in this embodiment, preferably, the process of establishing the first coordinate system in S1 is as follows: take the center point of the two-dimensional code as the origin O of the coordinate system, take the rightward direction of the two-dimensional code as the X axis and the downward direction of the two-dimensional code as the Y axis, and establish a plane rectangular coordinate system XOY, that is, the first coordinate system.

[0053] In S2, measure the coordinates (x0, y0) of the projection point of the tool center point in the first coordinate system and the rotation angle of the gripper.

[0054] The process of establishing the second coordinate system in S3 is: in the image plane of the camera at the photographing an...
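The quantities needed in S4 (the center point and rotation angle of the two-dimensional code in the second coordinate system) could be recovered from the four corner points that a QR detector typically returns. A minimal sketch, assuming corners are pixel coordinates ordered top-left, top-right, bottom-right, bottom-left, and that the second coordinate system is the image frame shifted to the image center; the function name and ordering convention are assumptions, not from the patent:

```python
import math

def qr_pose_in_second_frame(corners, image_size):
    """corners: four (x, y) pixel points of the two-dimensional code,
    ordered top-left, top-right, bottom-right, bottom-left.
    image_size: (width, height) of the camera image.
    Returns ((cx, cy), angle): the code center in the second
    coordinate system (origin at the image center) and the code's
    rotation angle in radians."""
    w, h = image_size
    # The code center is the centroid of the four corners.
    cx = sum(x for x, y in corners) / 4.0
    cy = sum(y for x, y in corners) / 4.0
    # The rotation angle is the orientation of the top edge
    # (top-left corner to top-right corner).
    (x_tl, y_tl), (x_tr, y_tr) = corners[0], corners[1]
    angle = math.atan2(y_tr - y_tl, x_tr - x_tl)
    # Shift from pixel coordinates to the second coordinate system,
    # whose origin is the image center point.
    return (cx - w / 2.0, cy - h / 2.0), angle
```

An unrotated code centered in the image yields coordinates (0, 0) and angle 0, consistent with the teaching configuration described in S3.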

Embodiment 3

[0075] The difference between this embodiment and the first embodiment is that:

[0076] Figure 3 is the flowchart of the two-dimensional code-based visual positioning method in the third embodiment of the present invention. As shown in Figure 3, before executing S1, the following steps are performed:

[0077] S10, establishing a digital model for the vision-guided grasping system;

[0078] S11. Find a flat area where the two-dimensional code can be placed.

[0079] Preferably, this area is a rectangular area 6.

[0080] The vision-guided grasping system includes the workpiece, the robot, the fixture, and so on.

[0081] In addition, before executing S3, perform the following steps:

[0082] S30. Perform hand-eye calibration on a robot equipped with an industrial camera at the end;

[0083] S31. Control the robot to place the workpiece to be grasped at a suitable position, and control the robot to move to the photographing and teaching point.
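Step S11 asks for a flat area on which to place the two-dimensional code. One way to verify flatness, assuming sampled surface points are available (for example from a depth sensor), is to fit a plane z = a*x + b*y + c by least squares and check the residuals against a tolerance. This is an illustrative sketch only, not part of the patent; the sampling source and tolerance are assumptions:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def is_flat(points, tol=0.5):
    """Fit z = a*x + b*y + c to (x, y, z) samples by least squares
    and report whether every residual is within tol."""
    Sxx = sum(x * x for x, y, z in points)
    Sxy = sum(x * y for x, y, z in points)
    Syy = sum(y * y for x, y, z in points)
    Sx = sum(x for x, y, z in points)
    Sy = sum(y for x, y, z in points)
    Sxz = sum(x * z for x, y, z in points)
    Syz = sum(y * z for x, y, z in points)
    Sz = sum(z for x, y, z in points)
    n = float(len(points))
    # Normal equations of the least-squares plane fit.
    a, b, c = solve3([[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]],
                     [Sxz, Syz, Sz])
    return all(abs(z - (a * x + b * y + c)) <= tol for x, y, z in points)
```

A tilted but planar surface passes this check, while a local bump fails it, which is the behavior wanted when selecting the rectangular area 6.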



Abstract

The invention belongs to the technical field of robot visual positioning, and provides a visual positioning method and device based on a two-dimensional code and a computer-readable storage medium. The method comprises the steps: S1, establishing a first coordinate system with the center point of the two-dimensional code as the origin; S2, obtaining the coordinates, in the first coordinate system, of the projection point of the tool center point of the robot at the grasping pose and the rotation angle of the gripper; S3, in the image plane of the camera at the photographing and teaching point, establishing a second coordinate system with the image center point of the photographing and teaching point as the origin; S4, obtaining the coordinates of the center point of the two-dimensional code in the second coordinate system and the rotation angle of the two-dimensional code; and S5, calculating the actual pose, relative to the second coordinate system, of the projection point of the tool center point of the robot at the grasping pose. Compared with the prior art, the method and device reduce the visual positioning error, greatly reduce the user's workload, and improve working efficiency.

Description

Technical Field

[0001] The invention belongs to the technical field of robot visual positioning, and in particular relates to a two-dimensional code-based visual positioning method, a two-dimensional code-based visual positioning device, and a computer-readable storage medium.

Background Technique

[0002] In monocular vision positioning for machine vision, the visual characteristics of the target to be identified vary widely, and changes of perspective produce affine transformations between the matching template and the target to be identified, which introduces a certain visual positioning error.

[0003] Therefore, there is an urgent need for a two-dimensional code-based visual positioning method and device with small visual positioning error and high precision.

Summary of the Invention

[0004] In view of this, in order to solve the problem that the introduced visual positioning error is too large, the present invention pr...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16; B25J9/00
CPC: B25J9/1697; B25J9/0081
Inventor: 刘世昌, 邹风山, 陈亮, 孙铭泽, 万钇良, 毕丰隆 (Liu Shichang, Zou Fengshan, Chen Liang, Sun Mingze, Wan Yiliang, Bi Fenglong)
Owner: 山东新松工业软件研究院股份有限公司 (Shandong SIASUN Industrial Software Research Institute Co., Ltd.)