
Linear structure light fitting plane-based robot repeated positioning accuracy measurement method

A technology relating to repetitive positioning accuracy and linear structured light, applied in the field of computer vision. It addresses problems such as noise pollution and achieves the effects of non-contact measurement, good anti-interference performance, and fast measurement speed.

Active Publication Date: 2020-03-06
SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0006] In order to overcome the severe noise pollution of the production environment and the influence of the robot's own uncertainty on measurement accuracy, the present invention provides a method for measuring the robot's repetitive positioning accuracy in real time, with fast measurement speed and strong robustness.




Embodiment Construction

[0049] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0050] As shown in Figure 4, the robot repetitive positioning accuracy measurement method based on a line structured light fitting plane of the present invention uses a planar cooperative target and line structured light to fit the plane equation of the cooperative target, then determines the coordinate axes and origin of the model coordinate system, and finally obtains the rotation matrix and translation vector between the camera coordinate system and the model coordinate system. Specifically, two horizontal lines of structured light are projected onto the upper surface of the planar cooperative target and their image coordinates are extracted; according to the geometric relationship between the camera coordinate system and the image coordinate system, the line structured light image coordinates are converted into camera coordinates, a plane is fitted to them, and the position of the cooperative target's upper surface in the camera coordinate system is thereby determined.
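As an illustration of the plane-fitting step described above, the sketch below fits a plane to 3D stripe points expressed in the camera frame using a least-squares (SVD) fit. It is a minimal sketch under assumed data: the function name, the synthetic points, and the normal-orientation convention are hypothetical and not taken from the patent.

```python
import numpy as np

def fit_plane(points_cam):
    """Least-squares plane fit to an (N, 3) array of camera-frame points.

    Returns (normal, centroid): the unit plane normal and a point on the plane.
    """
    centroid = points_cam.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered point cloud is the direction of least variance, i.e. the normal.
    _, _, vt = np.linalg.svd(points_cam - centroid)
    normal = vt[-1]
    # Orient the normal toward the camera (negative z half-space) for consistency.
    if normal[2] > 0:
        normal = -normal
    return normal, centroid

# Hypothetical usage: synthetic 3D points standing in for the two structured-light
# stripes reconstructed on the target's upper surface (units: millimetres).
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3)) * np.array([50.0, 50.0, 0.2]) + np.array([0.0, 0.0, 800.0])
print(fit_plane(pts))
```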



Abstract

The invention relates to a linear structured light fitting plane-based robot repeated positioning accuracy measurement method. The method comprises the steps of: irradiating the upper surface of a planar cooperation target with two rays of horizontal linear structured light and extracting their image coordinates; converting the linear structured light image coordinates into camera coordinates according to the geometrical relation between the camera coordinate system and the image coordinate system, performing plane fitting, and determining the position of the upper surface of the cooperation target in the camera coordinate system; calculating the normal of the fitted plane and setting it as the z axis of a model coordinate system, setting a linear structured light direction within the fitted plane as the x axis of the model coordinate system, and taking the cross product of the z axis and the x axis to form the y axis of the model coordinate system; forming a cross-shaped light region on the upper surface of the planar cooperation target, whose intersection point is the origin of the model coordinate system; and determining the rotation matrix and translation vector according to the geometrical relation between the camera coordinate system and the model coordinate system. The method achieves online, real-time, automatic and non-contact measurement of robot repeated positioning accuracy, with fast measurement speed and good system flexibility.
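To make the coordinate-frame construction concrete, the following sketch builds the model coordinate system from the fitted plane normal (z axis), a stripe direction in the plane (x axis), and the stripe intersection point (origin), returning the rotation matrix and translation vector between the camera and model frames. This is a minimal sketch assuming NumPy; the function names and the re-orthogonalization of the stripe direction are illustrative choices, not the patent's exact implementation.

```python
import numpy as np

def model_frame_from_plane(normal, light_dir, origin):
    """Build the rotation matrix and translation vector of the model frame.

    normal    : unit normal of the fitted target plane    -> model z axis
    light_dir : direction of one light stripe in the plane -> model x axis
    origin    : intersection point of the two stripes      -> model origin
    All inputs are expressed in camera coordinates.
    """
    z = np.asarray(normal, float)
    z /= np.linalg.norm(z)
    # Remove any out-of-plane component so x is exactly orthogonal to z.
    x = np.asarray(light_dir, float)
    x = x - np.dot(x, z) * z
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                        # completes a right-handed frame
    R = np.column_stack((x, y, z))            # columns: model axes in camera coords
    t = np.asarray(origin, float)             # model origin in camera coords
    return R, t

def camera_to_model(point_cam, R, t):
    """Express a camera-frame point in the model frame."""
    return R.T @ (np.asarray(point_cam, float) - t)
```

With a pose (R, t) recorded each time the robot returns to the same taught position, the spread of those poses characterizes the repetitive positioning error; the specific comparison metric used by the patent is not reproduced here.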

Description

Technical Field

[0001] The invention belongs to the field of computer vision, and in particular relates to a robot repetitive positioning accuracy measurement method based on a line structured light fitting plane.

Background Technique

[0002] With the rapid development of China's national economy, automated production has become the future development trend and direction. Using robots instead of manual labor to realize automatic loading and unloading not only saves production costs, but also improves production efficiency and safety, reduces workers' labor intensity, and has become an ideal choice for more and more enterprises.

[0003] To realize automatic loading and unloading with a robot, high repetitive positioning accuracy is required. The structured light measurement method offers strong real-time performance and simple equipment, and has therefore attracted more and more attention. For applications that require strict convenience su...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G01C11/02
CPCG01C11/02
Inventor 吴清潇欧锦军王爽朱枫郝颖明段红旭
Owner SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI