
A Pose Accuracy Compensation Method for Industrial Robots Based on Deep Learning

A deep-learning method for industrial robot pose accuracy compensation. It addresses the shortcomings of existing approaches: pose (attitude) errors that are left uncompensated, compensation accuracy that is sensitive to sampling step size and sample distribution, and error prediction models with weak interpretability. The method improves compensation accuracy, reduces the need for large sample datasets, and increases model interpretability.

Active Publication Date: 2022-02-11
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS
Cites: 7 · Cited by: 0

AI Technical Summary

Problems solved by technology

Existing robot accuracy compensation methods share the following deficiencies: only the position error of the robot's target pose is compensated, while the attitude error is not; the sampling step size and sample distribution strongly affect compensation accuracy; and the error prediction model has weak interpretability.

Method used



Examples


Embodiment Construction

[0057] The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative effort fall within the protection scope of the present invention.

[0058] Referring to Figures 1-5, the specific embodiments are described below in conjunction with the accompanying drawings to better explain the technical content of the present invention:

[0059] Step 1: Within a 0.4 m × 0.8 m × 0.5 m region of the industrial robot end effector's workspace, plan theoretical pose sample data for 2000 different robot configurations, and use binocular vision measurement e...
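The sample planning in Step 1 can be sketched as follows. This is a minimal illustration, not the patent's actual procedure: the uniform distribution, the box origin, and the Euler-angle ranges for orientation are all assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Workspace box for the end effector (m), per the embodiment: 0.4 x 0.8 x 0.5
lower = np.array([0.0, 0.0, 0.0])
upper = np.array([0.4, 0.8, 0.5])

N = 2000  # number of planned robot configurations

# Positions sampled uniformly inside the box (assumed distribution)
positions = rng.uniform(lower, upper, size=(N, 3))

# Orientations as Euler angles (rad); the +/- 30 degree range is illustrative
orientations = rng.uniform(-np.pi / 6, np.pi / 6, size=(N, 3))

# Each theoretical pose: [x, y, z, roll, pitch, yaw], shape (N, 6)
theoretical_poses = np.hstack([positions, orientations])
```

In practice each planned pose would also be checked against the robot's joint limits and reachability before measurement, which this sketch omits.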



Abstract

The invention discloses a deep-learning-based method for compensating the position and attitude (pose) accuracy of industrial robots. Drawing on the error similarity of industrial robots, it proposes a deep learning model for pose error compensation built as a parallel architecture of two deep belief networks. Robot state features and their corresponding pose error parameters serve as sample data, the model is trained by pairing and comparing samples, and the robot's error similarity is used as an additional supervisory feature. The trained model can predict and compensate the position and attitude errors of industrial robots more accurately. By combining the feature-expression ability of deep learning with statistical interpretability, and introducing comparative information beyond the robot state features themselves as an additional supervised-learning feature, the invention improves the prediction accuracy of the deep learning model.
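The pairing-and-comparison training idea described above can be sketched with a toy model. This is not the patent's deep belief network: the plain fully connected stack, the loss form, and the exponential distance weighting are all stand-ins introduced for illustration. The key structural points it shows are the two parallel branches sharing one set of parameters and an error-similarity term that pulls the predicted errors of nearby pose pairs together.

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp_forward(x, weights):
    """Forward pass of a small fully connected stack (a stand-in for one DBN branch)."""
    h = x
    for W, b in weights[:-1]:
        h = np.tanh(h @ W + b)
    W, b = weights[-1]
    return h @ W + b  # linear output: predicted 6-DoF pose error

# Shared parameters: both branches of the parallel architecture use the same weights
dims = [6, 32, 32, 6]
weights = [(rng.normal(scale=0.1, size=(dims[i], dims[i + 1])), np.zeros(dims[i + 1]))
           for i in range(len(dims) - 1)]

def pairwise_loss(pose_a, pose_b, err_a, err_b, lam=0.1):
    """Combined loss for one pose pair: prediction error plus an error-similarity term.

    Error similarity assumes nearby poses have similar errors, so the two branches'
    predictions for a close pair are pulled together (illustrative form only).
    """
    pred_a = mlp_forward(pose_a, weights)
    pred_b = mlp_forward(pose_b, weights)
    mse = np.mean((pred_a - err_a) ** 2) + np.mean((pred_b - err_b) ** 2)
    # Similarity weight decays as the two poses move apart
    w = np.exp(-np.linalg.norm(pose_a - pose_b))
    similarity = w * np.mean((pred_a - pred_b) ** 2)
    return mse + lam * similarity
```

Training would minimize this loss over many sampled pose pairs; the similarity term acts as the extra supervisory signal beyond the state features themselves.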

Description

Technical Field

[0001] The invention relates to the technical field of industrial robot pose error compensation, and in particular to a deep-learning-based method for compensating the pose accuracy of industrial robots.

Background

[0002] With the implementation of intelligent manufacturing strategies, industrial robots have gradually entered high-end manufacturing thanks to their high efficiency, flexibility, and degree of automation. However, the absolute pose accuracy of industrial robots is far lower than their repeatability, so they cannot meet the precision requirements of high-end manufacturing, especially in applications involving large-scale, complex structures combined with off-line programming. Low absolute pose accuracy has become the main obstacle restricting the application of industrial robots in intelligent manufacturing.

Accuracy Compensation Technology

[0003] With the advantage...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16; G06N3/04; G06N3/08
CPC: B25J9/163; B25J9/1664; G06N3/084; G06N3/045
Inventor: Tian Wei (田威), Wang Wei (王伟), Liao Wenhe (廖文和), Li Bo (李波), Li Pengcheng (李鹏程)
Owner NANJING UNIV OF AERONAUTICS & ASTRONAUTICS