
Industrial robot debugging method based on combination of natural language and computer vision

An industrial-robot and computer-vision technology, applied to neural learning methods, software testing/debugging, semantic analysis, etc. It addresses problems such as poor robustness of task code and debugging results that cannot be applied to the on-site or factory production environment, with the effects of improving development efficiency, reducing time-consuming debugging, and shortening deployment time.

Pending Publication Date: 2022-07-01
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

However, robot debugging methods based on current deep learning models have shortcomings such as poor robustness of the generated task code, and their debugging results cannot be applied to the on-site environment.
Consequently, debugging methods of this kind cannot be applied to a factory production environment.




Embodiment Construction

[0020] The present invention is further described below with reference to specific embodiments.

[0021] An industrial robot debugging method based on the combination of natural language and computer vision, as shown in Figure 1, includes the following steps:

[0022] Step (1): Generate semantic information

[0023] 1-1. Input the natural language description of the robot action code into the word2vec network to generate text embeddings; the details are as follows:

[0024] The natural language instruction X = {x_i | i = 1, 2, ..., n}, composed of n natural language words, is passed through the word2vec network to generate the text embedding vector matrix E = {e_i | i = 1, 2, ..., n} ∈ R^(L×C):

[0025] E = word2vec(X)   (1)

[0026] where x_i denotes the i-th natural language word, e_i denotes the i-th text embedding vector, word2vec(·) denotes the word2vec network function, L is the number of text embeddings, and C is the embedding dimension;
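As a concrete illustration of step 1-1 and equation (1), the sketch below builds the L×C embedding matrix E with gensim's Word2Vec standing in for the patent's word2vec network. The training corpus, the instruction text, and the embedding dimension C are illustrative assumptions, not values from the patent.

```python
# Sketch of step 1-1: build the text embedding matrix E = word2vec(X).
# The tiny corpus, the instruction, and C = 64 are hypothetical.
import numpy as np
from gensim.models import Word2Vec

C = 64  # embedding dimension (assumed)

corpus = [
    "move the gripper to the target position".split(),
    "pick up the part and place it on the conveyor".split(),
]
model = Word2Vec(sentences=corpus, vector_size=C, min_count=1, window=3)

X = "move the gripper to the target position".split()  # X = {x_i | i = 1..n}
E = np.stack([model.wv[x] for x in X])                  # (L, C) matrix of e_i
print(E.shape)  # (7, 64): L tokens, each a C-dimensional embedding
```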

[0027] 1-2 Use the text embedding vector matrix generated in ...
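Step 1-2 is cut off above; according to the Abstract, the semantic information is obtained by passing the word2vec output through a linear layer. A minimal PyTorch sketch under that assumption follows (the hidden size is a hypothetical choice):

```python
# Assumed reading of step 1-2 (the original text is truncated): project the
# text embedding matrix E through a linear layer to obtain semantic features.
import torch
import torch.nn as nn

C, hidden = 64, 128                 # hidden size is a hypothetical choice
semantic_layer = nn.Linear(C, hidden)

E = torch.randn(10, C)              # stand-in for the L x C matrix from step 1-1
S = semantic_layer(E)               # per-token semantic features, L x hidden
print(S.shape)                      # torch.Size([10, 128])
```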



Abstract

The invention discloses an industrial robot debugging method based on the combination of natural language and computer vision. In the method, semantic information is generated from a natural language description through a word2vec network and a linear layer, and industrial robot environment features are extracted through a three-dimensional recurrent convolutional neural network. The environment features are input into a long short-term memory (LSTM) network to generate an intermediate context, and the intermediate context is fed into a GRU network, a type of recurrent neural network (RNN), to obtain API recommendations. The text embedding generated from the natural language description by the word2vec network is input into an LSTM encoder and an LSTM decoder, which output an AST construction action sequence; the API recommendations and the construction action sequence are then combined to generate industrial robot debugging code. The debugging code is added to a robot program editor to complete debugging. The method effectively improves the development efficiency of robot debugging and shortens the deployment time of robot production lines in industrial environments.
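To make the dataflow in the abstract concrete, here is a minimal PyTorch sketch of the described pipeline: a linear projection of the text embeddings, a 3D convolutional stage for environment features, an LSTM producing the intermediate context, a GRU head for API recommendation, and an LSTM encoder-decoder for the AST construction actions. All layer sizes, the pooling choice, the number of decoding steps, and the way the semantic vector conditions the decoder are illustrative assumptions; only the overall structure follows the patent text.

```python
# Hedged sketch of the pipeline in the abstract. Dimensions, vocabulary sizes,
# and the fusion details are assumptions; only the dataflow follows the text.
import torch
import torch.nn as nn

class DebugCodeGenerator(nn.Module):
    def __init__(self, embed_dim=64, hidden=128, n_apis=50, n_actions=40):
        super().__init__()
        # Semantic branch: word2vec embeddings (computed outside the model)
        # are projected by a linear layer into the semantic space.
        self.semantic_proj = nn.Linear(embed_dim, hidden)
        # Vision branch: a plain 3D CNN over voxel frames of the robot cell
        # stands in for the patent's 3D recurrent convolutional network.
        self.env_cnn = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((4, 4, 4)),
        )
        # LSTM over per-frame features produces the intermediate context.
        self.context_lstm = nn.LSTM(8 * 4 * 4 * 4, hidden, batch_first=True)
        # GRU + linear head yield the API recommendation sequence.
        self.api_gru = nn.GRU(hidden, hidden, batch_first=True)
        self.api_head = nn.Linear(hidden, n_apis)
        # LSTM encoder-decoder over the text embeddings outputs AST actions.
        self.encoder = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.action_head = nn.Linear(hidden, n_actions)

    def forward(self, text_emb, env_frames, steps=8):
        # text_emb: (B, L, embed_dim); env_frames: (B, T, 1, D, H, W).
        B, T = env_frames.shape[:2]
        feats = self.env_cnn(env_frames.flatten(0, 1))   # (B*T, 8, 4, 4, 4)
        feats = feats.flatten(1).view(B, T, -1)          # (B, T, 512)
        _, (ctx, _) = self.context_lstm(feats)           # ctx: (1, B, hidden)
        ctx = ctx.transpose(0, 1)                        # (B, 1, hidden)

        # API recommendation: decode `steps` tokens from the context.
        api_out, _ = self.api_gru(ctx.repeat(1, steps, 1))
        api_logits = self.api_head(api_out)              # (B, steps, n_apis)

        # AST construction: encode the text, then decode actions conditioned
        # on the pooled semantic projection (the conditioning is an assumption).
        sem = self.semantic_proj(text_emb).mean(dim=1, keepdim=True)
        _, (h, c) = self.encoder(text_emb)
        dec_out, _ = self.decoder(sem.repeat(1, steps, 1), (h, c))
        action_logits = self.action_head(dec_out)        # (B, steps, n_actions)
        return api_logits, action_logits

# Smoke test with random tensors (all shapes are illustrative).
model = DebugCodeGenerator()
E = torch.randn(2, 10, 64)                  # text embeddings for 2 instructions
frames = torch.randn(2, 5, 1, 16, 16, 16)   # 5 voxel frames of the robot cell
api_logits, action_logits = model(E, frames)
print(api_logits.shape, action_logits.shape)  # (2, 8, 50) (2, 8, 40)
```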

Description

Technical Field

[0001] The invention belongs to the technical field of industrial robots, and in particular relates to an industrial robot debugging method based on the combination of natural language and computer vision.

Background

[0002] In recent years, with the national push for smart factories, the manufacturing industry has begun to use robot technology on a large scale to assist production, and the concept of intelligent manufacturing has moved from popularization to comprehensive promotion. As a major component of smart factories, robots can help improve factory productivity, perform operations that workers cannot, and quickly adapt to new production needs.

[0003] At present, the debugging methods for industrial robots on the market can be divided into online debugging and offline debugging. Online debugging requires the user to control the robot to complete the specified actions and save them. By running these specified actions, the opera...


Application Information

IPC(8): G06F11/36; G06F8/41; G06F40/30; G06N3/04; G06N3/08
CPC: G06F11/3624; G06F11/3664; G06F8/447; G06F40/30; G06N3/08; G06N3/044; G06N3/045
Inventors: 胡海洋 (Hu Haiyang), 李川豪 (Li Chuanhao), 陈洁 (Chen Jie), 李忠金 (Li Zhongjin)
Owner: HANGZHOU DIANZI UNIV