
A visual control method of industrial manipulator based on deep convolutional neural network

A deep convolutional neural network technology, applied in the field of visual control of industrial manipulators, which can solve the problems of low sensor accuracy, high complexity, and unsuitability for industrial production.

Active Publication Date: 2019-05-14
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, the problem is that these sensors are not very accurate and cannot be used in industrial production in the short term.
[0009] 3. Voice teaching: predefined voice commands are used to direct the movement of the industrial robot; the problem is that operating accuracy is poor and fine work cannot be performed well.
[0011] The disadvantage of the above four ways of using industrial robots is that the robots must operate according to predefined programs, require maintenance by professionals, and are highly complex to apply to new tasks. These difficulties of deployment and implementation have largely limited the growth of the robotics industry.



Examples


Embodiment Construction

[0059] The present invention will be further described below in conjunction with specific examples.

[0060] Taking a six-degree-of-freedom redundant industrial robotic arm as an example, as shown in Figure 1, the visual control method for an industrial manipulator based on a deep convolutional neural network of this embodiment specifically includes the following steps:

[0061] 1) Acquisition and preprocessing of target object visual information

[0062] Place the target object on the workbench and use a CCD camera to collect color images of the target object in different postures, positions, and orientations, then manually mark the ideal grasping pose points. The purpose is to fully capture the visual representation of the target object and annotate its ideal grasping position, so as to fully cover the actual distribution of the target object in various situations. There can be many kinds of target objects, such a...
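
To make step 1) concrete, the following is a minimal data-loading sketch, not taken from the patent: it assumes the CCD images are stored as ordinary image files and that the manually marked grasp points live in a hypothetical CSV file (columns filename, u, v, angle). The input resolution, file paths, and label format are all illustrative assumptions.

```python
# Sketch of step 1: pairing preprocessed CCD images with manually marked grasp poses.
# File layout and label format are assumptions for illustration only.
import csv
from pathlib import Path

import numpy as np
from PIL import Image

IMG_SIZE = (224, 224)  # assumed network input resolution

def load_sample(image_path: Path, label_row: dict) -> tuple[np.ndarray, np.ndarray]:
    """Load one CCD image and its manually marked grasp pose."""
    img = Image.open(image_path).convert("RGB").resize(IMG_SIZE)
    x = np.asarray(img, dtype=np.float32) / 255.0  # normalize to [0, 1]
    # Hypothetical label: pixel coordinates of the grasp point plus gripper angle.
    y = np.array([float(label_row["u"]),
                  float(label_row["v"]),
                  float(label_row["angle"])], dtype=np.float32)
    return x, y

def load_dataset(image_dir: str, label_csv: str):
    """Pair every collected image with its row in the annotation file."""
    images, labels = [], []
    with open(label_csv, newline="") as f:
        for row in csv.DictReader(f):  # columns: filename, u, v, angle
            x, y = load_sample(Path(image_dir) / row["filename"], row)
            images.append(x)
            labels.append(y)
    return np.stack(images), np.stack(labels)

if __name__ == "__main__":
    X, Y = load_dataset("data/images", "data/grasp_labels.csv")
    print(X.shape, Y.shape)  # e.g. (N, 224, 224, 3) and (N, 3)
```

Collecting the object in many poses, positions, and orientations, as the paragraph above describes, is what allows the network trained later to generalize over the object's actual distribution on the workbench.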



Abstract

The invention discloses a visual control method for an industrial manipulator based on a deep convolutional neural network, comprising the steps of: 1) collecting and preprocessing visual information of a target object; 2) training and tuning a deep convolutional neural network model; 3) verifying and saving the model. The invention uses a deep convolutional neural network to extract the ideal grasping position of target objects in different postures, which broadens the applicable range of the system, overcomes the poor recognition of specific target objects in traditional visual control, effectively reduces the difficulty of using industrial manipulators, and provides a new, readily extensible method for industrial manipulator control.
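
As a rough illustration of steps 2) and 3), the sketch below trains a small convolutional network to regress a grasp pose from an image, checks it on held-out data, and saves the weights. The network layout, the (u, v, angle) pose parameterization, and the use of PyTorch are assumptions for illustration, not the patent's actual architecture.

```python
# Sketch of steps 2-3: train a toy CNN grasp-pose regressor, validate, and save it.
# Architecture and loss are illustrative assumptions, not the patent's model.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class GraspNet(nn.Module):
    """Toy CNN: RGB image -> (u, v, angle) grasp pose."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 3)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def run(train_ds, val_ds, epochs=10, lr=1e-3):
    model, loss_fn = GraspNet(), nn.MSELoss()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
    val_dl = DataLoader(val_ds, batch_size=32)

    for epoch in range(epochs):                      # step 2: train and tune
        model.train()
        for imgs, poses in train_dl:
            opt.zero_grad()
            loss_fn(model(imgs), poses).backward()
            opt.step()

        model.eval()                                 # step 3: verify ...
        with torch.no_grad():
            val_loss = sum(loss_fn(model(i), p).item() for i, p in val_dl) / len(val_dl)
        print(f"epoch {epoch}: val MSE {val_loss:.4f}")

    torch.save(model.state_dict(), "grasp_net.pt")   # ... and save the model
    return model

if __name__ == "__main__":
    # Random stand-in tensors (N, C, H, W); in practice use the preprocessed
    # images and labels from step 1, transposed to channel-first layout.
    X = torch.rand(64, 3, 224, 224)
    Y = torch.rand(64, 3)
    run(TensorDataset(X[:48], Y[:48]), TensorDataset(X[48:], Y[48:]), epochs=2)
```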

Description

Technical Field

[0001] The invention relates to the field of industrial manipulators, and in particular to a visual control method for industrial manipulators based on a deep convolutional neural network.

Background Technology

[0002] In industrial production, industrial robotic arms can replace manpower in simple and repetitive tasks such as picking, assembling, welding, packaging, beating, cutting, grinding, dragging, and other production operations. In dangerous and harsh working environments in particular, robot technology is used to reduce potential safety hazards. Research on robot technology is an important way to realize intelligent manufacturing and reduce enterprises' production costs.

[0003] "Motion planning" and "task determination" are two key technologies for industrial robotic arms. "Motion planning" can be divided into two parts: path planning and trajectory generation. The purpose of path planning is to find a series of non-interfering path points for the in...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/46, G06K9/62, G06K9/66, G06N3/08, B25J9/16
CPC: G06N3/08, B25J9/1664, G06V30/194, G06V10/56, G06F18/214
Inventors: 皮思远, 肖南峰
Owner: SOUTH CHINA UNIV OF TECH