Mechanical arm vision grabbing system and method based on self-supervised-learning neural network

A neural-network and self-supervised-learning technology applied in the field of robotic arms. It addresses the problems that existing methods misestimate grasps and fail to take into account the actual mass distribution of the grasped object, an important factor in real grasping, and achieves the effect of improving the grasping success rate and accurately estimating the grasping pose.

Active Publication Date: 2019-05-03
INST OF ELECTRONICS CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

Such methods can therefore improve the grasping of objects with little or no texture, but they are prone to misestimating the grasp when the object's outline is partially occluded. In addition, existing methods rely on machine vision alone and do not take into account the actual mass distribution of the grasped object, which is an important factor in real grasping.




Embodiment Construction

[0034] The present disclosure provides a robotic-arm visual grasping system and method based on a self-supervised learning neural network, comprising acquiring a visual image of the object to be grasped, estimating the approximate position of the object by neural-network regression, and precisely estimating the grasping pose. A convolutional neural network first recognizes the object and regresses the approximate outline position of the object in the visual image. Different grasping positions and grasping angles are then sampled within the object outline; each sampled position and angle is scored by the fully connected layers of the network, and the position and angle with the highest score are selected as the precise pose at which the robotic arm grasps the object. The weights of the network are obtained by self-supervised training of the manipulator, taking into account the actual density d...
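As a rough sketch of this sampling-and-scoring step, the following Python fragment samples candidate grasp positions and angles inside a detected object region and scores each candidate with a network ending in fully connected layers. The names (GraspScorer, sample_candidates, best_grasp), the patch size, and the layer dimensions are hypothetical assumptions for illustration; the patent does not disclose its actual architecture.

# Hypothetical sketch of the sampling-and-scoring step in paragraph [0034].
# Network layout, crop size, and candidate counts are assumptions.
import numpy as np
import torch
import torch.nn as nn

class GraspScorer(nn.Module):
    """Scores an image patch centered on a candidate grasp, conditioned on
    the grasp angle, via conv features plus a fully connected head."""
    def __init__(self, crop=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        feat = 32 * (crop // 4) ** 2
        self.fc = nn.Sequential(            # fully connected scoring layers
            nn.Linear(feat + 2, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, patch, angle):
        x = self.conv(patch).flatten(1)
        # Encode the grasp angle as (sin, cos) and append to visual features.
        ang = torch.stack([torch.sin(angle), torch.cos(angle)], dim=1)
        return self.fc(torch.cat([x, ang], dim=1)).squeeze(1)

def sample_candidates(mask, n_pos=64, n_ang=8, rng=np.random):
    """Sample (row, col, angle) grasp candidates inside the object mask."""
    ys, xs = np.nonzero(mask)
    idx = rng.choice(len(ys), size=n_pos)
    angles = np.linspace(0, np.pi, n_ang, endpoint=False)
    return [(ys[i], xs[i], a) for i in idx for a in angles]

def best_grasp(image, mask, scorer, crop=32):
    """Crop a patch around each candidate, score all, return the top one."""
    half = crop // 2
    pad = np.pad(image, ((half, half), (half, half), (0, 0)))
    cands = sample_candidates(mask)
    patches, angles = [], []
    for y, x, a in cands:
        patches.append(pad[y:y + crop, x:x + crop].transpose(2, 0, 1))
        angles.append(a)
    with torch.no_grad():
        scores = scorer(
            torch.tensor(np.stack(patches), dtype=torch.float32) / 255.0,
            torch.tensor(angles, dtype=torch.float32),
        )
    return cands[int(scores.argmax())]  # (row, col, angle) with highest score

In this reading, the self-supervised aspect would mean the scorer's training labels come from the manipulator's own grasp attempts (success or failure) rather than manual annotation, which is consistent with the paragraph's mention of self-supervised training of the manipulator.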



Abstract

The invention provides a robotic-arm visual grasping system based on a self-supervised learning neural network, comprising a depth camera, an instance segmentation module, a pose estimation neural network module, a three-dimensional pose acquisition module, and a robotic-arm control module. The depth camera outputs color images to the instance segmentation module and depth images to the three-dimensional pose acquisition module. The instance segmentation module feeds at least one of classification, bounding-box, or segmentation information into the pose estimation neural network module, which outputs a planar pose to the three-dimensional pose acquisition module; the planar pose is fused with the depth image to obtain a three-dimensional pose. The robotic-arm control module receives the three-dimensional pose and performs the grasping operation accordingly. With this system, the rough outline position of the object to be grasped can be recognized, and the precise grasping position and grasping angle can then be computed by a neural network with fully connected layers.
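Read as a data flow, the abstract wires five modules together. The minimal sketch below mirrors only that wiring; the module interfaces, the planar-pose format (pixel position plus in-plane angle), and the depth fusion via pinhole-camera intrinsics are illustrative assumptions, not the patent's specification.

# Hypothetical wiring of the five modules named in the abstract.
# Interfaces and the pinhole back-projection are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PlanePose:
    u: float      # grasp center, image column (pixels)
    v: float      # grasp center, image row (pixels)
    angle: float  # in-plane grasp angle (radians)

@dataclass
class Pose3D:
    x: float
    y: float
    z: float
    angle: float

def fuse_with_depth(plane, depth_m, fx, fy, cx, cy):
    """3D pose acquisition module: back-project the planar grasp point
    through the depth image using pinhole intrinsics (fx, fy, cx, cy)."""
    z = float(depth_m[int(plane.v), int(plane.u)])
    x = (plane.u - cx) * z / fx
    y = (plane.v - cy) * z / fy
    return Pose3D(x, y, z, plane.angle)

def grasp_pipeline(color, depth_m, segment, estimate_pose, intrinsics, arm):
    """End-to-end flow: color image -> instance segmentation -> pose
    estimation network -> fusion with depth -> robotic-arm control."""
    instances = segment(color)               # classes, boxes and/or masks
    plane = estimate_pose(color, instances)  # planar grasp pose
    pose3d = fuse_with_depth(plane, depth_m, *intrinsics)
    arm.execute_grasp(pose3d)                # move to pose and close gripper
    return pose3d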

Description

technical field

[0001] The present disclosure relates to the field of robotic-arm technology, and in particular to a robotic-arm visual grasping system and method based on a self-supervised learning neural network.

Background technique

[0002] A robotic arm is an automatic operating device that imitates certain movement functions of the human hand and arm to grasp and carry objects or operate tools according to a fixed program. Robotic arms can take over heavy human labor to mechanize and automate production, and can operate in hazardous environments to protect personal safety, so they are widely used in machinery manufacturing, metallurgy, light industry, atomic energy, and other fields.

[0003] In industry, the grasping operation of manipulators mostly relies on the traditional teaching method. However, for a new operating object or a new operating environment, the manipulator must be manually re-taught. With the development and appli...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16
Inventor: 舒心, 刘昶, 李彤
Owner: INST OF ELECTRONICS CHINESE ACAD OF SCI