
A Grasping Planning Method for Dexterous Hands Based on Four-Level Convolutional Neural Networks

A convolutional neural network and dexterous hand technology, applied in the field of computer vision, addresses problems such as limited applicability of existing methods and grasp planning that considers only simple grippers, thereby improving grasping ability, providing strong generalization, and keeping grasp planning simple and easy to carry out.

Active Publication Date: 2020-12-25
UNIV OF SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

To grasp an object with a dexterous hand, analytical methods determine the hand's grasping pose and gesture from a 3D model of the object; however, 3D information about objects is usually difficult to obtain in real environments, which greatly limits the use of these methods in practical scenes.
Empirical methods based on deep learning have since been widely applied to grasp planning, but most of this work considers only the grasp planning of simple grippers.
Because a dexterous hand cannot simply be closed like a gripper, its grasp planning must also account for grasping gestures, so these methods cannot be used for the more complex problem of dexterous hand grasp planning.



Examples


Embodiment Construction

[0043] In this embodiment, the dexterous hand grasp planning method based on a four-level convolutional neural network is applied to an object grasping operation involving a robot, a camera, and a target. The method includes: acquiring a grasping frame dataset and a grasping gesture dataset, designing a four-level convolutional neural network structure, obtaining the depth map of the grasped part of the target, and determining the position and posture of the dexterous hand. Within the four-level convolutional neural network, the first, second, and third levels detect the best grasping frame of the object and obtain the depth map of the grasped part; the fourth-level network predicts the grasping gesture of the dexterous hand from the depth map of the grasped part and the pose information of the dexterous hand. Specifically, the method proceeds as follows (an illustrative network sketch follows the step listing below):

[0044] Step 1: Obtain the grasping frame dataset and the grasping gesture dataset:

[0045] Step ...
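The remaining steps are truncated in this record. As a rough orientation for the structure described in paragraph [0043], the sketch below shows, in PyTorch, one possible form of a grasping-frame level and of the multi-input fourth level. All layer sizes, the RGB-D input format, the 6-DoF hand-pose encoding, and the number of predicted joint angles are illustrative assumptions and are not taken from the patent.

```python
# Minimal, illustrative PyTorch sketch of the four-level structure described
# in paragraph [0043]. Module names, layer sizes, the RGB-D input format,
# the 6-DoF hand-pose encoding, and the joint count are assumptions made for
# illustration; they are not parameters disclosed in the patent.
import torch
import torch.nn as nn

class GraspFrameStage(nn.Module):
    """One of the first three levels: regresses a grasping frame
    (x, y, width, height, rotation) from an RGB-D crop."""
    def __init__(self, in_channels: int = 4, out_dim: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Linear(64 * 4 * 4, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

class GraspGestureStage(nn.Module):
    """Fourth level: a multi-input network that fuses the depth map of the
    grasped part with the dexterous hand's pose to predict joint angles."""
    def __init__(self, n_joints: int = 16, pose_dim: int = 6):
        super().__init__()
        self.depth_branch = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(2),
        )
        self.pose_branch = nn.Sequential(nn.Linear(pose_dim, 64), nn.ReLU())
        self.fusion = nn.Sequential(
            nn.Linear(64 * 2 * 2 + 64, 128), nn.ReLU(),
            nn.Linear(128, n_joints),
        )

    def forward(self, depth_patch: torch.Tensor, hand_pose: torch.Tensor) -> torch.Tensor:
        d = self.depth_branch(depth_patch).flatten(1)
        p = self.pose_branch(hand_pose)
        return self.fusion(torch.cat([d, p], dim=1))
```

In this sketch the three frame levels are taken to share one architecture and to be applied serially; the patent text does not state whether the actual levels are identical or how their inputs are constructed.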



Abstract

The invention discloses a dexterous hand grasp planning method based on a four-level serial convolutional neural network. The four-level convolutional neural network is trained, the network parameters are determined, and a grasping model for the dexterous hand is obtained. In the proposed four-level serial convolutional neural network, the first three levels obtain the best grasping frame of the target; the fourth level predicts the grasping gesture of the dexterous hand, using a multi-input network to extract multiple grasping features so that the grasping gesture can be predicted from the image information of the grasped part of the target and the current pose information of the dexterous hand. The invention enables fine grasping of unknown objects, so that dexterous hand grasping is not restricted to known objects, thereby improving the grasping success rate of the dexterous hand.
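For orientation only, the following sketch chains the illustrative stage modules defined earlier into an inference pipeline: the first three levels serially refine the grasping frame, the depth map of the grasped part is cropped, and the fourth level predicts the gesture. The helpers initial_frame_guess and crop_around are hypothetical simplifications of steps the abstract does not detail, and the coordinate bookkeeping between crops and the full image is omitted.

```python
# Hedged end-to-end inference sketch using the illustrative stage modules
# defined earlier. The helper functions below are hypothetical stand-ins
# for steps the patent text does not specify.
import torch

def initial_frame_guess(rgbd: torch.Tensor) -> torch.Tensor:
    # Hypothetical coarse proposal: image centre with a default size and angle,
    # encoded as (x, y, width, height, rotation).
    _, h, w = rgbd.shape
    return torch.tensor([w / 2.0, h / 2.0, 60.0, 30.0, 0.0])

def crop_around(img: torch.Tensor, frame: torch.Tensor, size: int = 64) -> torch.Tensor:
    # Hypothetical, simplified crop: an axis-aligned window centred on the
    # frame centre, ignoring the frame's rotation.
    _, h, w = img.shape
    cx = int(frame[0].clamp(size // 2, w - size // 2))
    cy = int(frame[1].clamp(size // 2, h - size // 2))
    return img[:, cy - size // 2: cy + size // 2, cx - size // 2: cx + size // 2]

@torch.no_grad()
def plan_grasp(rgbd, depth, frame_stages, gesture_stage, hand_pose):
    # Levels 1-3: serially refine the estimate of the best grasping frame
    # (crop-to-image coordinate bookkeeping is omitted in this sketch).
    frame = initial_frame_guess(rgbd)
    for stage in frame_stages:
        patch = crop_around(rgbd, frame)
        frame = stage(patch.unsqueeze(0)).squeeze(0)
    # Level 4: predict the grasping gesture from the grasped-part depth map
    # and the dexterous hand's current pose.
    depth_patch = crop_around(depth, frame)
    gesture = gesture_stage(depth_patch.unsqueeze(0), hand_pose.unsqueeze(0))
    return frame, gesture
```

A trained model would be obtained by fitting the frame stages to the grasping frame dataset and the gesture stage to the grasping gesture dataset mentioned in Step 1; the loss functions and training procedure are not reproduced here.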

Description

technical field
[0001] The invention belongs to the technical field of computer vision, and in particular relates to a grasp planning method for a dexterous hand based on a four-level convolutional neural network.
Background technique
[0002] Object grasping, as a basic robot capability, has long been an important research direction in robotics. In general, grasp planning algorithms are divided into analytical methods and empirical methods. To grasp an object with a dexterous hand, analytical methods determine the hand's grasping pose and gesture from a 3D model of the object; however, 3D information about objects is usually difficult to obtain in real environments, which greatly limits the use of these methods in practical scenes. Empirical methods based on deep learning have been widely applied to grasp planning, but most of this work considers only grasp planning for simple grippers. S...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/73; G06T7/50; G06N3/04; G06N3/08; B25J9/16
CPC: G06T7/73; G06T7/50; G06N3/08; B25J9/1669; G06T2207/10004; G06T2207/10024; G06T2207/20081; G06T2207/20084; G06T2207/30108; G06N3/045
Inventor: 尚伟伟, 宋方井, 丛爽
Owner: UNIV OF SCI & TECH OF CHINA