Mechanical gripper grasp planning method based on depth projection, and control device

A technology relating to mechanical grippers and control devices, applied in manipulators, program-controlled manipulators, manufacturing tools, etc. It addresses problems such as slow planning speed, strict requirements on lighting stability, and unrealistic modeling demands, so as to improve overall efficiency and reduce online grasp-planning time.

Active Publication Date: 2017-05-31
HANGZHOU JIAZHI TECH CO LTD

AI Technical Summary

Problems solved by technology

[0005] Analysis-based methods are superior to learning-based methods in computational speed. However, they require precise three-dimensional models of both the gripper and the objects to be grasped, and building such models for a large number of objects is almost unrealistic.

In particular, for deformable objects such as plush toys, no usable static model can be built.

Although the learning-based method requires no precise modeling of the object or the gripper, its use of the sliding-window approach makes its planning speed far inferior to that of analysis-based methods, and its model input places high demands on lighting stability.


Examples


Embodiment 1

[0085] Embodiment 1: the control device includes a first computing module and a control module.

[0086] The first computing module generates candidate grasp-pose samples from the depth information of the current scene and uses a trained grasp-selection neural network to obtain the optimal grasp pose; the grasp-selection neural network serves as the evaluation criterion for selecting the optimal grasp pose from the candidate pose samples. The first computing module includes a first acquisition unit, a first pose-sample generation unit, and a grasp-selection unit. The first acquisition unit obtains the depth information of the current scene, uses it to generate the coordinate system of each candidate grasp pose, and obtains the scene depth information synthesized under that coordinate system; specifically, the first acquisition unit obtains the current scene point cloud information; from th...
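The online pipeline this embodiment describes (sample candidate grasp poses from the scene point cloud, project the local geometry into each candidate's frame, score every candidate, keep the best) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the uniform candidate sampling, the grid projection, and the `score_pose` stand-in for the trained grasp-selection network are all my own simplifications.

```python
import numpy as np

def sample_candidate_poses(points, n_candidates=16, seed=0):
    """Pick candidate grasp centers directly from the point cloud
    (a stand-in for the patent's candidate pose generation)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(points), size=n_candidates, replace=False)
    return points[idx]

def depth_projection(points, center, half_extent=0.05, res=8):
    """Project points near a candidate center onto a res x res depth
    image in the candidate's local frame (a toy version of the
    synthesized scene depth information)."""
    local = points - center
    mask = np.all(np.abs(local[:, :2]) < half_extent, axis=1)
    img = np.zeros((res, res))
    cells = ((local[mask, :2] / half_extent + 1) / 2 * (res - 1)).astype(int)
    for (x, y), z in zip(cells, local[mask, 2]):
        img[y, x] = max(img[y, x], -z)  # keep the nearest surface height
    return img

def score_pose(depth_img):
    """Stand-in for the trained network: favour local depth patches
    with high contrast."""
    return float(depth_img.std())

def select_best_grasp(points, n_candidates=16):
    centers = sample_candidate_poses(points, n_candidates)
    scores = [score_pose(depth_projection(points, c)) for c in centers]
    best = int(np.argmax(scores))
    return centers[best], scores[best]

# Usage with a synthetic point cloud standing in for a depth-camera scene.
rng = np.random.default_rng(1)
cloud = rng.uniform(-0.2, 0.2, size=(500, 3))
best_center, best_score = select_best_grasp(cloud)
```

In a real system the scorer would be the trained grasp-selection network and the projection frame would follow the candidate's full 6-DoF pose, but the data flow is the same.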

Embodiment 2

[0089] Embodiment 2: in addition to the modules of Embodiment 1, the control device further includes a second computing module, which uses scene depth information to generate positive and negative grasp-pose samples offline and trains the grasp-selection neural network. The second computing module includes a second acquisition unit, a second pose-sample generation unit, and a training unit.
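A toy rendition of this offline stage: generate labeled positive and negative depth-patch samples, then fit a selector on them. Both the sample generator and the single-layer logistic classifier below are stand-ins of my own invention; the patent's actual sample construction and network architecture are not specified here.

```python
import numpy as np

def make_labeled_samples(n=200, res=8, seed=0):
    """Hypothetical offline sample generator: a positive grasp-pose
    sample has an object surface raised in the patch centre, a
    negative sample is sensor noise only."""
    rng = np.random.default_rng(seed)
    X, y = [], []
    for _ in range(n):
        patch = rng.normal(0.0, 0.01, (res, res))
        label = int(rng.integers(0, 2))
        if label:
            patch[res // 4: 3 * res // 4, res // 4: 3 * res // 4] += 0.1
        X.append(patch.ravel())
        y.append(label)
    return np.array(X), np.array(y)

def train_logistic(X, y, lr=0.5, epochs=200):
    """Minimal stand-in for training the grasp-selection network:
    a single-layer logistic classifier fit by gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y                      # gradient of the logistic loss
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

X, y = make_labeled_samples()
w, b = train_logistic(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
train_acc = (pred == y).mean()
```

The point is the division of labour the embodiment describes: sample generation and training happen offline, so the online module of Embodiment 1 only has to evaluate the trained model.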

[0090] The second acquisition unit acquires the synthesized scene depth information; specifically, it acquires the scene point cloud information, segments the point cloud of the object to be grasped and the pose information of the carrier from the scene point cloud, and uses the extracted object point cloud and carrier pose to construct the synthesized scene depth information. The second pose sample generation unit...
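The segmentation step above (separating the object-to-be-grasped points from the carrier) might look like the following height-threshold sketch. The patent text here does not specify the algorithm, so the plane estimate from the lowest points and the `height_eps` margin are illustrative assumptions.

```python
import numpy as np

def segment_object_from_carrier(points, height_eps=0.01):
    """Split a scene point cloud into object and carrier points.
    Assumes the carrier is a roughly horizontal plane and is the
    lowest dominant surface in the scene."""
    z = points[:, 2]
    # Estimate the carrier plane height from the lowest 30% of points.
    plane_z = np.median(z[z <= np.percentile(z, 30)])
    object_mask = z > plane_z + height_eps
    return points[object_mask], points[~object_mask], plane_z

# Synthetic scene: a flat carrier near z = 0 and a small object at z ~ 0.05.
rng = np.random.default_rng(0)
carrier = np.column_stack(
    [rng.uniform(-0.2, 0.2, (400, 2)), rng.normal(0.0, 0.002, 400)])
obj = np.column_stack(
    [rng.uniform(-0.03, 0.03, (100, 2)), rng.normal(0.05, 0.002, 100)])
scene = np.vstack([carrier, obj])
obj_pts, carrier_pts, plane_z = segment_object_from_carrier(scene)
```

A production system would more likely fit the carrier plane with RANSAC, but the output is the same: object points plus the carrier pose, from which the synthesized scene depth can be built.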



Abstract

The invention discloses a mechanical gripper grasp planning method based on depth projection, and a control device. The method mainly comprises the following steps: generating positive and negative grasp-pose samples from scene depth information to train a grasp-selection neural network; then generating candidate grasp-pose samples from the current scene depth information and obtaining the optimal grasp pose by means of the trained network. The control device comprises a first computing module and a control module; the mechanical gripper is adjusted to the optimal grasp pose for grasping. Because the method incorporates the size information of the gripper, it is applicable to grippers of different types, requires no modeling of the object to be grasped, adapts quickly to objects and grippers of different sizes, and is unaffected by illumination variation.

Description

Technical Field

[0001] The invention belongs to the field of robot assembly, and in particular relates to a depth-projection-based mechanical gripper grasp planning method and a control device.

Background Technique

[0002] Existing grasp planning methods can be broadly divided into analysis-based and learning-based grasp planning methods.

[0003] The analysis-based grasp planning method first defines basic shapes such as cubes, spheres, cylinders, and cones, and in the grasp-synthesis stage defines their corresponding possible grasp poses. It then decomposes an object into those basic shapes using methods such as shape primitives, decomposition trees, or minimum-volume bounding boxes. Finally, according to the predefined candidate grasp poses, all candidate poses corresponding to each decomposed basic shape are taken out and combined into the candidate grasp poses of the entire obj...
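The analysis-based pipeline of paragraph [0003] (predefined grasp poses per shape primitive, combined over the decomposed primitives) can be illustrated with a small lookup. The primitive names and grasp labels below are invented for illustration; they are not from the patent.

```python
# Predefined candidate grasp types per shape primitive
# (labels are illustrative placeholders).
PRIMITIVE_GRASPS = {
    "box": ["top_pinch", "side_pinch"],
    "cylinder": ["side_wrap", "top_pinch"],
    "sphere": ["enclosing_wrap"],
    "cone": ["base_pinch"],
}

def candidate_grasps(decomposed_primitives):
    """Combine the predefined grasps of every primitive produced by
    the decomposition step into candidates for the whole object."""
    candidates = []
    for prim in decomposed_primitives:
        for grasp in PRIMITIVE_GRASPS.get(prim["shape"], []):
            candidates.append({"primitive": prim["shape"], "grasp": grasp})
    return candidates

# A mug decomposed into a cylinder body and a box-like handle
# yields the union of both primitives' candidate grasps.
mug = [{"shape": "cylinder"}, {"shape": "box"}]
mug_candidates = candidate_grasps(mug)
```

This also makes the limitation discussed in [0005] concrete: the table only works if every object can be decomposed into well-modeled rigid primitives, which fails for deformable objects.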

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16, B25J15/00
CPC: B25J9/1669, B25J15/0009
Inventor: 熊蓉, 王鹏, 王越
Owner: HANGZHOU JIAZHI TECH CO LTD