
Pixel-level object grasping detection method and system for asymmetrical three-finger grasper

A detection method and system in the field of robotics, addressing the problems that an asymmetrical three-finger gripper cannot interchange its finger positions and that the ground-truth grasping schemes used for training in existing methods cannot truly represent the graspable properties of objects.

Active Publication Date: 2021-09-14
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

Because the asymmetrical three-finger gripper cannot interchange its finger positions the way a parallel-plate gripper can, the many grasping models built around the five-dimensional rectangular grasp frame are no longer applicable.
[0006] Most existing grasp detection methods borrow from object detection methods in image processing, so the ground-truth grasping schemes used for training cannot truly represent the graspable attributes of the object.
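To make the contrast concrete, the following is a minimal Python sketch of the standard five-dimensional rectangular grasp representation alongside a hypothetical oriented-triangle representation. The triangle's fields are illustrative assumptions only; this excerpt does not enumerate the exact parameters of the patent's oriented triangular model.

```python
# Minimal sketch of the two grasp representations discussed above.
# The 5-D rectangle (x, y, theta, w, h) is the common parallel-plate
# formulation; because a rectangle looks the same after a 180-degree flip,
# it silently assumes the two fingers can swap places. The triangle fields
# below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class RectangleGrasp:
    """Standard 5-D grasp rectangle for a parallel-plate gripper."""
    x: float      # grasp center, image column (pixels)
    y: float      # grasp center, image row (pixels)
    theta: float  # gripper rotation; only meaningful modulo 180 degrees
    w: float      # opening width between the two plates
    h: float      # width of the plates (fingers)

@dataclass
class OrientedTriangleGrasp:
    """Hypothetical oriented-triangle grasp for an asymmetric three-finger gripper."""
    x: float      # grasp center, image column (pixels)
    y: float      # grasp center, image row (pixels)
    theta: float  # full 360-degree orientation: the single finger and the finger
                  # pair sit on opposite sides, so a 180-degree flip is a different grasp
    size: float   # assumed overall scale of the triangle (finger spread)
```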



Examples


Embodiment 1

[0036] Since no existing grasping model is specifically suited to the asymmetrical three-finger grasper, the present disclosure designs an oriented triangular grasping model for a grasper that cannot perform symmetrical grasping. The disclosure uses traditional convolution and atrous (dilated) convolution for grasp detection; to improve the scale invariance of the network, a spatial pyramid network obtains feature maps with different receptive fields, and a feature fusion unit fuses the low-level and high-level features of the network; to complete end-to-end grasp detection, a grasp model detection unit is designed to directly output each parameter of the oriented triangular grasping scheme.
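The following PyTorch sketch shows one way the architecture described in [0036] could be assembled: ordinary and atrous (dilated) convolutions, a spatial pyramid over several dilation rates to obtain feature maps with different receptive fields, a fusion unit that merges low-level and high-level features, and a detection head that regresses per-pixel grasp parameters. Channel counts, dilation rates, and the five-channel output layout are assumptions for illustration, not the patent's actual network.

```python
# Illustrative sketch only: the structure follows [0036], all sizes are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialPyramid(nn.Module):
    """Parallel atrous convolutions produce feature maps with different receptive fields."""
    def __init__(self, channels, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, 3, padding=r, dilation=r) for r in rates
        )
        self.project = nn.Conv2d(channels * len(rates), channels, 1)

    def forward(self, x):
        return self.project(torch.cat([F.relu(b(x)) for b in self.branches], dim=1))

class GraspNet(nn.Module):
    """End-to-end pixel-level grasp detection: confidence + oriented-triangle parameters."""
    def __init__(self, in_ch=3, ch=64):
        super().__init__()
        # low-level features from ordinary convolutions
        self.low = nn.Sequential(
            nn.Conv2d(in_ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
        )
        # high-level features from strided + atrous convolutions and the spatial pyramid
        self.high = nn.Sequential(
            nn.Conv2d(ch, ch, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=2, dilation=2), nn.ReLU(),
            SpatialPyramid(ch),
        )
        # feature fusion unit: upsample high-level features and merge with low-level ones
        self.fuse = nn.Conv2d(2 * ch, ch, 3, padding=1)
        # grasp model detection unit: per-pixel confidence plus four assumed triangle parameters
        self.head = nn.Conv2d(ch, 1 + 4, 1)

    def forward(self, x):
        low = self.low(x)
        high = F.interpolate(self.high(low), size=low.shape[-2:],
                             mode="bilinear", align_corners=False)
        fused = F.relu(self.fuse(torch.cat([low, high], dim=1)))
        return self.head(fused)  # (B, 5, H, W) pixel-level grasp maps
```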

[0037] The present disclosure proposes a pixel-level object grasping detection method for an asymmetric three-finger grasper. Aiming at the technical gap in grasp detection for asymmetric three-finger grippers, this method uses the c...

Embodiment 2

[0104] Based on Embodiment 1, this embodiment provides a pixel-level object grasping detection system for an asymmetric three-finger gripper, including an image acquisition device and a server:

[0105] an image acquisition device configured to acquire an original image containing an object that can be grasped by the asymmetric three-finger gripper, and transmit it to a server;

[0106] The server is configured to execute the steps of the pixel-level object grasping detection method for an asymmetric three-finger grasper described in Embodiment 1.
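A hypothetical sketch of this device/server split follows. The class names, the camera interface, and the injected detect_fn callable are assumptions for illustration; the excerpt does not fix a transport or an API.

```python
# Hypothetical arrangement of Embodiment 2; all names are illustrative.
from typing import Callable, Dict
import numpy as np

class ImageAcquisitionDevice:
    """Acquires an original image containing the graspable object and transmits it."""
    def __init__(self, camera):
        self.camera = camera                      # assumed RGB/RGBD camera handle

    def capture_and_send(self, server: "GraspDetectionServer") -> Dict:
        return server.handle(self.camera.capture())

class GraspDetectionServer:
    """Executes the steps of the Embodiment 1 detection method on each received image."""
    def __init__(self, detect_fn: Callable[[np.ndarray], Dict]):
        self.detect_fn = detect_fn                # e.g. the detect_grasp sketch under Embodiment 3

    def handle(self, image: np.ndarray) -> Dict:
        return self.detect_fn(image)              # returns the highest-confidence triangle grasp
```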

Embodiment 3

[0108] This embodiment provides a pixel-level object grasping detection system for an asymmetrical three-finger grasper, including:

[0109] Image acquisition module: configured to acquire an original image containing a target to be grasped;

[0110] Grasping scheme labeling module: configured to mark grasping points for the target in the original image according to the constructed triangular grasping model and to generate a grasping scheme;

[0111] Image data processing module: configured to perform data augmentation and cropping on each image and its annotated grasping scheme to obtain a processed image;

[0112] Grasp detection module: configured to input the processed image into the trained grasp detection network built from a deep convolutional neural network, and to output the grasping scheme with the highest grasping confidence as the final oriented triangular grasping scheme.

[0113] Among them, the grasping scheme labeling module includes the grasping point of...
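The sketch below shows one way these modules could be wired together at inference time, reusing a GraspNet-style network like the one sketched under Embodiment 1. The crop size, the normalization, and the argmax-based selection of the highest-confidence pixel are assumptions; the excerpt does not specify them.

```python
# Illustrative inference pipeline for the Embodiment 3 modules; sizes are assumed.
import numpy as np
import torch

def preprocess(image: np.ndarray, crop: int = 300) -> torch.Tensor:
    """Image data processing module: center-crop and scale to [0, 1].
    Assumes an HxWx3 frame at least crop x crop pixels."""
    h, w = image.shape[:2]
    top, left = (h - crop) // 2, (w - crop) // 2
    patch = image[top:top + crop, left:left + crop]
    return torch.from_numpy(patch).float().permute(2, 0, 1).unsqueeze(0) / 255.0

def detect_grasp(image: np.ndarray, net: torch.nn.Module) -> dict:
    """Grasp detection module: run the trained network and keep the grasping
    scheme with the highest confidence as the final oriented triangular grasp."""
    with torch.no_grad():
        maps = net(preprocess(image))[0]          # (5, H, W): confidence + parameters
    conf = maps[0]
    row, col = np.unravel_index(conf.argmax().item(), conf.shape)
    return {
        "pixel": (int(row), int(col)),            # grasp point in the cropped image
        "confidence": float(conf[row, col]),
        "triangle_params": maps[1:, row, col].tolist(),  # assumed parameter layout
    }
```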



Abstract

The present disclosure proposes a pixel-level object grasping detection method and system for an asymmetrical three-finger grasper. An oriented triangular grasping model is designed for a grasper that cannot perform symmetrical grasping. Traditional convolution and atrous (dilated) convolution are used for grasp detection; to improve the scale invariance of the network, a spatial pyramid network obtains feature maps with different receptive fields, and a feature fusion unit fuses the low-level and high-level features of the network; to complete end-to-end grasp detection, a grasp model detection unit is designed to directly output each parameter of the oriented triangular grasping scheme, determine the grasping scheme, and effectively improve the detection accuracy.

Description

Technical field

[0001] The present disclosure relates to the technical field of robotics, and specifically to a pixel-level object grasping detection method and system for an asymmetrical three-finger grasper.

Background technique

[0002] The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.

[0003] Grasp detection is a technology that obtains a grasping scheme usable for actual grasping operations by a specified robot grasper. In domestic and industrial scenarios, grasping objects from a table is a very important and challenging step for robots performing autonomous or collaborative human-robot tasks. In general, robot grasping can be divided into three steps: grasp detection, trajectory planning, and execution. Grasp detection means that the robot obtains visual information about the target through RGB or RGBD cameras and then uses...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/08, B25J9/16, B25J15/10
CPC: B25J9/08, B25J9/1661, B25J9/1679, B25J9/1697, B25J15/103
Inventors: 常发亮, 王德鑫, 李南君, 刘春生, 赵子健
Owner: SHANDONG UNIV