
Three-dimensional grabbing platform based on deep learning and grabbing method

A deep learning, three-dimensional technology applied in the field of robotic arm grasping. It addresses problems such as increased cost, increased complexity of the grasping process, and the inability to adapt to different object postures, achieving the effects of low cost, improved efficiency and replaceability, and good precision and real-time performance.

Active Publication Date: 2020-04-03
ZHEJIANG UNIV +1
Cites: 12 · Cited by: 22

AI Technical Summary

Problems solved by technology

[0004] The above methods have shortcomings. The traditional teaching-based method of grasping with a robotic arm can only grasp a single object and cannot adapt to different postures of objects in complex scenes; at the same time, as the number of sensors increases, the cost also increases. Traditional grasping methods based on machine-vision manipulators often use only two-dimensional information while ignoring three-dimensional structural information. For example, the patent application with publication number CN106003119A discloses an object grasping method and object grasping system for a suction manipulator, and the patent application with publication number CN104048607A discloses a visual recognition and grasping method for a mechanical arm.
[0005] In addition, hand-eye calibration and manual feature extraction increase the complexity of the grasping process. Therefore, from the perspective of data processing and the complexity of manual feature engineering, these methods do not offer good real-time performance or convenience.



Examples


Embodiment Construction

[0024] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and do not limit its protection scope.

[0025] Figure 1 is a schematic diagram of the structure of the deep learning-based three-dimensional grasping platform provided by an embodiment of the present invention. Referring to Figure 1, the platform provided by the embodiment includes a control system 101, a belt system 102, a robot system 103, an acquisition system 104, and an object category and pose recognition system 105.
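The patent text does not prescribe concrete software interfaces for these subsystems. The following is only a minimal Python sketch of how the five components (control system 101, belt system 102, robot system 103, acquisition system 104, and category/pose recognition system 105) could interact; all class and method names, the placeholder point cloud, and the simplified path planning are assumptions made for illustration, not the patented implementation.

# Hypothetical component sketch (not the patented implementation): class names,
# method names, and placeholder data are assumptions used only for illustration.
import numpy as np


class BeltSystem:                                  # 102
    """Conveyor belt that can be paused while an object is grasped."""
    def stop(self) -> None: ...
    def start(self) -> None: ...


class AcquisitionSystem:                           # 104
    """Captures the raw point cloud in the field of view and preprocesses it."""
    def capture_object_cloud(self) -> np.ndarray:
        raw_cloud = np.random.rand(2048, 3)        # placeholder for a depth-camera read
        # Real preprocessing would crop the workspace and remove the belt plane;
        # this placeholder simply passes the points through.
        return raw_cloud


class RecognitionSystem:                           # 105
    """Runs the deep-learning category-and-pose model on the object point cloud."""
    def recognize(self, cloud: np.ndarray) -> tuple[str, np.ndarray]:
        category = "box"                           # stand-in for the network output
        pose = np.eye(4)                           # 4x4 object pose in the robot frame
        return category, pose


class RobotSystem:                                 # 103
    """Moves the arm along a planned path and closes the gripper."""
    def follow_path(self, path: list[np.ndarray]) -> None:
        for waypoint in path:
            pass                                   # send waypoint to the arm controller


class ControlSystem:                               # 101
    """Coordinates acquisition, recognition, path planning, and grasp execution."""
    def __init__(self) -> None:
        self.belt = BeltSystem()
        self.acquisition = AcquisitionSystem()
        self.recognition = RecognitionSystem()
        self.robot = RobotSystem()

    def grasp_once(self) -> str:
        self.belt.stop()
        cloud = self.acquisition.capture_object_cloud()
        category, pose = self.recognition.recognize(cloud)
        # Simplified planning: approach 10 cm above the object, then descend to it.
        grasp_point = pose[:3, 3]
        path = [grasp_point + np.array([0.0, 0.0, 0.1]), grasp_point]
        self.robot.follow_path(path)
        self.belt.start()
        return category


if __name__ == "__main__":
    print("Grasped object category:", ControlSystem().grasp_once())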

[0026] Among them, the acquisition system 104 is controlled by the control system 101 to acquire the original three-dimensional point cloud dat...


Abstract

The invention discloses a three-dimensional grabbing platform based on deep learning and a grabbing method. The three-dimensional grabbing platform comprises a control system, a driving belt system, a robot system, an acquisition system, and an object category and pose recognition system. The acquisition system acquires and preprocesses original three-dimensional point cloud data in the field of view to obtain three-dimensional point cloud data of an object and outputs it to the object category and pose recognition system. The object category and pose recognition system recognizes the three-dimensional point cloud data of the object by using an object category and pose recognition model constructed based on a deep learning network, determines the object category and pose information, and outputs them to the control system. The control system outputs a control instruction to the acquisition system so as to control acquisition of the original three-dimensional point cloud data in the field of view, plans a moving path according to the recognition result output by the object category and pose recognition system, and outputs a control instruction to the robot system according to the moving path to control a robot to grab the object.
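The abstract does not specify the architecture of the object category and pose recognition model beyond saying it is built on a deep learning network. As one hedged illustration only, a PointNet-style network that consumes the preprocessed object point cloud and jointly predicts a category distribution and a pose (here parameterized as translation plus quaternion) could look like the sketch below; the layer sizes, the two-head design, and all names are assumptions, not details taken from the patent.

# Illustrative PointNet-style recognizer (an assumption, not the patent's model):
# shared per-point MLP, symmetric max pooling, and two heads for category and pose.
import torch
import torch.nn as nn


class CategoryPoseNet(nn.Module):
    def __init__(self, num_categories: int = 10):
        super().__init__()
        # Per-point feature extraction via 1x1 convolutions over the point axis.
        self.point_mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU(),
            nn.Conv1d(128, 1024, 1), nn.ReLU(),
        )
        self.category_head = nn.Sequential(
            nn.Linear(1024, 256), nn.ReLU(), nn.Linear(256, num_categories)
        )
        self.pose_head = nn.Sequential(
            nn.Linear(1024, 256), nn.ReLU(), nn.Linear(256, 7)  # xyz + quaternion
        )

    def forward(self, points: torch.Tensor):
        # points: (batch, num_points, 3) -> (batch, 3, num_points) for Conv1d.
        per_point = self.point_mlp(points.transpose(1, 2))
        global_feature = torch.max(per_point, dim=2).values   # order-invariant pooling
        return self.category_head(global_feature), self.pose_head(global_feature)


if __name__ == "__main__":
    logits, pose = CategoryPoseNet()(torch.rand(2, 1024, 3))
    print(logits.shape, pose.shape)   # torch.Size([2, 10]) torch.Size([2, 7])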

Description

Technical Field
[0001] The present invention relates to the technical field of mechanical arm grasping, and more specifically to a three-dimensional grasping platform and grasping method based on deep learning.
Background Technique
[0002] With the increasing cost of labor, using robotic arms instead of manual sorting to realize object recognition and grasping has long been a focus of attention. For example, in the field of logistics robotic arms are used to sort packages, and in industrial settings they are used to realize functions such as loading and unloading.
[0003] The traditional teaching-based grasping method of the robotic arm relies on a variety of sensors, such as laser sensors and travel switches, to ensure good repeat positioning accuracy. By setting in advance the positions to be reached by the robotic arm, objects can be grasped according to a fixed grasping rhythm. The traditional grasping method based on a machine-vision manipulator obtains the coordinates of t...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16
CPC: B25J9/1602, B25J9/163, B25J9/1664
Inventors: 傅建中, 何权, 吴森洋, 王可钦, 褚建农
Owner: ZHEJIANG UNIV