Robot intelligent grabbing method based on digital twinning and deep neural network

A robot intelligence technology based on deep neural networks, applied to biological neural network models, neural architectures, manipulators, etc. It addresses problems such as the difficulty of grasping tasks and the inability of traditional methods to meet the needs of intelligent production, with the effects of ensuring grasp success rate and accuracy and improving learning ability and generalization.

Active Publication Date: 2021-02-02
ZHEJIANG UNIV


Problems solved by technology

[0004] However, with the trend toward Industry 4.0, the above methods have revealed the following problems: robots are not only required to perform repetitive tasks, but are also expected to complete complex tasks to a certain extent and to respond to environmental changes. When the objects to be grasped are placed in cluttered poses, grasping becomes quite difficult; that is, traditional open-loop control of the robotic arm has essentially no robustness to environmental changes, and it is difficult to cope with changing environments and the requirements of intelligent production.




Embodiment Construction

[0033] The present invention will be described in detail below with reference to the accompanying drawings and preferred embodiments, so that its purpose and effects become clearer. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.

[0034] As shown in Figure 1, the robot intelligent grasping method based on a digital twin and a deep neural network according to the present invention comprises a physical grasping environment, a virtual grasping judgment environment, and a grasping decision neural network.
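The three components described in [0034] can be sketched as a small pipeline: a generation network proposes candidate grasps from an RGB-D observation, a recognition (scoring) network judges them, and the best candidate is returned for the physical robot to execute. All class and parameter names below are illustrative assumptions, not the patent's actual implementation.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class GraspPose:
    """A candidate grasp: gripper position, orientation, and opening width."""
    position: np.ndarray   # (3,) position in the robot base frame
    rotation: np.ndarray   # (3, 3) gripper orientation matrix
    width: float           # parallel-jaw opening in metres


class GraspingPipeline:
    """Sketch of the three-part architecture: the physical environment supplies
    an RGB-D observation, the decision networks propose and score candidate
    grasps, and the highest-scoring pose is chosen for execution."""

    def __init__(self, generator, evaluator, virtual_env=None):
        self.generator = generator      # grasp-generation network
        self.evaluator = evaluator      # grasp-recognition (scoring) network
        self.virtual_env = virtual_env  # digital-twin replay environment

    def select_grasp(self, rgbd):
        candidates = self.generator(rgbd)                  # sample grasp poses
        scores = [self.evaluator(rgbd, g) for g in candidates]
        return candidates[int(np.argmax(scores))]          # best-scoring grasp
```

In practice the `virtual_env` (the virtual grasping judgment environment) would replay each candidate against the point-cloud reconstruction before committing the physical robot, which is what closes the loop that open-loop control lacks.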

[0035] As shown in Figure 2, the physical grasping environment includes a physical robot, a two-finger parallel adaptive gripper, a depth camera, and a collection of objects to be grasped; the robot and the two-finger parallel adaptive gripper are the main actuators for grasping, and are responsible ...
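The depth camera in the physical environment is what supplies the point cloud from which the virtual environment is built. A minimal sketch of that step, using the standard pinhole back-projection model (the intrinsics `fx, fy, cx, cy` are assumptions; in practice they come from the camera's calibration):

```python
import numpy as np


def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in metres) into an (N, 3) point cloud
    using the pinhole camera model: X = (u - cx) * Z / fx, and similarly
    for Y. Pixels with zero depth (no return) are dropped."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinate grids
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # keep only valid-depth points
```

The resulting cloud, combined with the recorded robot and gripper poses, is what the patent's virtual grasping judgment environment would reconstruct.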



Abstract

The invention discloses a robot intelligent grabbing method based on a digital twin and a deep neural network. The method comprises a physical grabbing environment, a virtual recognition environment, and a core neural network. The physical environment consists of a depth camera, a robot, a mechanical gripper, and the objects to be grabbed, and is the main executing mechanism for grasping. The virtual recognition environment is built from a point cloud file constructed by the depth camera together with the poses of the robot and gripper, and is a virtual representation of the robot state, gripper position, camera pose, and object placement. The core neural network comprises a grasp generation network and a grasp recognition network; candidate grasps are sampled and judged to produce an optimal grasp pose. With this method, the optimal grasp position and pose can be determined rapidly and efficiently from the color-depth image collected by the camera.
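The abstract's "the grabbing mode is sampled and judged" step implies enumerating candidate grasps before the recognition network scores them. A simple illustrative sampler over a planar grid of positions and discretized gripper rotation angles is sketched below; the grid scheme and names are assumptions, not the patent's exact sampling procedure.

```python
import numpy as np


def sample_grasp_candidates(depth, n_angles=8, stride=16):
    """Enumerate planar grasp candidates on a grid over the depth image.
    Each candidate is (row, col, angle): a gripper centre pixel plus an
    in-plane rotation. Only pixels with valid (positive) depth are used."""
    h, w = depth.shape
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    return [
        (r, c, float(a))
        for r in range(0, h, stride)
        for c in range(0, w, stride)
        if depth[r, c] > 0       # skip pixels with no depth return
        for a in angles
    ]
```

Each candidate would then be passed to the grasp recognition network, which judges its quality and lets the system pick the optimal pose.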

Description

Technical Field

[0001] The invention belongs to the field of digital twin intelligent manufacturing, and in particular relates to a robot intelligent grasping method based on a digital twin and a deep neural network.

Background

[0002] Digital twin technology makes full use of data such as physical models, sensor updates, and operation history; integrates multi-disciplinary, multi-physical-quantity, multi-scale, and multi-probability simulation processes; and completes the mapping of the physical world into the virtual world, thereby reflecting the whole life-cycle process of the corresponding physical equipment. A digital twin can be regarded as a digital mapping system of one or more important, interdependent equipment systems, and serves as the link of interaction and integration between the physical world and the virtual world.

[0003] With the development of Industry 3.0, the first automated robots undertook repetitive, boring, and low-intelli...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J 9/16, G06N 3/04
CPC: B25J 9/1612, B25J 9/1602, G06N 3/045
Inventors: 胡伟飞, 王楚璇, 刘振宇, 谭建荣
Owner: ZHEJIANG UNIV