Robot object grabbing system, method and device

A technology relating to robots and object grasping, applied in the field of robotics, which can solve the problems of low object grasping efficiency, high hardware cost, and requirements on the appearance of objects.

Pending Publication Date: 2020-07-07
ALIBABA GRP HLDG LTD

AI Technical Summary

Problems solved by technology

[0009] This application provides a robot object grabbing system to solve the problems of high hardware cost, low object grasping efficiency, and requirements on the appearance of objects in the prior art.



Examples


Example 1

[0092] Please refer to Figure 1, which is a schematic structural diagram of an embodiment of the robot object grabbing system provided by the present application. In this embodiment, the system includes a server 1 and a robot 2.

[0093] The server 1 is configured to obtain a training data set; learn an object grasping information recognition model from the training data set, where the model includes an object grasping feature extraction sub-network, an object category recognition sub-network, and a pose parameter recognition sub-network; and send the model to the robot 2.

[0094] The training data set includes a plurality of pieces of training data. Each piece of training data includes the correspondence among a training environment image, the category of at least one object in the training environment image, and the pose parameters of the at least one object relative to the camera coordinate system.
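The patent text above does not fix a concrete network architecture, so the following is only a minimal sketch of how a model with the three sub-networks of paragraph [0093] might be organized, assuming a PyTorch implementation; the ResNet-18 backbone, the layer sizes, and the 7-value pose encoding (translation plus quaternion) are illustrative assumptions rather than details from the application.

import torch
import torch.nn as nn
import torchvision.models as models


class ObjectGraspInfoModel(nn.Module):
    def __init__(self, num_categories: int, pose_dim: int = 7):
        super().__init__()
        # Object grasping feature extraction sub-network: a shared CNN backbone
        # (ResNet-18 here is an assumption) with its classification layer removed.
        backbone = models.resnet18(weights=None)
        self.feature_extractor = nn.Sequential(*list(backbone.children())[:-1])
        feature_dim = backbone.fc.in_features
        # Object category recognition sub-network.
        self.category_head = nn.Linear(feature_dim, num_categories)
        # Pose parameter recognition sub-network (pose relative to the camera frame).
        self.pose_head = nn.Linear(feature_dim, pose_dim)

    def forward(self, image: torch.Tensor):
        features = self.feature_extractor(image).flatten(1)
        category_logits = self.category_head(features)
        pose_params = self.pose_head(features)
        return category_logits, pose_params

In such a layout, the server 1 would train the whole model on the training data set (image, category, camera-frame pose) and then send the trained model to the robot 2, which only needs to run the forward pass.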

[0095] In the embodiment of the present application, in order to distinguish ...

Example 2

[0147] Please refer to Figure 5, which is a flowchart of an embodiment of the robot object grabbing method provided in this application. The execution subject of the method includes a robot. Since the method embodiments are basically similar to the system embodiments, the description is relatively brief; for relevant parts, please refer to the corresponding description of the system embodiments. The method embodiments described below are merely illustrative.

[0148] A method for grabbing an object by a robot provided in the present application includes:

[0149] Step S501: Collect the current environment image through the image collection device.

[0150] Step S503: Extract object grasping features from the current environment image through the object grasping feature extraction sub-network included in the object grasping information recognition model.

[0151] Step S505: Obtain the category of at least one object in the current environment image according to ...
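For illustration only, the following is a minimal sketch of the robot-side flow of steps S501 to S505 together with the continuation described in the abstract (pose recognition, camera-to-world conversion, and grasping), assuming the model sketched in Example 1; camera, camera_to_world and grasp_executor are hypothetical helpers, not components named in the application.

import torch


def grab_object(model, camera, camera_to_world, grasp_executor):
    # Step S501: collect the current environment image through the image
    # collection device.
    image = camera.capture()  # expected shape: (1, 3, H, W)

    with torch.no_grad():
        # Step S503: extract object grasping features; step S505 and onward:
        # obtain the category and the camera-frame pose parameters of at least
        # one object from those features.
        category_logits, pose_in_camera = model(image)
    category = category_logits.argmax(dim=-1)

    # Convert the pose parameters from the camera coordinate system to the
    # world coordinate system (see the transform sketch after the Abstract).
    pose_in_world = camera_to_world(pose_in_camera)

    # Grab the target object according to the world-frame pose parameters and
    # the recognized category.
    grasp_executor.grasp(category, pose_in_world)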

Example 3

[0171] Please refer to Figure 7, which is a schematic diagram of an embodiment of the robot object grabbing device of the present application. Since the apparatus embodiment is basically similar to the method embodiment, the description is relatively brief, and reference may be made to the corresponding description of the method embodiment for related parts. The apparatus embodiments described below are merely illustrative.

[0172] The present application further provides a robot object grabbing device, including:

[0173] An image acquisition unit 701, configured to acquire an image of the current environment through an image acquisition device;

[0174] The feature extraction unit 702 is configured to extract object grasping features from the current environment image through the object grasping feature extraction sub-network included in the object grasping information recognition model;

[0175] The object category identification unit 703 is configured to obtain the category of at lea...
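As a rough illustration of how the unit-based device (701, 702, 703 and the remaining units) might be composed, the sketch below wires the units together as plain Python objects; the unit interfaces (acquire, extract, identify) are assumptions made for the sake of the example.

class RobotObjectGrabbingDevice:
    """Illustrative composition of the units 701-703 described above."""

    def __init__(self, image_acquisition_unit, feature_extraction_unit,
                 object_category_identification_unit):
        self.image_acquisition_unit = image_acquisition_unit        # unit 701
        self.feature_extraction_unit = feature_extraction_unit      # unit 702
        self.object_category_identification_unit = object_category_identification_unit  # unit 703

    def identify_objects(self):
        image = self.image_acquisition_unit.acquire()
        features = self.feature_extraction_unit.extract(image)
        return self.object_category_identification_unit.identify(features)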



Abstract

The invention discloses a robot object grabbing system, method and device, a robot and electronic equipment. The robot acquires a current environment image through the image acquisition device, and extracts object grabbing features from the current environment image by using an object grabbing feature extraction sub-network included in the object grabbing information identification model; the category of at least one object in the current environment image is obtained according to the object grabbing features through an object category recognition sub-network of the model; through a pose parameter identification sub-network of the model, pose parameters of at least one object in the current environment image relative to a camera coordinate system are obtained according to the object grabbing features; the pose parameters under the camera coordinate system are converted into pose parameters under a world coordinate system; and a target object is captured according to the pose parameters under the world coordinate system and the type of at least one object in the current environment image. By means of the processing mode, the hardware cost of the robot can be effectively reduced, and the grabbing accuracy and efficiency are improved.
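The conversion from the camera coordinate system to the world coordinate system mentioned in the abstract can be expressed with 4x4 homogeneous transforms. The sketch below assumes that the camera's pose in the world frame is known (for example from robot or hand-eye calibration); the function names are illustrative, not part of the application.

import numpy as np


def pose_to_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    # Build a 4x4 homogeneous transform from a 3x3 rotation matrix and a
    # 3-element translation vector.
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


def camera_pose_to_world_pose(T_world_camera: np.ndarray,
                              T_camera_object: np.ndarray) -> np.ndarray:
    # The object pose in the world frame is the camera's world-frame pose
    # composed with the object's camera-frame pose.
    return T_world_camera @ T_camera_object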

Description

Technical field

[0001] The present application relates to the field of robot technology, and in particular to a robot object grabbing system, a robot object grabbing method and device, an object grabbing information recognition model building method and device, a robot, and electronic equipment.

Background technique

[0002] With the development of the service mobile robot industry, more and more robots are widely used in people's daily work and life, for example, food delivery robots in restaurants, express delivery robots, industrial parts assembly robots and so on.

[0003] At present, several commonly used methods for robots to grasp objects and their advantages and disadvantages are as follows.

[0004] Method 1. This method is an intelligent control method for manipulating a carrier robot arm through multi-point mapping. This method has the following disadvantages: 1) Due to the need to predict each joint on the robot arm, the complexity is high and the maneuverability is no...


Application Information

IPC(8): G06T7/70; G06K9/00; B25J9/16; B25J19/04
CPC: G06T7/70; B25J9/1612; B25J19/04; G06V20/10
Inventor 伊威李名杨古鉴邵柏韬
Owner ALIBABA GRP HLDG LTD