
Method for detecting the grabbing position of a robot target object

A technology relating to target objects and robots, applied in the field of robot target-object grasping-position detection, which addresses problems such as low efficiency and the inability to segment objects in complex backgrounds.

Active Publication Date: 2019-04-19
CLOUDMINDS SHANGHAI ROBOTICS CO LTD

AI Technical Summary

Problems solved by technology

[0004] In the process of realizing the present invention, it was found that in the prior art the automatic segmentation of the target object is based on the depth image, which cannot segment objects in complex backgrounds, and that automatically locating the grasping point is inefficient.




Embodiment Construction

[0060] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly, and will fully convey the scope of the present disclosure to those skilled in the art.

[0061] An embodiment of the present application provides a non-volatile computer storage medium. The computer storage medium stores at least one executable instruction, and the executable instruction can perform the method for detecting the grasping position of a robot target object in any of the above method embodiments.

[0062] Figure 1 is a flow chart of an embodiment of a method for detecting a grasping position of a robo...



Abstract

The invention relates to the technical field of robot autonomous grabbing, and particularly discloses a method and device for detecting the grabbing position of a robot target object, a computing device, and a computer storage medium. The method comprises the steps of: collecting a target RGB image and a target Depth image of a target object at different visual angles; inputting each target RGB image into a target object segmentation network to obtain the RGB pixel region of the target object in the target RGB image and the Depth pixel region of the target object in the target Depth image; inputting the RGB pixel region of the target object into an optimal grabbing position generation network to obtain an optimal grabbing position for grabbing the target object; inputting the Depth pixel region of the target object and the optimal grabbing position into a grabbing position quality evaluation network and calculating the score of the optimal grabbing position; and selecting the optimal grabbing position corresponding to the highest score as the global optimal grabbing position of the robot. Using the scheme of the invention, the robot can automatically grab the target object at the optimal grabbing position.
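The multi-view selection logic in the abstract can be sketched as a small pipeline. This is a minimal illustration, not the patent's implementation: `segment_target`, `generate_grasp`, and `score_grasp` are hypothetical placeholders standing in for the three neural networks (segmentation, optimal-grasp generation, and quality evaluation) that the abstract describes.

```python
# Sketch of the per-view grasp pipeline from the abstract: for each viewing
# angle, segment the target, propose an optimal grasp from the RGB region,
# score it from the Depth region, and keep the highest-scoring grasp overall.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class View:
    rgb: object    # target RGB image captured at one visual angle
    depth: object  # target Depth image registered to the same angle


def segment_target(rgb, depth):
    """Placeholder for the target-object segmentation network: returns the
    RGB pixel region and Depth pixel region of the target object."""
    return rgb, depth  # stand-in: treat the whole image as the target


def generate_grasp(rgb_region):
    """Placeholder for the optimal-grabbing-position generation network."""
    return (0, 0, 0.0)  # stand-in grasp pose (x, y, angle)


def score_grasp(depth_region, grasp):
    """Placeholder for the grabbing-position quality-evaluation network."""
    return float(len(str(depth_region)))  # stand-in score


def best_global_grasp(views: List[View]) -> Tuple[tuple, float]:
    """Run the pipeline on every view and select the grasp with the
    highest score as the global optimal grabbing position."""
    best, best_score = None, float("-inf")
    for v in views:
        rgb_region, depth_region = segment_target(v.rgb, v.depth)
        grasp = generate_grasp(rgb_region)
        score = score_grasp(depth_region, grasp)
        if score > best_score:
            best, best_score = grasp, score
    return best, best_score
```

In a real system each placeholder would be a trained network; the selection step at the end is the only part the abstract specifies exactly (argmax over per-view scores).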

Description

Technical field

[0001] The embodiments of the present invention relate to the field of robot autonomous grasping, and in particular to a method, device, computing device, and computer storage medium for detecting the grasping position of a robot target object.

Background technique

[0002] In the field of intelligent robots, autonomous grasping is a key capability, especially for home service robots and industrial robots. Traditional solutions to the problem of robot autonomous grasping mainly comprise two methods: the geometric analysis method and the data-driven reasoning method. The geometric analysis method has high manual complexity, and the data-driven reasoning method performs poorly in complex scenes.

[0003] With the advent of deep learning, great breakthroughs have been made in research on robotic autonomous grasping. Applying deep learning algorithms, current technology can realize that when the robot autonomously grabs the ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/10; G06T7/50; G06T7/90; G06N3/04; G06N3/08
CPC: G06N3/08; G06T7/10; G06T7/50; G06T7/90; G06N3/045; G06T7/11; G06T7/194; G06T2207/10024; G06T2207/10028; G06T2207/20081; G06T2207/20084; G06T7/70; B25J9/1697; B25J9/1669; G05B2219/39484; G06T1/0014
Inventor: 杜国光, 王恺, 廉士国
Owner: CLOUDMINDS SHANGHAI ROBOTICS CO LTD