
Robot grabbing method, terminal and computer readable storage medium

A robot grasping technology, applied in the field of computer vision, which addresses the problem that an inaccurately collected image prevents the robot from grasping the object stably and causes the grasp to fail, so as to ensure the stability of the predicted grasping position and improve the grasping success rate.

Active Publication Date: 2019-09-20
CLOUDMINDS SHANGHAI ROBOTICS CO LTD

AI Technical Summary

Problems solved by technology

[0004] The inventor found that the related technology has at least the following problem: at present, the robot determines the grasping position from the data of a collected image and grasps directly at the determined position. For objective reasons, the collected image may be inaccurate, so the robot cannot grasp the object stably and the grasp fails.

Method used


Examples


Embodiment Construction

[0026] In order to make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. A person of ordinary skill in the art will understand that many technical details are set forth in each embodiment so that the reader can better understand the present application; however, the technical solutions claimed in this application can still be realized without these technical details and with various changes and modifications based on the following embodiments.

[0027] The first embodiment of the present invention relates to a robot grasping method. The method is applied to a robot, which may be a stand-alone robotic arm or an intelligent robot equipped with a grasping arm. The specific flow of the robot grasping method is shown in Figure 1 and includes:

[0028] Step 101: Acquir...
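The remainder of this step-by-step description is cut off in this record. As a rough sketch only, the overall two-view flow that the abstract describes could be organized as below; the robot interface (capture_image, move_camera_within_radius, predict_grasp_position, grasp) and the 2 cm tolerance are assumptions made for illustration, not terms from the patent.

```python
import numpy as np

def grasp_with_consistency_check(robot, tolerance_m=0.02, preset_radius_m=0.05):
    """Sketch of the two-view grasp flow summarized in the abstract.

    1. Acquire a first image and predict a first grasping position.
    2. Acquire a second image from a viewpoint within a preset radius of the
       first acquisition position and predict a second grasping position.
    3. If the two predictions coincide within a tolerance, execute the grasp.
    """
    # Hypothetical robot API -- these helper names are assumed for the sketch.
    first_image, first_cam_pose = robot.capture_image()
    first_grasp = robot.predict_grasp_position(first_image, first_cam_pose)

    robot.move_camera_within_radius(center=first_cam_pose, radius=preset_radius_m)
    second_image, second_cam_pose = robot.capture_image()
    second_grasp = robot.predict_grasp_position(second_image, second_cam_pose)

    # Both grasp positions are assumed to be expressed in the robot base frame.
    if np.linalg.norm(np.asarray(first_grasp) - np.asarray(second_grasp)) <= tolerance_m:
        robot.grasp(first_grasp)
        return True
    return False  # predictions disagree, so the grasp is treated as unstable
```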



Abstract

The embodiment of the invention relates to the field of computer vision and discloses a robot grabbing method, a terminal and a computer readable storage medium. The robot grabbing method comprises: obtaining first grabbing position information of a to-be-grabbed object in a first image and second grabbing position information of the to-be-grabbed object in a second image, wherein the second image is acquired within a preset radius range centered on the acquisition position of the first image; judging, according to the first grabbing position information and the second grabbing position information, whether the first grabbing position and the second grabbing position are the same position; and executing a grabbing operation if the first grabbing position and the second grabbing position are the same position. In this embodiment, the stability of the predicted grabbing position can be ensured, and the probability of a successful grab is increased.
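Since the second image is acquired from a viewpoint within a preset radius of the first acquisition position, comparing the two predicted grabbing positions presumably requires expressing both in a common frame. The record available here does not spell out how this is done, so the following is only an assumed illustration using a standard camera-to-base transform and an illustrative distance tolerance.

```python
import numpy as np

def camera_to_base(point_cam, R_base_cam, t_base_cam):
    """Map a 3-D point from the camera frame into the robot base frame.

    point_cam  : (3,) point observed in the camera frame
    R_base_cam : (3, 3) rotation of the camera frame expressed in the base frame
    t_base_cam : (3,) camera origin expressed in the base frame
    """
    return R_base_cam @ np.asarray(point_cam) + np.asarray(t_base_cam)

def same_position(p1_base, p2_base, tolerance_m=0.02):
    """Treat two base-frame grasp points as identical if they lie within a tolerance."""
    return np.linalg.norm(np.asarray(p1_base) - np.asarray(p2_base)) <= tolerance_m
```

For instance, if the two base-frame grasp centers come out as (0.32, 0.10, 0.45) m and (0.33, 0.10, 0.44) m, their distance is about 1.4 cm, so a 2 cm tolerance would accept them as the same position and the grab would be executed; the 2 cm figure is purely illustrative, not from the patent.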

Description

Technical field

[0001] The embodiments of the present invention relate to the field of computer vision, and in particular to a robot grasping method, terminal, and computer-readable storage medium.

Background technique

[0002] With the continuous advancement of technology, intelligent robots have appeared. Intelligent robots usually need to interact with their environment, for example by grasping objects in the surrounding environment.

[0003] Common robot grasping methods mainly include geometric analysis methods and data-driven methods. The geometric analysis method analyzes the geometric structure of the object in the image, randomly samples grasping positions, and checks whether each grasping position satisfies the force closure condition, so as to determine a reasonable grasping position. The data-driven method relies on data of known grasping positions, and directly or indirectly infers the grasping position of the current object through the machine learni...
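As a toy illustration of the data-driven style contrasted above (not the method claimed by this patent), a grasp for a new object can be inferred from previously recorded grasps, for example by a nearest-neighbour lookup over object descriptors; the class and the descriptor representation below are assumptions made for the sketch.

```python
import numpy as np

class KnownGraspDatabase:
    """Toy data-driven grasp inference: nearest-neighbour lookup over
    (object descriptor, grasp position) pairs recorded from past grasps."""

    def __init__(self):
        self.descriptors = []   # feature vectors describing known objects
        self.grasps = []        # grasp positions that worked for those objects

    def add(self, descriptor, grasp_position):
        self.descriptors.append(np.asarray(descriptor, dtype=float))
        self.grasps.append(np.asarray(grasp_position, dtype=float))

    def infer(self, descriptor):
        """Return the stored grasp of the most similar known object."""
        query = np.asarray(descriptor, dtype=float)
        distances = [np.linalg.norm(query - d) for d in self.descriptors]
        return self.grasps[int(np.argmin(distances))]
```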

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): B25J9/16; B25J15/00; G06K9/00; G06N3/04; G06N3/08
CPC: B25J9/1661; B25J9/1692; B25J9/1697; B25J15/00; G06N3/08; G06V20/10; G06N3/045
Inventors: 杜国光 (Du Guoguang), 王恺 (Wang Kai), 廉士国 (Lian Shiguo)
Owner: CLOUDMINDS SHANGHAI ROBOTICS CO LTD