
Grabbing point acquisition method of robot stable grasping object based on monocular vision

A monocular-vision and acquisition-method technology, applied to instruments, computer components, character and pattern recognition, etc. It addresses the problem of a low grasping success rate, and achieves an improved grasping success rate, fast and stable grasping, and high execution efficiency.

Active Publication Date: 2019-03-22
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0004] In order to solve the above-mentioned problem in the prior art, namely the low success rate of a robot grasping unknown objects in an unstructured environment, one aspect of the present invention proposes a monocular-vision-based grasping point acquisition method for a robot to stably grasp an object, which includes:
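The first step named in the Abstract below is extracting the planar edge contour of the object from a single monocular color image. The following is a minimal sketch of that step, assuming an OpenCV-style pipeline (Gaussian blur, Canny edges, largest external contour); the thresholds, the helper name extract_object_contour, and the assumption that the largest contour belongs to the target object are illustrative choices, not details disclosed in the patent.

    # Hedged sketch: planar edge-contour extraction from a monocular color image.
    # Thresholds and the "largest contour = object" rule are illustrative only.
    import cv2
    import numpy as np

    def extract_object_contour(image_path: str) -> np.ndarray:
        """Return an (N, 2) array of pixel coordinates tracing the most
        prominent object's planar edge contour in a color image."""
        bgr = cv2.imread(image_path)
        gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)        # suppress sensor noise
        edges = cv2.Canny(blurred, 50, 150)                 # binary edge map
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            raise ValueError("no contour found in image")
        largest = max(contours, key=cv2.contourArea)        # assume object dominates
        return largest.reshape(-1, 2)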



Examples


Detailed Description of the Embodiments

[0045] In order to make the purpose, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0046] The application is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the related invention, not to limit it. It should also be noted that, for the convenience of description, only the parts related to the related invention...



Abstract

The invention belongs to the field of robot control, and in particular relates to a monocular-vision-based method for acquiring grasping points so that a robot can stably grasp an object, aiming to solve the problem of the low grasping success rate of robots for unknown objects in unstructured environments. The method comprises the following steps: obtaining a color image of the object and its environment with a monocular vision sensor and extracting the planar edge contour of the object; constructing the environment constraint domain of the robot grasping system from the four-fingered parallel manipulator and the planar edge contour of the object, obtaining the lowest points of the environment attraction domain, and thereby deriving several groups of candidate grasping points; inputting each group of candidate grasping points into a grasping point quality evaluation network to obtain the grasping point quality of each group; and outputting the group of grasping points corresponding to the maximum grasping point quality. The invention improves the rapidity, accuracy and reliability of grasping point identification, and improves the robot's grasping success rate for unknown objects in unstructured environments.
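As a hedged illustration of the candidate-scoring stage summarised above, the sketch below assumes each candidate group consists of four 2D grasping points (one per finger of the parallel manipulator), scores every group with a small quality-evaluation network, and returns the highest-scoring group. The network architecture, the 8-dimensional input encoding, and the names GraspQualityNet and select_best_grasp are assumptions for illustration; the patent does not specify them here.

    # Hedged sketch: score groups of candidate grasping points with a small
    # quality-evaluation network and keep the best group. Architecture and
    # input encoding are assumed, not taken from the patent.
    import numpy as np
    import torch
    import torch.nn as nn

    class GraspQualityNet(nn.Module):
        """Maps a flattened group of four 2D grasping points (8 values)
        to a scalar grasp-quality score in [0, 1]."""
        def __init__(self, in_dim: int = 8):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(in_dim, 64), nn.ReLU(),
                nn.Linear(64, 32), nn.ReLU(),
                nn.Linear(32, 1), nn.Sigmoid(),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.net(x)

    def select_best_grasp(candidate_groups: np.ndarray,
                          net: GraspQualityNet) -> np.ndarray:
        """candidate_groups: (G, 4, 2) array of G candidate groups.
        Returns the group with the highest predicted quality."""
        with torch.no_grad():
            feats = torch.as_tensor(candidate_groups, dtype=torch.float32)
            scores = net(feats.reshape(len(candidate_groups), -1)).squeeze(-1)
        return candidate_groups[int(scores.argmax())]

In practice such a network would be trained offline on labelled grasp outcomes; with untrained weights the sketch only illustrates the data flow from candidate groups to the selected grasp.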

Description

Technical field

[0001] The invention belongs to the field of robot control, and in particular relates to a monocular-vision-based method for obtaining grasping points for a robot to stably grasp an object.

Background technique

[0002] The automatic identification and stable, fast grasping of objects by robots plays a very important role in realizing industrial production automation and in the wide application of robots in industry, and is a prerequisite for robots to complete automatic assembly. At present, according to the nature of the object to be grasped, robot grasping systems are mainly divided into two directions: grasping systems based on an object model, and grasping systems for objects whose models are unknown. A grasping system based on an object model needs to obtain the geometric model of the object to be grasped in advance; its grasping accuracy is high, but the operating environment is very structured and the ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46
CPC: G06V20/10; G06V10/44
Inventor: 李小青, 钱扬, 李睿, 牛星宇, 刘永乐, 乔红
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI