
Method for grabbing object by mechanical hand based on image information

A technology combining image information and manipulators, applied in the field of robotics, which can solve problems such as the high complexity of grasping algorithms, inaccurate manipulator finger positions, and objects slipping out of the grasp.

Publication Date: 2011-07-20 (Inactive)
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0005] Grasping algorithms based on model reconstruction have high complexity, and in use, inaccurate positioning of the manipulator fingers can cause the object to slip out of the manipulator and make the grasp unstable; grasping methods based on offline learning require a grasping mapping function to be trained in advance, so their real-time performance is poor. To solve these problems, the purpose of the present invention is to provide a method for a manipulator to grasp objects based on image information.




Embodiment Construction

[0054] In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0055] As shown in figure 1, the method for quickly grasping objects with a manipulator based on image information mainly uses four modules: an image processing module 1, a convex hull calculation module 2, a minimum width calculation module 3 and a containment grasping area calculation module 4. First, the image processing module 1 extracts the edge contour of the object from the image information; secondly, the convex hull calculation module 2 fits the minimum convex polygon 21 enclosing the object contour; then, the minimum width calculation module 3 calculates the minimum width 31 of that convex polygon; finally, the containment grasping area calculation module 4 determines the relative position relation between the manipulator and the object when grasping, and guides the manipulator to grasp the object.
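To make these four steps concrete, here is a minimal sketch in Python, assuming OpenCV and NumPy as implementation tools; the library choice, the function names, and the simplified containment test standing in for module 4 are assumptions for illustration, not taken from the patent.

```python
# A minimal sketch of the four-module pipeline in [0055], assuming OpenCV and
# NumPy (not specified in the patent). The containment test in module 4 is a
# simplified stand-in, since the exact rule is not given in this excerpt.
import cv2
import numpy as np


def object_convex_hull(binary_image):
    """Modules 1 and 2: extract the object's edge contour from a binary image
    and fit the minimum convex polygon (convex hull) enclosing it."""
    contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)   # keep the largest region
    return cv2.convexHull(contour).reshape(-1, 2).astype(float)


def minimum_width(hull):
    """Module 3: minimum width of the convex polygon. For a convex polygon the
    minimum width is attained flush with one of its edges, so take the
    smallest, over all edges, of the farthest vertex distance to that edge."""
    n, best = len(hull), np.inf
    for i in range(n):
        p, q = hull[i], hull[(i + 1) % n]
        edge = q - p
        norm = np.linalg.norm(edge)
        if norm < 1e-9:
            continue
        # perpendicular distance of every vertex to the line through p and q
        dists = np.abs(edge[0] * (hull[:, 1] - p[1])
                       - edge[1] * (hull[:, 0] - p[0])) / norm
        best = min(best, dists.max())
    return best


def containing_grasp_possible(hull, max_gripper_opening):
    """Module 4 (hypothetical simplification): a containing grasp exists when
    the gripper's maximum opening exceeds the polygon's minimum width, so the
    fingers can close around the object along that direction."""
    return minimum_width(hull) < max_gripper_opening
```

Given a segmented binary image of the workpiece, `containing_grasp_possible(object_convex_hull(mask), opening)` would then answer whether the hand can close around the object without computing exact grasp-point positions.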



Abstract

The invention relates to a method for grabbing an object by a mechanical hand based on image information. The outline of the object is extracted from the image information, and a containing grabbing manner that ensures the object cannot escape from the fingers of the mechanical hand is calculated from the outline information. The method is characterized in that, by calculating a containing grabbing area, the direct calculation of exact grabbing-point positions is avoided, while the object is still guaranteed not to escape from the mechanical hand and is finally grabbed firmly. The method comprises the following steps: firstly, extracting the edge outline of the object from the image information by using an image processing module; secondly, fitting the minimum convex polygon surrounding the object outline by using a convex hull calculation module; thirdly, calculating the minimum width of the minimum convex polygon by using a minimum width calculation module; and finally, calculating the relative position relation between the mechanical hand and the object by using a containing grabbing area calculation module when the mechanical hand grabs the object, and guiding the mechanical hand to grab the object.
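As a further illustrative sketch, the relative position and orientation mentioned in the last step could be approximated from the convex hull alone: take the hull's vertex centroid as the grasp point and close the fingers across the minimum-width direction. This rule is an assumption for illustration and is not the patent's containing-grabbing-area computation.

```python
# Hypothetical sketch: derive a (grasp point, closing direction) pair from the
# convex hull, assuming the hand closes across the minimum-width direction.
# This is an illustrative guess, not the patent's containing-grabbing-area rule.
import numpy as np


def grasp_pose(hull):
    """Return (centre, angle): the vertex centroid of the hull as the grasp
    point, and the closing direction (radians) perpendicular to the edge that
    realises the minimum width of the convex polygon."""
    hull = np.asarray(hull, dtype=float)
    n = len(hull)
    best_width, best_edge = np.inf, None
    for i in range(n):
        p, q = hull[i], hull[(i + 1) % n]
        edge = q - p
        norm = np.linalg.norm(edge)
        if norm < 1e-9:
            continue
        # farthest vertex distance to the line through this edge
        width = np.max(np.abs(edge[0] * (hull[:, 1] - p[1])
                              - edge[1] * (hull[:, 0] - p[0]))) / norm
        if width < best_width:
            best_width, best_edge = width, edge
    centre = hull.mean(axis=0)                       # vertex centroid as grasp point
    angle = np.arctan2(best_edge[0], -best_edge[1])  # normal to the best edge
    return centre, angle
```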

Description

Technical field

[0001] The present invention belongs to the technical field of robots and relates to a method for grabbing an object by a manipulator based on image information.

Background technique

[0002] In industrial production activities, using a manipulator to grab workpieces placed on workbenches or conveyor belts is a basic process for completing sorting, handling or assembly tasks. For example, on an automobile engine production line, a manipulator is needed to pick connecting rods out of a pile of disordered workpieces and assemble them into the engine. On a bolt processing line, nuts that do not meet specifications need to be sorted off the conveyor belt. On a toy production line, toys of different shapes need to be moved into different packing boxes. In daily life, there are also a large number of practical service robots; these service robots need to intelligently grasp objects placed in different positions according to the user's instructions, su...

Claims


Application Information

IPC(8): B25J13/00; B25J13/08
Inventors: 苏建华, 乔红, 区志财, 刘传凯
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI