
Visual guiding-based robot workpiece grabbing method

A vision-guidance robot technology in the field of robot visual recognition and positioning. It addresses the problems of increased computation, reduced efficiency, and missed grasps, and achieves the effects of reduced data volume, a low error rate, and elimination of interference.

Inactive Publication Date: 2019-10-11
TIANJIN POLYTECHNIC UNIV
Cites: 7 · Cited by: 32

AI Technical Summary

Problems solved by technology

The simplest way to locate a known target in an image is to search for a region with the same pixel distribution. This requires strict constraints on the pose and lighting conditions of the target workpiece, which are very difficult to enforce in a real production environment. At the same time, to avoid missed grasps, the camera is generally configured to image the same target workpiece multiple times, so the robot control system receives duplicate information, which increases the computational load and reduces efficiency.
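The duplicate-detection problem described above can be handled by suppressing repeated sightings of the same physical workpiece. The following is a minimal Python sketch, not taken from the patent: it assumes detections arrive sorted by time and uses illustrative distance and time thresholds to decide whether a new detection is a repeat of one already kept.

```python
import math

def deduplicate(detections, min_dist=5.0, max_dt=2.0):
    """Keep only the first sighting of each physical workpiece.

    detections: list of (t, x, y) tuples sorted by time t.
    A detection counts as a repeat if an earlier kept detection lies
    within min_dist of it (same position) and within max_dt seconds
    (same pass).  Both thresholds are illustrative assumptions, not
    values from the patent.
    """
    kept = []
    for t, x, y in detections:
        repeat = any(
            abs(t - kt) <= max_dt and math.hypot(x - kx, y - ky) <= min_dist
            for kt, kx, ky in kept
        )
        if not repeat:
            kept.append((t, x, y))
    return kept
```

A production system on a moving conveyor would additionally compensate each position for belt travel before comparing, as the patent's reference-point scheme implies.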



Detailed Description of the Embodiments

[0056] The present invention is described in further detail below with reference to the accompanying drawings:

[0057] Illumination changes and rotation of the workpiece itself alter its gray-level distribution in the global image. The present invention first uses an edge-extraction operator to obtain the continuous edge features of the target workpiece. To eliminate the interference of noise and illumination changes in the extracted edge-feature image, a search strategy based on edge-point distance is proposed (as shown in Figure 5). To account for rotational change, the minimum bounding rectangle of the workpiece is established to obtain the possible deflection angles of the target, a compensation template set is generated, and target recognition is completed through the search strategy. The conversion relationship between the vision system and the robot system is then established, repeated information is removed, and the positioning and grasping of the target workpiece are completed.
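The rotated-template search described above can be sketched in pure Python. This is an illustrative reconstruction, not the patent's implementation: edge features are represented as 2-D point sets, the "compensation template set" is built by rotating the base template through candidate deflection angles (in practice derived from the minimum bounding rectangle), and the match score is a simple mean nearest-neighbour distance standing in for the patent's edge-point-distance strategy.

```python
import math

def rotate_points(points, angle_deg, center=(0.0, 0.0)):
    """Rotate a set of 2-D edge points about a center (illustrative helper)."""
    a = math.radians(angle_deg)
    cx, cy = center
    return [(cx + (x - cx) * math.cos(a) - (y - cy) * math.sin(a),
             cy + (x - cx) * math.sin(a) + (y - cy) * math.cos(a))
            for x, y in points]

def edge_distance(template_pts, target_pts):
    """Mean nearest-neighbour distance from template edge points to target
    edge points -- a simple stand-in for an edge-point-distance match score
    (lower means a better match)."""
    total = 0.0
    for tx, ty in template_pts:
        total += min(math.hypot(tx - x, ty - y) for x, y in target_pts)
    return total / len(template_pts)

def best_match(template_pts, target_pts, candidate_angles):
    """Build a compensation template set from candidate deflection angles
    and return the angle whose rotated template best fits the target edges."""
    scored = [(edge_distance(rotate_points(template_pts, a), target_pts), a)
              for a in candidate_angles]
    return min(scored)[1]
```

A real system would run this over edge images extracted with an operator such as Canny and would restrict the candidate angles using the minimum bounding rectangle, rather than scanning a fixed list.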



Abstract

The invention provides a visual guiding-based robot workpiece grabbing method, and belongs to the field of robot visual recognition and positioning. The method comprises the following steps:
(1) establishing templates of the target workpieces in advance, collecting images of the target workpieces, and acquiring edge-feature images of the templates and of the target workpieces;
(2) acquiring the deflection angle of each target workpiece relative to its template from the minimum enclosing rectangle of the workpiece's edge-feature image, and building a compensation template set from the deflection angles and the template edge-feature images;
(3) matching the edge-feature images of the target workpieces against the compensation template set, recognizing the target workpieces, and acquiring their center coordinates in the visual coordinate system;
(4) converting from the visual coordinate system to the robot user coordinate system; and
(5) selecting a reference point and uniquely identifying each target workpiece by time and position to complete positioning and grabbing.
The target workpieces can be accurately recognized by utilizing the method.
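Step (4), converting vision coordinates to robot user coordinates, can be illustrated with a 2-D similarity transform fitted from calibration point pairs. This is a minimal sketch under assumptions not stated in the abstract: the mapping is modeled as rotation plus uniform scale plus translation (written compactly as z' = a·z + b over complex numbers) and fitted from just two calibration pairs; a real calibration would use more points, a least-squares fit, and separate lens-distortion correction.

```python
def fit_similarity(vision_pts, robot_pts):
    """Fit a 2-D similarity transform (rotation + uniform scale +
    translation) mapping vision-system coordinates to robot user
    coordinates from two calibration point pairs.

    Points are treated as complex numbers so the transform is
    z' = a*z + b, where a encodes rotation and scale, b translation.
    Illustrative sketch only -- not the patent's calibration procedure.
    """
    v1, v2 = (complex(*p) for p in vision_pts)
    r1, r2 = (complex(*p) for p in robot_pts)
    a = (r2 - r1) / (v2 - v1)   # rotation + uniform scale
    b = r1 - a * v1             # translation

    def convert(p):
        z = a * complex(*p) + b
        return (z.real, z.imag)

    return convert
```

For example, if the vision points (0, 0) and (1, 0) map to robot points (10, 10) and (10, 12), the fitted transform is a 90-degree rotation with scale 2 plus a translation, and any further vision coordinate can be converted through the returned function.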

Description

Technical Field
[0001] The invention belongs to the field of robot visual recognition and positioning, and in particular relates to a vision-guided method for grabbing workpieces with a robot.
Background Technique
[0002] In recent years, thanks to the country's vigorous development of the robot industry, robot technology has flourished and the demand for production-line automation has continued to grow; robots are therefore widely used in many links of industrial production. The traditional robot sorting process generally uses teaching or offline programming to control the robot's movement, which makes it difficult to adapt to the complex and changeable working environments of current industrial production. The emergence of vision technology gives robots a higher degree of intelligence and stronger environmental adaptability, and robots equipped with vision systems have been widely used in the fields of electronic appliances, au...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16
CPC: B25J9/1697
Inventors: 陈瀚宁, 何茂伟, 苏卫星, 梁晓丹, 刘芳, 孙丽玲, 薛永江
Owner: TIANJIN POLYTECHNIC UNIV