
Visual mechanical arm automatic grabbing method and system oriented to moving objects

A technology of mechanical arms and objects, applied in the field of industrial intelligent control, can solve the problems of low perception accuracy and poor real-time performance, and achieve the effect of reducing production costs and reducing costs

Pending Publication Date: 2021-03-19
GUANGDONG UNIV OF TECH
View PDF | 8 Cites | 5 Cited by


Problems solved by technology

[0008] To overcome the technical defects of poor real-time performance and low perception accuracy in existing automatic grasping methods for manipulators, the present invention provides a visual manipulator automatic grasping method and system for moving objects.



Examples


Embodiment 1

[0081] As shown in Figure 1, the visual manipulator automatic grasping method for moving objects includes the following steps:

[0082] S1: Calibrate the grasping point of the manipulator in the two-dimensional image;

[0083] S2: Use a target detection algorithm to identify and locate items, and use the position information of all detected items as input to a multi-target tracking deep learning algorithm, obtaining the positions of all target items in the image in real time;

[0084] S3: Generate a grabbing sequence list according to the grabbing priority of the target items, and, while performing real-time target tracking, use an object motion prediction algorithm to predict the grabbing point of the item currently being grabbed;

[0085] S4: Use an inverse kinematics solution to control the end of the robotic arm to move and track the object, so that the calibrated grabbing point coincides with the currently grabbed object in the two-dimensional image.
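Steps S3 and S4 above can be sketched minimally in code. The patent does not disclose its prediction model or its coincidence criterion, so the constant-velocity predictor, the priority ordering, and the pixel tolerance below are illustrative assumptions, not the claimed implementation:

```python
import math

def predict_grab_point(track, lookahead):
    """S3 (illustrative): constant-velocity prediction of a tracked object's
    next 2D image position, from its last two tracked positions. The patent
    does not specify the prediction model; constant velocity is an assumption."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = x1 - x0, y1 - y0
    return (x1 + vx * lookahead, y1 + vy * lookahead)

def grab_sequence(detections, priority):
    """S3: order detected items into a grabbing sequence list,
    lowest priority value grabbed first (ordering rule assumed)."""
    return sorted(detections, key=priority)

def coincides(calibrated_point, object_point, tol=2.0):
    """S4 termination check: the calibrated grabbing point coincides with
    the tracked object's image position within a pixel tolerance."""
    dx = calibrated_point[0] - object_point[0]
    dy = calibrated_point[1] - object_point[1]
    return math.hypot(dx, dy) <= tol
```

For example, an object tracked at (0, 0) then (1, 2) is predicted three frames ahead at (4, 8); the arm's tracking loop would run until `coincides` reports that the calibrated grabbing point has reached the object on the image plane.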

Embodiment 2

[0112] More specifically, on the basis of Embodiment 1 and as shown in Figure 6, the present invention also provides a visual robotic arm automatic grasping system for moving items, including a robotic arm, a controller, a processor, and a detection device; wherein:

[0113] The control end of the mechanical arm is electrically connected to the controller;

[0114] The detection device is arranged on the mechanical arm, the control end of the detection device is electrically connected to the controller, and the output end of the detection device is electrically connected to the processor;

[0115] The controller is electrically connected with the processor to realize information interaction; wherein:

[0116] The processor is provided with a target detection algorithm, a multi-target tracking deep learning algorithm, an object motion prediction algorithm, and an inverse kinematics algorithm; the specific operating principle of the system is as follows:

[0117] First, the detection eq...
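The controller/processor split described above can be sketched as two cooperating components. All class and method names here are hypothetical; the patent specifies only the electrical connections and which algorithms the processor hosts:

```python
class Controller:
    """Drives the arm from joint targets sent by the processor.
    (Hypothetical interface; the patent only states the electrical
    links between arm, controller, processor and detection device.)"""
    def __init__(self):
        self.joint_targets = []

    def command(self, joints):
        # In a real system this would stream set-points to the arm drives.
        self.joint_targets.append(joints)

class Processor:
    """Hosts detection, tracking, prediction and inverse kinematics,
    and forwards joint solutions to the controller."""
    def __init__(self, controller, ik_solver):
        self.controller = controller
        self.ik_solver = ik_solver  # maps a target point to joint angles

    def step(self, predicted_point):
        joints = self.ik_solver(predicted_point)
        self.controller.command(joints)
        return joints
```

Each processing cycle, the processor turns the predicted grabbing point into joint angles and hands them to the controller, matching the information-interaction link stated in [0115].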


PUM

No PUM

Abstract

The invention provides a visual mechanical arm automatic grabbing method oriented to moving objects, through which a plurality of target objects are recognized, located and tracked by means of deep-learning target detection and target tracking. The method includes the steps of: generating a grabbing object sequence list according to the grabbing priority of the target objects, and predicting a grabbing point of the currently grabbed object with an object motion prediction algorithm; controlling the tail end of the mechanical arm to move and track the object by means of an inverse kinematic solution, so that a calibrated grabbing point coincides with the currently grabbed object in the two-dimensional image; and obtaining the precise position of the currently grabbed target through laser ranging in cooperation with a mechanical arm inverse kinematics algorithm, so that automatic grabbing by the mechanical arm is achieved. The invention further provides a visual mechanical arm automatic grabbing system, which effectively solves the problem of multi-target grabbing precision: its visual sensing end can track the target objects in real time, so the system can be applied to grabbing of moving objects by the mechanical arm. The system needs only one monocular RGB camera and one laser ranging module, which reduces production cost.
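The abstract's cost-saving idea, recovering a 3D position from one monocular RGB pixel plus one laser range, can be sketched with the standard pinhole model. This assumes the laser measures distance along the camera ray through the pixel, which the patent does not state explicitly; the intrinsics (fx, fy, cx, cy) below are placeholder values:

```python
import math

def pixel_ray(u, v, fx, fy, cx, cy):
    """Unit direction of the camera ray through pixel (u, v),
    standard pinhole model with intrinsics fx, fy, cx, cy."""
    x, y = (u - cx) / fx, (v - cy) / fy
    n = math.sqrt(x * x + y * y + 1.0)
    return (x / n, y / n, 1.0 / n)

def point_from_range(u, v, rng, fx, fy, cx, cy):
    """3D point in the camera frame: the pixel's ray scaled by the
    laser-measured range (assumption: range is taken along the ray)."""
    rx, ry, rz = pixel_ray(u, v, fx, fy, cx, cy)
    return (rng * rx, rng * ry, rng * rz)
```

At the principal point the ray is the optical axis, so a 2 m laser reading at pixel (cx, cy) yields the camera-frame point (0, 0, 2.0). This is why one monocular camera plus one ranging module can replace an RGB-D or binocular sensor for the grabbing task.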

Description

Technical Field

[0001] The present invention relates to the technical field of industrial intelligent control, and more specifically, to a method and system for automatic grasping of a visual manipulator oriented to moving objects.

Background Technique

[0002] At present, automatic grasping by a manipulator mainly relies on hand-eye calibration between the camera and the manipulator: the coordinate relationship between the camera and the manipulator is obtained, the result of visual recognition is transferred into the robot coordinate system, and the manipulator is then controlled to grasp. This approach requires an RGB-D camera with a depth sensor or a binocular camera. An RGB-D camera has a short-range blind spot, and both RGB-D and binocular cameras are expensive.

[0003] To solve this problem, the existing solutions are as follows:

[0004] 1. An auxiliary grasping method based on visual recognition [1] Kalashnikov D, Irpan A, Pastor P, et al. QT-Opt: Scalable Deep...
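The hand-eye step in [0002], moving a visually detected point from camera coordinates into robot coordinates, is an application of a fixed 4x4 homogeneous transform obtained from calibration. A minimal sketch, assuming the transform matrix T is already known from calibration:

```python
def camera_to_robot(T, p):
    """Apply a 4x4 homogeneous hand-eye transform T (camera frame ->
    robot base frame) to a 3D point p given in the camera frame.
    T is a nested list [[r r r t], ..., [0 0 0 1]] from calibration."""
    x, y, z = p
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))
```

For example, with an identity rotation and a translation of (1, 2, 3), the camera-frame origin maps to (1, 2, 3) in the robot base frame; the manipulator is then commanded toward that transformed point.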

Claims


Application Information

Patent Timeline
No application data
Patent Type & Authority: Application (China)
IPC (8): B25J 9/16, B25J 19/02
CPC: B25J 9/1697, B25J 19/023, Y02P 90/02
Inventor: 苏萌韬
Owner: GUANGDONG UNIV OF TECH