
A robot grasping system based on depth vision and its using method

A depth-vision robotics technology, applied in the field of robotics, that addresses the problems of lubrication lag, environmental pollution from excess oil, and insufficient lubrication of the slide rail, while achieving easy adjustment and avoiding scanning blind spots.

Active Publication Date: 2022-07-08
山东润江智控科技有限公司

AI Technical Summary

Problems solved by technology

In the existing approach, the depth vision sensor is installed so that its structured light is projected 3 to 5 cm ahead of the front end of the robot tool. This mounting is intended to shield the sensor from the various harsh conditions on site, but it causes problems: the robot's running trajectory is constrained, since it must run in a direction perpendicular to the sensor's structured light, and the measurement is susceptible to changes in the industrial robot's own attitude.
[0003] In addition, the slide rail of the robot system's traveling axis (seventh axis) must withstand repeated high-speed, high-pressure friction during operation, so effective lubrication is an essential maintenance measure for keeping the slide rail running normally. Many lubricating devices exist in the current art, but problems remain in their use. For example, lubricating oil is usually added manually at fixed intervals, and the oil film on the rail surface is mostly inspected by eye. This approach tends to cause lubrication lag: the slide rail is not lubricated promptly and effectively, resulting in greater wear, and manual lubrication is time-consuming and labor-intensive. Moreover, existing devices apply pressure across the entire reservoir surface to force the oil out, which makes the oil-output rate and volume difficult to control; lubricating oil easily overflows in excess, wasting cost and polluting the environment.




Embodiment Construction

[0030] The following describes in detail the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or to elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are exemplary, used only to explain the present invention, and should not be construed as limiting the present invention.

[0031] As shown in Figures 1-7, the robot grasping system based on depth vision proposed in this embodiment includes a robot body. The robot body includes a mechanical arm 1; a gripper 2 is connected to the mechanical arm 1, and a rotating seat 3 is connected to the bottom of the mechanical arm 1. The bottom of the rotating seat 3 is connected to the base 4, and a seventh axis is provided at the bottom of the base 4. The seventh axis includes a ground rail 7, and the ground rail 7 is prov...



Abstract

The invention discloses a robot grasping system based on depth vision and a method of using the same. A seventh axis is arranged below the robot base; the seventh axis includes a ground rail, a slide rail is arranged on the ground rail, and a slider is slidably connected to the slide rail. The top of the slider is fixedly connected to a support table, and the support table is connected to a driving mechanism; the robot base is connected to the support table. The support table is also provided with a depth vision sensor; the depth vision sensor is connected to a lift, and the bottom of the lift is connected to the support table. The support table is further provided with a lubrication system. By installing the depth vision sensor and the robot on the same support table, the invention effectively avoids the scanning blind area caused by changes in the robot's posture. The lubrication system has an ingenious structural design: changes in a guide groove alter the relative position between a piston and a plunger, completing oil extraction and lubrication.
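The abstract's key claim, that mounting the sensor and robot on one support table avoids posture-induced blind spots, can be illustrated as a fixed rigid transform: because both are bolted to the same table, the sensor-to-robot-base transform is constant, so measurements map into the robot frame independently of the arm's posture. The sketch below is illustrative only; the transform values, function names, and frame conventions are assumptions, not taken from the patent.

```python
import numpy as np

# Illustrative sketch: when the depth vision sensor and the robot share
# one support table, the sensor-to-robot-base transform is a constant
# rigid transform. A point measured in the sensor frame maps into the
# robot frame with the same transform no matter how the arm moves.

def make_transform(rotation_deg, translation):
    """Homogeneous 4x4 transform: rotation about z, then translation."""
    a = np.radians(rotation_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation
    return T

# Assumed mounting: sensor 0.30 m in front of the robot base,
# rotated 90 degrees about the vertical axis (illustrative numbers).
T_base_sensor = make_transform(90.0, [0.30, 0.0, 0.0])

def to_robot_frame(point_sensor):
    """Map a sensor-frame point (x, y, z) into the robot base frame."""
    p = np.append(np.asarray(point_sensor, dtype=float), 1.0)
    return (T_base_sensor @ p)[:3]

p = to_robot_frame([0.0, 0.1, 0.0])  # → approx. [0.2, 0.0, 0.0]
```

Because `T_base_sensor` never changes as the arm's posture changes, no re-calibration is needed mid-run, which is why co-mounting removes the attitude-sensitivity problem described in the background section.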

Description

Technical Field

[0001] The invention relates to the field of robot technology, and in particular to a robot grasping system based on depth vision and a method of using the same.

Background Technique

[0002] In the field of robot applications, especially when moving in one direction over a long distance, the running trajectory is too long to teach directly, so the robot's theoretical trajectory must be calculated in advance and small-scale corrections applied to its running pose during operation. The usual current solution is to install a depth vision sensor on the robot tool and project the sensor's structured light 3 to 5 cm ahead of the tool's front end. During operation, the robot's trajectory is corrected according to the difference between the position of the measurement point and the current position of the robot tool. The depth vision sensor is installed 3 to 5 cm from the...
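The correction loop described in paragraph [0002] can be sketched in a few lines: compare the sensor's measured point against the tool's theoretical trajectory point and nudge the pose by a fraction of the offset. This is a hedged illustration, not the patent's implementation; the function name, the proportional gain, and the coordinate handling are all assumptions.

```python
# Illustrative sketch of small-scale trajectory correction: the pose is
# moved a fraction of the way toward the sensor measurement each cycle,
# keeping corrections small as the background section requires.

def correct_pose(theoretical_pose, measured_point, gain=0.5):
    """Return a corrected (x, y, z) pose.

    theoretical_pose: precomputed trajectory point for the robot tool
    measured_point:   point position reported by the depth vision sensor
    gain:             fraction of the offset applied per cycle (assumed)
    """
    return tuple(
        t + gain * (m - t)
        for t, m in zip(theoretical_pose, measured_point)
    )

# One correction cycle: the sensor's structured light lands 3-5 cm ahead
# of the tool, so each cycle nudges the pose toward the measured track.
pose = (100.0, 0.0, 50.0)      # theoretical trajectory point (mm)
measured = (100.0, 2.0, 50.0)  # measurement shows 2 mm drift in y
corrected = correct_pose(pose, measured)
print(corrected)  # (100.0, 1.0, 50.0)
```

A gain below 1.0 is one simple way to keep each correction "small-scale"; a real system would also have to transform the sensor measurement into the robot's coordinate frame, which this sketch omits.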

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J19/00, B25J5/02, B25J13/08
CPC: B25J19/0062, B25J5/02, B25J13/08
Inventors: 刘建勇, 李书振, 李鑫, 韩玉冰
Owner: 山东润江智控科技有限公司