Man, machine and object interaction mechanical arm teaching system based on RGB-D image

An RGB-D image technology applied in the field of human-machine-object interaction robotic arm teaching systems. It addresses problems such as the high dimensionality of joint-action data, the long training time of learning models and the resulting curse of dimensionality, with the effect of avoiding the curse of dimensionality.

Active Publication Date: 2019-11-05
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

The workload is high when teaching everyday life skills. In addition, the dimensionality of the joint-action data is high, and the training time of the learning model...

Examples

Embodiment Construction

[0041] Compared with traditional teaching methods, the RGB-D-image-based human-machine-object interaction manipulator teaching system and method proposed here offers a task-oriented, high-level and simple way of teaching. Instead of specifying each joint, it directly outputs abstract high-level actions: the user interacts with objects and selects actions in the RGB image space, and the manipulator is controlled to operate the corresponding objects in the actual space. Teaching can also be carried out in a simulation environment, so the approach is efficient, safe and low-loss, and data sharing can be realized.
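
The task-level abstraction described above can be pictured with a small data sketch. The action names and the TeachingStep record below are illustrative assumptions rather than the patent's own data structures; they only show how a multi-step teaching trajectory can be stored as (object, high-level action) selections instead of joint-space trajectories.

```python
# Illustrative sketch only (names are assumptions, not the patent's code):
# task-level teaching recorded as (object, high-level action) selections
# rather than joint-space trajectories.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Tuple

class HighLevelAction(Enum):
    REACH = auto()
    GRASP = auto()
    PLACE = auto()
    PUSH = auto()

@dataclass
class TeachingStep:
    object_label: str                         # object selected by clicking it in the RGB image
    object_pose: Tuple[float, float, float]   # position located from the point cloud
    action: HighLevelAction                   # action chosen from the discrete high-level set

# A multi-step interaction forms the teaching trajectory.
trajectory = [
    TeachingStep("cup", (0.45, -0.10, 0.02), HighLevelAction.REACH),
    TeachingStep("cup", (0.45, -0.10, 0.02), HighLevelAction.GRASP),
    TeachingStep("tray", (0.30, 0.25, 0.02), HighLevelAction.PLACE),
]
```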

[0042] In this embodiment, the following implementation is adopted:

[0043] The construction steps of the robot teaching platform are as follows:

[0044] Step (1): Call the libfreenect2 driver to obtain the camera image data, subscribe to the RGB image and the point cloud with a pixel size of 960×540, calibrate the RGB camera, the depth camera and the relative pose between the two, and e...
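
A minimal sketch of this subscription step is given below. It assumes the camera is exposed to ROS through the kinect2_bridge package (which is built on libfreenect2) and that its quarter-HD topics /kinect2/qhd/image_color and /kinect2/qhd/points carry the 960×540 RGB image and the registered point cloud; the node and callback names are illustrative and not taken from the patent.

```python
# Minimal sketch (assumes kinect2_bridge, which wraps libfreenect2, is running
# and publishing its quarter-HD 960x540 topics; names are illustrative).
import rospy
from sensor_msgs.msg import Image, PointCloud2

def on_rgb(msg):
    # 960x540 color image from the RGB camera
    rospy.loginfo_throttle(5, "RGB image: %dx%d", msg.width, msg.height)

def on_cloud(msg):
    # organized point cloud registered to the color image
    rospy.loginfo_throttle(5, "Point cloud: %dx%d", msg.width, msg.height)

if __name__ == "__main__":
    rospy.init_node("teaching_camera_listener")
    rospy.Subscriber("/kinect2/qhd/image_color", Image, on_rgb, queue_size=1)
    rospy.Subscriber("/kinect2/qhd/points", PointCloud2, on_cloud, queue_size=1)
    rospy.spin()
```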

Abstract

The invention belongs to the field of robot technology and application and relates to a man, machine and object interaction mechanical arm teaching system based on an RGB-D image. The teaching system recognizes objects on the basis of the RGB-D image. By means of the tf tree and the Kinect V2 point cloud information in a ROS system, the object and the mechanical arm are unified into the same coordinate frame so that the object can be located. In accordance with human behavior habits, the high-level actions of the mechanical arm are planned on the basis of MoveIt. During teaching, the type and pose of an object are obtained by selecting it in the operation interface, an action is then chosen from the set of high-level actions, the mechanical arm is controlled to operate the corresponding object in the actual space, and multi-step interaction forms the teaching trajectory. The teaching system achieves task-level intelligent man-machine-object interaction, can replace an actual robot system for teaching and learning, and has the characteristics of being efficient, convenient, safe and so on.
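
As a rough illustration of the pipeline summarized above, the sketch below transforms an object pose estimated in the Kinect V2 camera frame into the arm's base frame through the ROS tf tree and hands it to MoveIt as a motion target. The frame names, the planning-group name and the pose values are assumptions made for illustration, not details taken from the patent.

```python
# Rough sketch (frame names, group name and pose values are assumptions):
# unify the object seen by the Kinect V2 with the arm via the tf tree,
# then plan a high-level "reach" action with MoveIt.
import sys
import rospy
import tf2_ros
import tf2_geometry_msgs  # registers PoseStamped support for tf2 transforms
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("teach_object_to_arm")

tf_buffer = tf2_ros.Buffer()
tf2_ros.TransformListener(tf_buffer)

# Object pose as located in the camera's optical frame (illustrative values).
obj = PoseStamped()
obj.header.frame_id = "kinect2_rgb_optical_frame"
obj.header.stamp = rospy.Time(0)      # use the latest available transform
obj.pose.position.x, obj.pose.position.y, obj.pose.position.z = 0.10, 0.00, 0.80
obj.pose.orientation.w = 1.0

# Re-express the object pose in the arm's base frame using the tf tree.
obj_in_base = tf_buffer.transform(obj, "base_link", rospy.Duration(1.0))

# Plan and execute a motion toward the object with MoveIt.
group = moveit_commander.MoveGroupCommander("manipulator")
group.set_pose_target(obj_in_base)
group.go(wait=True)
group.stop()
group.clear_pose_targets()
```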

Description

Technical field

[0001] The invention belongs to the field of robot technology and application, and relates to a human-machine-object interactive manipulator teaching system based on RGB-D images.

Background technique

[0002] In the process of machine learning, human-computer interaction is usually an effective approach. The work related to interaction in robot skill acquisition mainly includes virtual reality (Virtual Reality, VR) teaching and offline programming. Both of these methods require a 3D scene model to be established in advance, and they perform poorly at real-time perception and adaptation to changing environments. An RGB-D camera solves this problem well: it directly obtains depth information of the three-dimensional space and presents it in the form of a point cloud. It is often used to observe and capture the behavior sequence of the trainer, which is then adapted to the robot joints, and it is the main way for robots to obtain environmental information. [...

Application Information

IPC(8): B25J9/00
CPC: B25J9/0081
Inventors: 刘冬, 丛明, 卢彬鹏, 邹强, 于洪华
Owner: DALIAN UNIV OF TECH