Method for measuring and calculating human-robot minimum distance in collaborative environment

A method for measuring and calculating the minimum human-robot distance in a collaborative environment, applied in the field of robotics, which can solve problems such as poor real-time performance, reduced precision, and a large amount of calculation

Active Publication Date: 2018-01-09
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

In a collaborative environment, when the position and posture of the human body change dynamically, the convex shapes used to approximate the human body also change, which is not conducive to real-time performance; 2) human-robot distance calculation based on planar images: a planar vision camera installed at the top of the collaborative space captures the scene image of the collaborative space, and the scene image of the collaborative space is obtained…

Examples

Example Embodiment

[0067] The technical solution of the present invention will be further described in detail below in conjunction with the accompanying drawings:

[0068] This specific embodiment discloses an iterative method for calculating the minimum human-robot distance in a collaborative environment, comprising the following steps:

[0069] S1: Use the position measurement and calibration system to determine the relative pose between the robot and the 3D vision sensor, and obtain the transformation between the robot base coordinate system and the 3D vision sensor coordinate system;
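
As an aside not taken from the patent text, the sketch below shows one common way the result of S1 is used: assuming the calibration yields a 4x4 homogeneous transform T_base_sensor from the sensor frame to the robot base frame, a point measured by the 3D vision sensor can be expressed in robot base coordinates. All names and numeric values are illustrative placeholders.

import numpy as np

def to_base_frame(T_base_sensor: np.ndarray, p_sensor: np.ndarray) -> np.ndarray:
    """Map a 3D point from the vision sensor frame into the robot base frame
    using the homogeneous transform obtained from the S1 calibration."""
    p_h = np.append(p_sensor, 1.0)       # homogeneous coordinates [x, y, z, 1]
    return (T_base_sensor @ p_h)[:3]

# Assemble a placeholder calibration result (rotation R, translation t) into T.
R = np.eye(3)                            # illustrative rotation
t = np.array([0.5, 0.0, 1.2])            # illustrative translation (metres)
T_base_sensor = np.eye(4)
T_base_sensor[:3, :3] = R
T_base_sensor[:3, 3] = t

# A human skeleton node as seen by the sensor, expressed in base coordinates.
p_sensor = np.array([0.1, -0.2, 1.5])
print(to_base_frame(T_base_sensor, p_sensor))   # -> [0.6, -0.2, 2.7]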

[0070] S2: Based on the relative position relationships between the robot's parts and the rotation relationships of the parts, establish a reference coordinate system for each part, with its origin at a key motion node of the robot; then, based on the relative pose relationships between the robot's key motion nodes, establish the robot's generalized motion model…
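
To illustrate the kind of generalized kinematic model S2 describes, the sketch below chains per-part reference frames whose origins are the key motion nodes, with each part rotating about its own z-axis by a joint angle. This is an assumed simplification for illustration, not the patent's actual model; the link offsets and angles are placeholders.

import numpy as np

def rot_z(theta: float) -> np.ndarray:
    """Rotation matrix about a part's local z-axis (its assumed rotation axis)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def node_positions(joint_angles, link_offsets):
    """Chain the per-part frames and return the key motion node coordinates
    in the robot base frame (one row per node, starting at the base)."""
    T = np.eye(4)
    nodes = [T[:3, 3].copy()]
    for theta, offset in zip(joint_angles, link_offsets):
        step = np.eye(4)
        step[:3, :3] = rot_z(theta)      # rotation of this part about its axis
        step[:3, 3] = offset             # offset to the next key motion node
        T = T @ step
        nodes.append(T[:3, 3].copy())
    return np.array(nodes)

# Illustrative two-part chain: angles in radians, offsets in metres.
print(node_positions([0.3, -0.6],
                     [np.array([0.0, 0.0, 0.4]),
                      np.array([0.3, 0.0, 0.0])]))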

Abstract

The invention discloses a method for measuring and calculating the human-robot minimum distance in a collaborative environment. According to the relative position relations and the rotation axes of the parts of a robot, a generalized motion model of the robot is established, and a robot motion-node iteration data set for the minimum human-robot distance is constructed. Meanwhile, image data of the human-robot collaboration space are collected through a 3D vision sensor, so that the human body in the collaborative environment is tracked and identified; the skeleton node data of the human body are extracted, and a human skeleton-node iteration data set for the minimum human-robot distance is constructed. From the obtained iteration data sets, the minimum distance between the human body and the robot in the collaborative environment and the space coordinates of the corresponding points are calculated iteratively. With this method, an accurate and easy-to-implement human-robot distance calculation model is built in real time, after which the human-robot minimum distance and the corresponding points in the collaborative environment are calculated in real time. The safety of the robot can be improved, and the human-robot minimum distance in the environment can be tracked efficiently in real time.
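
To make the minimum-distance computation concrete, here is a minimal sketch that assumes the robot and the human skeleton are each approximated by line segments between consecutive nodes from the two data sets; it searches all segment pairs with a standard closest-point-between-segments routine and returns the minimum distance together with the corresponding points. The patent's own iteration scheme is not reproduced here, and all names and coordinates are illustrative.

import numpy as np

def seg_seg_closest(p1, q1, p2, q2, eps=1e-9):
    """Minimum distance between segments [p1, q1] and [p2, q2],
    plus the closest point on each segment (standard closest-point method)."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e, f = d1 @ d1, d2 @ d2, d2 @ r
    if a <= eps and e <= eps:                      # both segments degenerate
        return np.linalg.norm(p1 - p2), p1, p2
    if a <= eps:                                   # first segment is a point
        s, t = 0.0, np.clip(f / e, 0.0, 1.0)
    else:
        c = d1 @ r
        if e <= eps:                               # second segment is a point
            s, t = np.clip(-c / a, 0.0, 1.0), 0.0
        else:
            b = d1 @ d2
            denom = a * e - b * b
            s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > eps else 0.0
            t = (b * s + f) / e
            if t < 0.0:
                t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
            elif t > 1.0:
                t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    c1, c2 = p1 + s * d1, p2 + t * d2
    return np.linalg.norm(c1 - c2), c1, c2

def min_human_robot_distance(robot_nodes, human_nodes):
    """Brute-force search over robot link segments and human bone segments."""
    best = (np.inf, None, None)
    for rp, rq in zip(robot_nodes[:-1], robot_nodes[1:]):
        for hp, hq in zip(human_nodes[:-1], human_nodes[1:]):
            d, cr, ch = seg_seg_closest(rp, rq, hp, hq)
            if d < best[0]:
                best = (d, cr, ch)
    return best      # (minimum distance, point on robot, point on human)

# Illustrative node sets in the robot base frame (metres).
robot_nodes = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.4], [0.3, 0.0, 0.7]])
human_nodes = np.array([[0.8, 0.1, 0.0], [0.8, 0.1, 0.9], [0.6, 0.1, 1.3]])
print(min_human_robot_distance(robot_nodes, human_nodes))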

Description

Technical field

[0001] The invention relates to the technical field of robots, and in particular to a method for measuring and calculating the minimum human-robot distance in a collaborative environment.

Background technique

[0002] In order to make up for the lack of flexibility and intelligence of traditional robots and to adapt robots to broader and more complex task requirements, collaborative robot technology has emerged. The human-robot collaboration and integration mechanism combines the "computing power, strength, precision, and repeatability" of robots with the "flexibility, experience, knowledge, analysis, and decision-making capabilities" of humans to achieve complementary advantages.

[0003] Compared with traditional industrial robots, collaboration and integration between humans and robots removes the isolation protection between them, combines the advantages of both, and increases flexibility and safety…

Application Information

IPC(8): G06T7/73, G06T7/10
Inventors: 王政伟, 甘亚辉, 戴先中
Owner: SOUTHEAST UNIV