
Kinect-based Darwin robot joint mapping analysis method

A technology concerning robot joints and analysis methods, applied in the field of Kinect-based Darwin robot joint mapping analysis

Active Publication Date: 2017-07-21
SOUTH CHINA UNIV OF TECH
Cites: 7 | Cited by: 13

AI Technical Summary

Problems solved by technology

Because human body joints and humanoid robots have different degrees of freedom, and because of the physical constraints of humanoid robots, it is difficult for a humanoid robot to directly imitate human behavior.

Method used



Examples


Embodiment

[0075] This embodiment discloses a Kinect-based Darwin robot joint mapping analysis method. The analysis method first uses Kinect to collect human joint skeleton data, then, after joint analysis, maps the Kinect human joint skeleton data onto the joints of the Darwin robot, so that the Darwin robot imitates human actions as closely as possible. The method specifically includes the following steps:

[0076] S1, using Kinect to collect the skeleton data of the human body joint points P;

[0077] In this step, OpenNI can be used to read the 24 joint point data provided by Kinect, namely: head, neck, torso, waist, left collar, left shoulder, left elbow, left wrist, left hand, left fingertip, right collar, right shoulder, right elbow, right wrist, right hand, right fingertip, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle and right foot. In practical applications only 15 of these joint data are valid, namely: head, neck, torso, left shoulder, ...
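
The abstract indicates that, after acquisition, the joint positions are compensated and passed through a sliding-average filter before the unit joint vectors are built. Below is a minimal Python sketch of such a per-joint sliding-average filter; the class name JointFilter, the 5-frame window and the sample joint names and coordinates are illustrative assumptions rather than values taken from the patent.

    import numpy as np
    from collections import deque

    class JointFilter:
        """Sliding-average filter over a stream of 3-D joint positions.

        One instance per tracked joint; the last window_size frames are
        averaged to smooth Kinect measurement noise (illustrative sketch).
        """

        def __init__(self, window_size=5):
            self.window = deque(maxlen=window_size)

        def update(self, position):
            """Add a new (x, y, z) sample and return the smoothed position."""
            self.window.append(np.asarray(position, dtype=float))
            return np.mean(self.window, axis=0)

    # Hypothetical usage: one filter per valid joint, fed frame by frame.
    filters = {name: JointFilter(window_size=5) for name in ("head", "neck", "torso")}
    smoothed_head = filters["head"].update((0.02, 1.61, 2.40))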



Abstract

The invention discloses a Kinect-based Darwin robot joint mapping analysis method. The analysis method comprises the following steps: 1) obtaining human skeleton information; 2) establishing a reference coordinate system and constructing skeleton vectors; 3) performing Euler rotation analysis; 4) performing gravity-center stability analysis; and 5) performing joint angle calculation. According to the method, 15 joint points of a human body are obtained by utilizing Kinect; after compensation and sliding-average filtering, 10 unit joint vectors are constructed in the reference coordinate system; Euler rotation analysis is performed according to the joint cascade mode of the Darwin robot, thereby obtaining the corresponding angles of the joints common to the Kinect skeleton and the Darwin robot; and finally the foot expansion joints of the Darwin robot are subjected to gravity-center stability analysis, thereby obtaining the foot joint angles for stable standing of the Darwin robot on one or both feet, so that action imitation from the Kinect to the Darwin robot is realized.
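
As an illustration of steps 2) and 3) above, the sketch below builds a unit joint vector from two filtered joint positions and extracts ZYX Euler angles from the rotation that aligns a reference axis with that vector. The Rodrigues-formula alignment, the ZYX convention, the reference axis and the sample coordinates are assumptions made for illustration; they are not necessarily the cascade decomposition used by the patent.

    import numpy as np

    def unit_joint_vector(parent_pos, child_pos):
        """Unit vector pointing from a parent joint to a child joint."""
        v = np.asarray(child_pos, dtype=float) - np.asarray(parent_pos, dtype=float)
        return v / np.linalg.norm(v)

    def rotation_aligning(a, b):
        """Rotation matrix rotating unit vector a onto unit vector b (Rodrigues formula)."""
        v = np.cross(a, b)
        c = np.dot(a, b)
        if np.isclose(c, -1.0):  # opposite vectors: 180-degree turn about any orthogonal axis
            axis = np.cross(a, [1.0, 0.0, 0.0])
            if np.linalg.norm(axis) < 1e-8:
                axis = np.cross(a, [0.0, 1.0, 0.0])
            axis /= np.linalg.norm(axis)
            return 2.0 * np.outer(axis, axis) - np.eye(3)
        k = np.array([[0.0, -v[2], v[1]],
                      [v[2], 0.0, -v[0]],
                      [-v[1], v[0], 0.0]])
        return np.eye(3) + k + k @ k / (1.0 + c)

    def zyx_euler(R):
        """ZYX (yaw, pitch, roll) Euler angles of a rotation matrix, in radians."""
        yaw = np.arctan2(R[1, 0], R[0, 0])
        pitch = np.arcsin(-np.clip(R[2, 0], -1.0, 1.0))
        roll = np.arctan2(R[2, 1], R[2, 2])
        return yaw, pitch, roll

    # Hypothetical shoulder and elbow positions after filtering (metres, Kinect frame).
    upper_arm = unit_joint_vector((0.20, 1.45, 2.30), (0.24, 1.20, 2.28))
    R = rotation_aligning(np.array([0.0, -1.0, 0.0]), upper_arm)  # reference: arm hanging straight down
    print(zyx_euler(R))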

Description

Technical field

[0001] The invention relates to the technical field of robot simulation, and in particular to a Kinect-based Darwin robot joint mapping analysis method.

Background technique

[0002] The humanoid robot is one of the important research directions in the field of robotics. Body perception, as a natural form of human-computer interaction, can recognize and convey human intentions very well during interaction and carries a very rich amount of information. Current research on robot technology is developing in the direction of intelligence and diversification. To meet the special requirements of various complex and harsh environments, a robot needs higher adaptability and a certain learning ability, and it needs to be able to move autonomously to designated places and to work in industrial, medical, aviation and national defense fields as well as in dangerous places. However, traditional robots are often only aimed at specific tasks...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F17/50
CPC: G06F30/20
Inventors: 邓晓燕, 郑镇城, 林灿光, 潘文俊, 张广涛
Owner: SOUTH CHINA UNIV OF TECH