
A Human Pose Mapping Method Applied to Action Imitation of Humanoid Robot

A human pose mapping method for humanoid robots, in the field of human-computer interaction. It addresses the lack of accurate joint-angle analysis in prior work, which limits how closely a humanoid robot can imitate human posture and motion, and achieves fast computation, low computational cost, and improved imitation similarity.

Active Publication Date: 2019-12-10
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

At present, most research using geometric-analysis methods does not combine the characteristics of the human body with the robot's joint structure to compute joint angles accurately, which to a certain extent limits how closely a humanoid robot can imitate human postures and actions.


Examples


Embodiment

[0037] This embodiment discloses a human body posture mapping method applied to humanoid robot action imitation, using a Kinect II as the depth camera and a Nao robot as the humanoid imitator onto which human posture is mapped.

[0038] The human body posture mapping method applied to humanoid robot action imitation is implemented as shown in the flow chart of Figure 1, and includes the following steps:

[0039] S1. Obtain the three-dimensional position information of human skeleton nodes through the Kinect II depth camera;

[0040] S2. Construct human skeleton vectors from the skeleton nodes, and establish virtual human joints from the skeleton vectors and the robot's joint structure, forming a human skeleton model;
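Step S2's bone-vector construction can be sketched as follows. This is a minimal illustration in Python, assuming 3-D node positions are already available; the function name and example coordinates are hypothetical, not taken from the patent:

```python
import math

def bone_vector(parent, child):
    """Unit vector pointing from a parent skeleton node to its child node.

    `parent` and `child` are (x, y, z) positions, e.g. as produced by the
    Kinect II body-tracking pipeline (coordinates here are illustrative).
    """
    v = [c - p for p, c in zip(parent, child)]
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# Example: a shoulder-to-elbow bone vector (made-up positions in metres)
shoulder = (0.0, 1.4, 2.0)
elbow = (0.2, 1.1, 2.0)
print(bone_vector(shoulder, elbow))
```

Normalizing each bone vector makes the subsequent angle computations depend only on limb directions, not on the subject's limb lengths.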

[0041] S3. Establish a link reference coordinate system based on each link of the human skeleton model;
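One common way to attach a reference frame to a link, consistent with step S3 though not necessarily the patent's exact construction, is to build a right-handed orthonormal basis from the bone direction and an auxiliary reference direction (e.g. the torso's up vector):

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def link_frame(bone, ref):
    """Right-handed orthonormal frame attached to a link.

    x-axis along the bone; z-axis perpendicular to the plane spanned by
    the bone and the reference direction; y-axis completes the frame.
    `ref` must not be parallel to `bone`.
    """
    x = normalize(bone)
    z = normalize(cross(x, ref))
    y = cross(z, x)
    return x, y, z
```

Expressing neighbouring bone vectors in such a frame turns the joint-angle extraction of the next step into a planar geometry problem.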

[0042] S4. Calculate the joint angles of the human body by using the human bone vector according to the structural ...
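The geometric analysis of step S4, together with the joint-limit step described in the abstract, can be illustrated as computing the angle between adjacent bone vectors and saturating it to the robot's joint range. The limit values below are illustrative placeholders, not Nao specifications:

```python
import math

def joint_angle(u, v):
    """Angle in radians between two adjacent bone vectors (dot product)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    # Clamp the cosine to avoid domain errors from floating-point round-off.
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def apply_joint_limit(angle, lo, hi):
    """Saturate a mapped angle to the robot joint's mechanical range."""
    return max(lo, min(hi, angle))

# Hypothetical elbow example: 90 degrees between upper arm and forearm,
# clamped to an assumed joint range (values are illustrative only).
theta = joint_angle((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(apply_joint_limit(theta, 0.03, 1.54))
```

Clamping before commanding the robot ensures that a human pose outside the robot's mechanical range maps to the nearest reachable configuration instead of raising an actuator fault.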



Abstract

The invention discloses a human body posture mapping method applied to action imitation of a humanoid robot. The method comprises the following steps: 1) the three-dimensional positions of human skeleton nodes are acquired through a depth camera; 2) human skeleton vectors are constructed from the skeleton nodes, and virtual human joints are established from the skeleton vectors and the robot joint structure to form a human skeleton model; 3) a link reference coordinate system is established based on the links of the human skeleton model; 4) the mapped human joint angles are calculated from the link skeleton vectors according to the structural characteristics of the robot joints; and 5) the mapped joint angles are applied to the robot joints subject to each robot joint's angle limits. The method has the advantage that, by adopting a geometric analysis based on link skeleton vectors and virtual human joints, the mapped joint angles can be calculated accurately and the human posture mapped onto the robot through these angles, so the computational cost is low and the accuracy is high.

Description

technical field

[0001] The invention relates to the technical field of human-computer interaction, in particular to a human body posture mapping method applied to motion imitation of a humanoid robot.

Background technique

[0002] In recent years, robot technology has developed rapidly and is increasingly applied in fields such as industry, medicine, scientific research, education and training, and daily family life. At the same time, increasingly complex and diverse application environments place higher demands on the adaptability and intelligence of robots. Robot imitation learning can improve learning efficiency, raise robot intelligence, and free developers from heavy programming work. An important part of imitation learning is to properly represent the acquired human teaching information and apply it to the robot. The representation step needs to establish a way to represent the taught actions and map the obs...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16
CPC: B25J9/1605; B25J9/1612; B25J9/1697
Inventor: 张智军, 牛雅儒
Owner: SOUTH CHINA UNIV OF TECH