Human body motion posture capturing method and system

A technology for capturing human motion and posture, applied in the field of motion capture, which addresses the problems of the large number of sensors required and the high cost of capture.

Active Publication Date: 2020-05-15
EZHOU INST OF IND TECH; HUAZHONG UNIV OF SCI & TECH +1
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] Aiming at the problems existing in the prior art, the embodiments of the present invention provide a method and system for capturing human body movements and postures, which are used to solve the problem of high capture cost caused by the large number of sensors required in the prior art.


Examples


Embodiment 1

[0059] The present embodiment provides a method for capturing human body movements and postures, characterized in that the method includes:

[0060] S110. Obtain the original data of each data collection node, and determine the spatial orientation and spatial position of each data collection node according to a complementary filtering algorithm;
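The complementary filtering algorithm referenced in S110 is not spelled out in this summary. As a minimal sketch, assuming the usual formulation (gyroscope integration corrected by an accelerometer tilt estimate, with an illustrative blend factor alpha; magnetometer heading correction and position estimation are omitted), one update step could look like the following. All names and the 0.98 default are assumptions for illustration, not the patent's specification.

import math

def complementary_filter(prev_angles, gyro, accel, dt, alpha=0.98):
    # One update step of a basic complementary filter (illustrative sketch).
    # prev_angles: (roll, pitch) estimate from the previous step, in radians.
    # gyro:        (gx, gy, gz) angular rates in rad/s from the gyroscope.
    # accel:       (ax, ay, az) accelerations from the accelerometer.
    # dt:          sampling interval in seconds.
    # alpha:       blend factor; values closer to 1 trust the gyroscope more.
    roll_prev, pitch_prev = prev_angles
    gx, gy, gz = gyro
    ax, ay, az = accel

    # Short-term estimate: integrate the gyroscope rates.
    roll_gyro = roll_prev + gx * dt
    pitch_gyro = pitch_prev + gy * dt

    # Long-term reference: tilt angles derived from gravity seen by the accelerometer.
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))

    # Blend: the gyroscope dominates at high frequency, the accelerometer corrects drift.
    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    return roll, pitch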

[0061] Before the original data of each data acquisition node is obtained, the data acquisition nodes must first be determined. In order to reduce the number of sensors used, the present application sets up six data acquisition nodes: head node 1, trunk node 2, first forearm node 3, second forearm node 4, first lower leg node 5 and second lower leg node 6; each data acquisition node is indicated by the corresponding mark in Figure 2.
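For illustration only, the six nodes listed in paragraph [0061] can be represented as a small enumeration; the names below are assumptions chosen to match the paragraph, not identifiers from the patent.

from enum import IntEnum

class DataNode(IntEnum):
    # The six data acquisition nodes of paragraph [0061], numbered as in Figure 2.
    HEAD = 1
    TRUNK = 2
    FIRST_FOREARM = 3
    SECOND_FOREARM = 4
    FIRST_LOWER_LEG = 5
    SECOND_LOWER_LEG = 6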

[0062] After determining the data acquisition nodes, a data acquisition module is bound to each data acquisition node. Each data acquisition module includes a six-axis IMU and a three-axis magnetic sensor.
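As a sketch of what one raw reading from such a module might contain (field names and units are assumptions, not taken from the patent):

from dataclasses import dataclass
from typing import Tuple

@dataclass
class RawSample:
    # Raw output of one data acquisition module: a six-axis IMU
    # (three-axis accelerometer + three-axis gyroscope) plus a
    # three-axis magnetic sensor, as listed in paragraphs [0062]/[0118].
    node: int                          # which data acquisition node produced the sample
    accel: Tuple[float, float, float]  # accelerometer reading, e.g. in m/s^2
    gyro: Tuple[float, float, float]   # gyroscope reading, e.g. in rad/s
    mag: Tuple[float, float, float]    # magnetic sensor reading
    timestamp: float                   # sampling time in seconds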

Embodiment 2

[0116] The present embodiment provides a system for capturing human body postures. As shown in Figure 6, the system includes: a plurality of data acquisition modules 51 and a terminal 52.

[0117] Before the data acquisition modules of the present application obtain the original data of each data acquisition node, the data acquisition nodes need to be determined. In order to reduce the number of sensors used, the present application sets up six data acquisition nodes: head node 1, trunk node 2, first forearm node 3, second forearm node 4, first lower leg node 5 and second lower leg node 6; each data acquisition node is indicated by the circled mark in Figure 2.

[0118] After determining the data acquisition nodes, a data acquisition module is bound to each data acquisition node. Each data acquisition module includes a six-axis IMU and a three-axis magnetic sensor. Each IMU includes: a three-axis accelerometer and a three-axis gyroscope.
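How the terminal consumes the module data is not detailed here; a minimal sketch of the wiring implied by Embodiment 2, reusing the complementary_filter and RawSample sketches above and assuming a 100 Hz sampling rate, might be:

def terminal_loop(sample_stream, estimators):
    # sample_stream yields RawSample objects coming from the data acquisition
    # modules 51; estimators maps a node id to its current (roll, pitch) estimate.
    for sample in sample_stream:
        prev = estimators.get(sample.node, (0.0, 0.0))
        estimators[sample.node] = complementary_filter(
            prev, sample.gyro, sample.accel, dt=0.01)  # dt is an assumed 100 Hz interval
    return estimators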



Abstract

The invention provides a human body motion posture capturing method and system. The method comprises the steps of: determining the spatial orientation and spatial position of each data acquisition node based on the original data of each data acquisition node; determining the spatial positions and spatial orientations of the joint nodes; calibrating the spatial orientation of each data acquisition node and the spatial orientation of each joint node; correspondingly calibrating the spatial position and spatial orientation of each joint node, the spatial position and spatial orientation of each data acquisition node, and the human body model; and capturing the action postures of the human body by using the human body model. In this way, the posture of a first joint node of the human body is calculated, and the posture of a second joint node is then calculated from the posture of the first joint node, so that capture of the human body action posture can be realized with only six data acquisition points; the number of sensors used is effectively reduced, and the cost of capturing human body action postures is reduced.
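The chaining step described at the end of the abstract (the second joint's posture derived from the first joint's posture) is, in effect, forward kinematics along the body model. A minimal sketch under that reading, with assumed names and a rotation-matrix representation, is:

import numpy as np

def chain_joint(parent_rot, parent_pos, local_rot, bone_vector):
    # Propagate pose down a kinematic chain: given the world-frame orientation
    # (3x3 rotation matrix) and position of a parent joint, the child's rotation
    # relative to the parent, and the parent-to-child bone offset expressed in
    # the parent frame, return the child's world-frame pose.
    child_rot = parent_rot @ local_rot
    child_pos = parent_pos + parent_rot @ np.asarray(bone_vector)
    return child_rot, child_pos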

Description

Technical field

[0001] The invention relates to the technical field of motion capture, and in particular to a method and system for capturing human motion postures.

Background technique

[0002] Motion capture records the movement of an object and reproduces it as a digital model. In recent years, with the development of computer data acquisition and sensor technology, motion capture has been widely used in games, entertainment, sports, the military, motion analysis, dance capture and virtual reality technology.

[0003] Inertial motion capture reproduces the three-dimensional posture of the human body by means of multiple worn inertial motion sensors.

[0004] In the prior art, a large number of sensors need to be used when capturing human body postures, resulting in high capture costs.

Contents of the invention

[0005] Aiming at the problems existing in the prior art, the embodiments of the present invention provide a method and system for capturing human body movements and postures, which are used to solve the problem of high capture cost caused by the large number of sensors required in the prior art.

Claims


Application Information

IPC(8): G06F3/01; G01C9/00
CPC: G06F3/011; G01C9/00
Inventor: 刘谦, 曾强
Owner: EZHOU INST OF IND TECH; HUAZHONG UNIV OF SCI & TECH