Three-dimensional human body movement data dividing method

A human motion data segmentation technology, applied in image data processing and instruments. It addresses problems of existing methods: time-consuming construction of training sets, difficulty of practical application, and segmentation results that depend heavily on the training data. The method achieves low computational complexity, a high degree of automation, and high computational efficiency.

Inactive Publication Date: 2007-06-06
ZHEJIANG UNIV


Problems solved by technology

The model-based motion data segmentation method must be built on a large amount of training data, and its segmentation results are strongly affected by that training data, so it is difficult to apply in practice.
[0004] The method published in Autonomous Robots (2002, 12(1): 39-54) segments human arm motion data by detecting zero-crossing points of the joint-angle data. This method is very simple to implement, but the accuracy of its segmentation results is not high.
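The zero-crossing detection described above can be sketched in a few lines. The sketch below assumes a one-dimensional joint-angle signal sampled once per frame; the array is illustrative data, not from the cited paper:

```python
import numpy as np

def zero_crossings(theta):
    """Indices of frames where a joint-angle signal changes sign."""
    s = np.sign(theta)
    # a sign change between consecutive frames makes the product negative
    return np.flatnonzero(s[:-1] * s[1:] < 0) + 1

angle = np.array([0.3, 0.1, -0.2, -0.4, 0.1])  # toy joint-angle samples
print(zero_crossings(angle))  # -> [2 4]
```

As the patent notes, such cut points are cheap to find but say nothing about whether the behavior actually changed, which is why accuracy is low.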
The method published in the Proceedings of Graphi...

Method used



Examples


Embodiment 1

[0103] As shown in Figure 5, this example performs ISOMAP dimensionality reduction and precise segmentation on a three-dimensional motion sequence containing two human behaviors. The concrete steps of this example, carried out with the method of the present invention, are described in detail as follows:

[0104] (1) Obtain a 3D human animation sequence generated by an optical motion capture system or professional animation production software. The data in this example comes from an optical motion capture system (TRC data format) and includes two behaviors, normal walking and sideways walking, with a natural transition between them;

[0105] (2) With the original TRC-format motion data captured in step (1) as input, an existing motion data conversion method converts the TRC data into the rotation data representation format defined by the present invention, with 16 articulation points and the translation and r...
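The ISOMAP dimensionality reduction applied in this embodiment can be sketched from scratch: build a k-nearest-neighbor graph over the frames, approximate geodesic distances with Floyd-Warshall, then embed with classical MDS. This is a minimal sketch, not the patent's implementation; the synthetic 48-dimensional pose vectors (16 joints × 3 rotation channels) and the neighbor count are assumptions, not the TRC data of the example:

```python
import numpy as np

def isomap(X, n_neighbors=6, n_components=2):
    """Minimal ISOMAP: kNN graph -> geodesic distances -> classical MDS."""
    n = X.shape[0]
    # pairwise Euclidean distances between frames
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # keep only each frame's k nearest neighbours; other edges start at infinity
    G = np.full((n, n), np.inf)
    for i in range(n):
        idx = np.argsort(D[i])[: n_neighbors + 1]  # includes the frame itself
        G[i, idx] = D[i, idx]
    G = np.minimum(G, G.T)                          # make the graph symmetric
    # Floyd-Warshall shortest paths approximate geodesics on the motion manifold
    for k in range(n):
        G = np.minimum(G, G[:, k:k + 1] + G[k:k + 1, :])
    G[np.isinf(G)] = G[np.isfinite(G)].max()        # guard: disconnected graph
    # classical MDS on the squared geodesic distance matrix
    H = np.eye(n) - np.ones((n, n)) / n             # double-centering matrix
    B = -0.5 * H @ (G ** 2) @ H
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:n_components]      # largest eigenvalues first
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))

# toy "motion sequence": 60 frames on a smooth 1-D curve, lifted into a
# 48-dimensional pose space by a fixed random linear map (illustrative only)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 3.0, 60)
curve = np.column_stack([np.cos(2 * t), np.sin(2 * t), 0.5 * t])
X = curve @ rng.standard_normal((3, 48))

Y = isomap(X)
print(Y.shape)  # (60, 2)
```

Frames that are close in time and pose end up close in the low-dimensional embedding, which is what makes the later rough and precise segmentation steps possible.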

Embodiment 2

[0110] As shown in Figure 6, this example gives the result of segmenting a three-dimensional motion sequence containing various human behaviors. The concrete steps of this example, carried out with the method of the present invention, are described in detail as follows:

[0111] (1) This example uses the optical motion capture system to obtain an original 3D human motion sequence in TRC format containing 8 kinds of behaviors, such as walking, walking sideways, mopping the floor, squatting down to beat the ground, squatting down to scrub the ground, and standing and scrubbing windows, with natural transitions between the behaviors;

[0112] (2) With the original TRC-format motion data captured in step (1) as input, an existing motion data conversion method converts the TRC data into the rotation data representation format defined by the present invention, with 16 articulation points a...
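The final stage of the method refines each rough boundary with K-means clustering in the low-dimensional manifold space. A minimal sketch of that idea: take a window of embedded frames spanning two temporally adjacent segments, run 2-means seeded with the window's endpoints, and take the first frame assigned to the second cluster as the precise cut. The window data below is synthetic, and seeding with endpoints is an assumed simplification, not the patent's exact procedure:

```python
import numpy as np

def kmeans2(Y, iters=50):
    """Two-cluster Lloyd's algorithm, seeded with the window's endpoints."""
    C = np.stack([Y[0], Y[-1]])
    for _ in range(iters):
        d = np.linalg.norm(Y[:, None, :] - C[None, :, :], axis=-1)
        lab = d.argmin(axis=1)                       # nearest-centroid labels
        newC = np.stack([Y[lab == j].mean(axis=0) for j in (0, 1)])
        if np.allclose(newC, C):
            break
        C = newC
    return lab

def refine_cut(Y):
    """Precise boundary = first frame assigned to the second segment's cluster."""
    lab = kmeans2(Y)
    return int(np.argmax(lab == lab[-1]))

# toy window in the 2-D embedding: frames 0-39 belong to one behavior,
# frames 40-99 to another; the rough cut may fall anywhere in this window
rng = np.random.default_rng(1)
Y = np.vstack([rng.normal(0, 0.3, (40, 2)), rng.normal(5, 0.3, (60, 2))])
print(refine_cut(Y))  # -> 40
```

Because the two behaviors occupy separate regions of the manifold space, the cluster boundary recovers the true transition frame even when the rough cut was off by many frames.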



Abstract

The invention discloses a segmentation method for three-dimensional human motion data. First, the manifold analysis method ISOMAP maps the original human motion data sequence into a low-dimensional manifold space. Second, a heuristic method detects rough partition points between the different types of movement in the sequence, splitting the long human motion sequence into several segments. Finally, for every two temporally adjacent segments, the K-means clustering algorithm is applied in the low-dimensional manifold space to compute the precise partition point.
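The abstract does not spell out the heuristic used for rough partition detection. One simple stand-in, an assumption rather than the patent's actual heuristic, flags frames where the step length in the low-dimensional embedding is a statistical outlier, since a behavior change produces an abrupt jump on the manifold:

```python
import numpy as np

def rough_cuts(Y, thresh_sigma=3.0):
    """Flag frames whose low-dimensional step length is an outlier."""
    step = np.linalg.norm(np.diff(Y, axis=0), axis=1)  # frame-to-frame motion
    thr = step.mean() + thresh_sigma * step.std()
    return np.flatnonzero(step > thr) + 1

# toy embedding: two smooth segments joined by an abrupt jump at frame 50
t = np.linspace(0, 1, 50)
seg1 = np.column_stack([t, np.zeros(50)])
seg2 = np.column_stack([t, np.full(50, 4.0)])
Y = np.vstack([seg1, seg2])
print(rough_cuts(Y))  # -> [50]
```

Each flagged frame then serves as the center of a window handed to the K-means refinement stage.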

Description

Technical field

[0001] The invention relates to the fields of computer three-dimensional animation and multimedia data processing, and in particular to a method for segmenting three-dimensional human motion data.

Background technique

[0002] In recent years, with the widespread use of motion capture devices, a large amount of realistic 3D human motion data has been generated; these data are widely used in computer games, animation generation, medical simulation, and other fields. The human motion sequences obtained by an optical capture system are relatively long and often contain several continuous but different types of motion (such as walking, running, and jumping). Automatic segmentation of the different types of motion contained in long human motion sequences is therefore very useful for managing and reusing large amounts of 3D human motion data.

[0003] Data segmentation is a technique widely used ...

Claims


Application Information

IPC(8): G06T1/00
CPC: G06K9/00335; G06V40/20
Inventor: 庄越挺 (Zhuang Yueting), 肖俊 (Xiao Jun)
Owner: ZHEJIANG UNIV