
Extraction method of key frame of 3d human motion data

A human motion data processing and key frame extraction technology, applied in image data processing, electrical digital data processing, special data processing applications, etc., which addresses the low computing efficiency and accuracy of existing methods.

Inactive Publication Date: 2007-05-23
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

[0006] The present invention provides a highly efficient and automatic method for extracting key frames of three-dimensional human motion data, in order to overcome the low operational efficiency and accuracy of the above-mentioned existing methods.



Examples


Embodiment 1

[0087] As shown in Figure 4, an example of key frame extraction for a punching motion is given. The concrete steps of this example, implemented according to the method of the present invention, are described in detail below:

[0088] (1) Use an optical motion capture system to capture a section of punching motion data with a length of 100 frames;

[0089] (2) Taking the original TRC-format motion data captured in step (1) as input, use an existing motion data conversion method to convert the TRC data into the rotation data representation format with 16 articulation points that meets the definition of the present invention;
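
A minimal sketch of what such a rotation-format clip might look like in code, assuming NumPy; the patent fixes the joint count at 16, but the joint names below are illustrative assumptions, not the patent's:

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical 16-joint list: the patent specifies 16 articulation
# points but does not name them on this page.
JOINTS = [
    "root", "chest", "neck", "head",
    "l_shoulder", "l_elbow", "l_wrist",
    "r_shoulder", "r_elbow", "r_wrist",
    "l_hip", "l_knee", "l_ankle",
    "r_hip", "r_knee", "r_ankle",
]

@dataclass
class MotionClip:
    """Rotation-format motion data: one Euler-angle triple per joint
    per frame (degrees, as in BVH MOTION channels)."""
    rotations: np.ndarray  # shape (num_frames, 16, 3)

    @property
    def num_frames(self) -> int:
        return self.rotations.shape[0]
```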

[0090] (3) Based on the standard human body model and the rotational motion data obtained in step (2), calculate the angle between the limb bones and the central bone using formula (1) of the claims, forming an octet sequence Fi that represents the posture of the human body;
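
Formula (1) is not reproduced on this page, but the included angle between two bone vectors is conventionally the arccos of their normalized dot product. The sketch below assumes joint positions have already been recovered from the rotation data by forward kinematics, and assumes a particular choice of the eight limb bones and of the root-to-chest segment as the central bone; these names are illustrative:

```python
import numpy as np

def bone_angle(limb: np.ndarray, central: np.ndarray) -> float:
    """Included angle (radians) between a limb bone vector and the
    central bone vector: arccos of the normalized dot product."""
    c = np.dot(limb, central) / (np.linalg.norm(limb) * np.linalg.norm(central))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

# Assumed choice of eight limb bones: the upper and lower segment of
# each arm and each leg.
LIMB_BONES = [
    ("l_shoulder", "l_elbow"), ("l_elbow", "l_wrist"),
    ("r_shoulder", "r_elbow"), ("r_elbow", "r_wrist"),
    ("l_hip", "l_knee"), ("l_knee", "l_ankle"),
    ("r_hip", "r_knee"), ("r_knee", "r_ankle"),
]

def pose_octet(pos: dict[str, np.ndarray]) -> np.ndarray:
    """Eight-tuple F_i describing one posture: the angle of each of
    the eight limb bones against the central (root-to-chest) bone."""
    central = pos["chest"] - pos["root"]
    return np.array([bone_angle(pos[b] - pos[a], central)
                     for a, b in LIMB_BONES])
```

Applying pose_octet to each of the 100 captured frames yields the octet sequence Fi.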

[0091] (4) The user specifies the number n of key frames...
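
The layered curve reduction selection itself is not detailed on this page. A plausible top-down sketch treats the octet sequence as a curve in eight dimensions and repeatedly splits the segment whose interior frame deviates most from the straight chord between the segment's endpoints, stopping once n frames are kept. In the patent the selection draws from the candidate "border" frames; for brevity this sketch runs over all frames:

```python
import heapq
import numpy as np

def _max_deviation(curve: np.ndarray, lo: int, hi: int) -> tuple[float, int]:
    """Largest distance between curve[lo..hi] and the straight chord
    joining its two endpoint frames."""
    if hi - lo < 2:
        return 0.0, lo
    t = (np.arange(lo, hi + 1) - lo) / (hi - lo)
    chord = curve[lo] + t[:, None] * (curve[hi] - curve[lo])
    dev = np.linalg.norm(curve[lo:hi + 1] - chord, axis=1)
    i = int(np.argmax(dev))
    return float(dev[i]), lo + i

def select_key_frames(curve: np.ndarray, n: int) -> list[int]:
    """Keep the two endpoints, then repeatedly add the frame that
    deviates most from the current piecewise-linear approximation,
    until n frames are kept. curve has shape (num_frames, 8)."""
    last = len(curve) - 1
    keep = {0, last}
    heap = []
    d, i = _max_deviation(curve, 0, last)
    heapq.heappush(heap, (-d, 0, last, i))
    while len(keep) < n and heap:
        neg_d, lo, hi, i = heapq.heappop(heap)
        if -neg_d <= 0.0:
            break  # remaining segments are already straight lines
        keep.add(i)
        for a, b in ((lo, i), (i, hi)):
            d, j = _max_deviation(curve, a, b)
            heapq.heappush(heap, (-d, a, b, j))
    return sorted(keep)
```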

Embodiment 2

[0096] As shown in Figure 5, an example of extracting different numbers of key frames from a piece of stair-climbing motion data is given. The specific steps of this example, implemented according to the method of the present invention, are described in detail below:

[0097] (1) The input is a stair-climbing motion sequence 85 frames in length obtained by an optical motion capture system; the data file is the original motion data in TRC format;

[0098] (2) Use an existing data format conversion method to convert the TRC-format motion data into rotational motion data that meets the requirements of the present invention, with 16 standard articulation points, namely the BVH data format with 16 designated articulation points;

[0099] (3) Based on the BVH-format data obtained in step (2), use formula (1) of the claims to calculate the included angle between the limb skeleton and the central skeleton, forming the octet sequence Fi that represents the human posture...
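
The candidate key-frame step that follows is only summarized in the abstract: possible "border" postures are located from the motion trajectory of the inter-bone angles. One minimal reading, sketched below under that assumption, collects every frame at which any of the eight angle curves attains a local extremum:

```python
import numpy as np

def candidate_frames(octets: np.ndarray) -> list[int]:
    """Frames where any of the eight bone-angle curves reaches a local
    extremum (the slope changes sign), plus the first and last frame."""
    n = octets.shape[0]
    cands = {0, n - 1}
    for k in range(octets.shape[1]):
        c = octets[:, k]
        for i in range(1, n - 1):
            if (c[i] - c[i - 1]) * (c[i + 1] - c[i]) < 0:
                cands.add(i)
    return sorted(cands)
```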



Abstract

The invention discloses an automatic key frame extraction method for human motion data. For data representation, it uses the separation angles between the human limb bones and the central bone as the motion characterization, and tokenizes the three-dimensional human motion data accordingly. It then takes the possible "border" postures, identified from the motion trajectory of the inter-bone angles, as the candidate key frame set. Finally, it selects from the candidate key frames by a layered curve reduction algorithm to obtain the final key frame set. The invention also proposes an adaptive error parameter adjustment method to meet the requirements of different compression rates. The method achieves good key frame extraction and compression for human motion data, and guarantees to some extent the consistency of key frame sets across similar motions.
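
The adaptive error parameter adjustment is likewise not spelled out on this page. A simple way to realize the stated goal of meeting a requested compression rate is to bisect the error tolerance fed to a threshold-driven simplifier until the kept-frame ratio matches the target; the sketch below assumes such a simplify(octets, eps) callable, whose name and signature are hypothetical:

```python
from typing import Callable, Sequence
import numpy as np

def tune_error(
    octets: np.ndarray,
    target_ratio: float,
    simplify: Callable[[np.ndarray, float], Sequence[int]],
    tol: float = 1e-4,
) -> float:
    """Bisect an error threshold eps until the fraction of frames kept
    by simplify(octets, eps) matches the requested compression ratio
    (kept frames / total frames). Larger eps keeps fewer frames."""
    ratio = lambda eps: len(simplify(octets, eps)) / len(octets)
    lo, hi = 0.0, 1.0
    for _ in range(60):  # widen the bracket; bail out if impossible
        if ratio(hi) <= target_ratio:
            break
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ratio(mid) > target_ratio:
            lo = mid  # still too many key frames: loosen the tolerance
        else:
            hi = mid  # few enough: try tightening
    return hi
```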

Description

Technical field

[0001] The invention relates to the field of computer three-dimensional animation technology and multimedia data processing, in particular to a method for extracting key frames of three-dimensional human motion data.

Background technique

[0002] In recent years, with the widespread use of motion capture devices, a large amount of realistic 3D human motion data has been generated, and these data are widely used in computer games, animation generation, medical simulation and other fields. Since human motion is captured at a high sampling frequency, it is very useful to extract key (posture) frames from motion data in order to facilitate compressed storage, retrieval, browsing and further motion editing of a large amount of 3D human motion data.

[0003] Key frame extraction is a technology widely used in the field of video analysis and retrieval, such as video key frame extraction methods based on shot boundaries, color features, motion analysis, and clust...


Application Information

IPC(8): G06F17/30; G06T7/00
Inventor 庄越挺 (Zhuang Yueting), 肖俊 (Xiao Jun), 吴飞 (Wu Fei), 杨涛 (Yang Tao)
Owner ZHEJIANG UNIV