
Intuitive and efficient three-dimensional human motion data retrieval method based on demonstration performance

A human motion data retrieval technology, applied in the field of 3D human motion data retrieval, that addresses problems such as input information being difficult to describe, insufficiently intuitive interaction, and low retrieval efficiency.

Publication Date: 2010-02-10 (Inactive)
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

A disadvantage of the motion-sample approach is that the user may not have a suitable sample available as input.
The present method solves the problems animators face when retrieving motion from massive motion databases: the input information is difficult to describe, not intuitive enough, and retrieval efficiency is low.


Examples


Specific embodiment

[0060] First, preprocess the large-capacity motion database: perform feature extraction and segmentation on each motion clip and construct a motion index. Then carry out intuitive and efficient 3D human motion data retrieval based on demonstration performance according to the process shown in figure 3. Connect the motion sensors, set their global reference coordinate system, and load the standard 3D human skeleton, see figure 1(a). If the user is only interested in local body movements, such as upper-body motion, it suffices to select the five feature nodes of the chest, left elbow, right elbow, left wrist, and right wrist, and to use five motion sensors to capture the upper-body demonstration performance. Specify the correspondence between motion sensors and joint points interactively through the program interface or via an input configuration file, then place the motion sensors on the corresponding parts of the body, see figure 1(b). After pose alignment...
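To make the correspondence step concrete, here is a minimal sketch of loading a sensor-to-joint mapping from a configuration file for the five upper-body feature nodes named above. The file format, the sensor IDs, and the load_sensor_joint_map helper are hypothetical illustrations, not details taken from the patent.

# Hypothetical sketch: binding motion sensors to skeleton joints for
# upper-body demonstration capture. Sensor IDs, joint names, and the
# file format are illustrative assumptions, not from the patent.

# The five upper-body feature nodes named in the embodiment.
UPPER_BODY_NODES = ["chest", "left_elbow", "right_elbow",
                    "left_wrist", "right_wrist"]

def load_sensor_joint_map(config_lines):
    """Parse simple 'sensor_id joint_name' lines, mirroring the patent's
    option of specifying the sensor/joint correspondence via an input
    configuration file."""
    mapping = {}
    for line in config_lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        sensor_id, joint = line.split()
        if joint not in UPPER_BODY_NODES:
            raise ValueError(f"unknown feature node: {joint}")
        mapping[int(sensor_id)] = joint
    return mapping

# Example configuration: five sensors, one per feature node.
config = """
# sensor_id  joint
0  chest
1  left_elbow
2  right_elbow
3  left_wrist
4  right_wrist
""".splitlines()

print(load_sensor_joint_map(config))
# {0: 'chest', 1: 'left_elbow', 2: 'right_elbow', 3: 'left_wrist', 4: 'right_wrist'}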

Abstract

The invention discloses an intuitive and efficient three-dimensional human motion data retrieval method based on demonstration performance. The method comprises the following steps: first, construct a motion index for a large-capacity motion database through subspace partition; second, set an overall reference coordinate system for the motion sensors; third, load a standard three-dimensional human skeleton; fourth, designate the correspondence between the motion sensors and the nodes of the standard three-dimensional human skeleton; fifth, bind the motion sensors to the standard three-dimensional human skeleton through posture alignment; sixth, drive the standard three-dimensional human skeleton with the data acquired by the motion sensors to obtain the demonstration performance motion; seventh, extract features from the demonstration performance motion through subspace partition; eighth, load the motion index and carry out qualitative retrieval; ninth, set parameters and carry out quantitative retrieval based on the qualitative results. The invention solves the problem of how to intuitively and accurately express the creative intention of animators during character animation production, achieving rapid and efficient retrieval of the required motion data.
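Read as a pipeline, the nine steps chain naturally from indexing through capture to two-stage retrieval. Below is a minimal structural sketch in Python; every function name, argument, and return value is a stub invented for illustration, and none of these identifiers come from the patent.

# Hypothetical outline of the nine-step pipeline in the abstract.
# All functions are stubs standing in for stages of the method.

def build_motion_index(database):
    """Step 1: partition the motion database into subspaces and build an index."""
    return {"subspaces": [], "entries": database}

def set_global_reference_frame(sensors):
    """Step 2: set the sensors' overall reference coordinate system."""

def load_standard_skeleton():
    """Step 3: load the standard 3D human skeleton."""
    return {"joints": ["chest", "left_elbow", "right_elbow",
                       "left_wrist", "right_wrist"]}

def bind_sensors(sensors, skeleton, mapping):
    """Steps 4-5: map sensors to skeleton nodes, then bind via posture alignment."""
    return {sid: mapping[sid] for sid in sensors}

def capture_demonstration(sensors, skeleton, binding):
    """Step 6: drive the skeleton with sensor data to get the demonstration motion."""
    return []  # sequence of skeleton poses

def extract_features(motion):
    """Step 7: feature extraction on the demonstration via subspace partition."""
    return []

def qualitative_retrieval(index, features):
    """Step 8: coarse retrieval of candidate clips against the motion index."""
    return []

def quantitative_retrieval(candidates, features, params):
    """Step 9: refine the qualitative results using user-set parameters."""
    return candidates

def retrieve(database, sensors, mapping, params):
    index = build_motion_index(database)
    set_global_reference_frame(sensors)
    skeleton = load_standard_skeleton()
    binding = bind_sensors(sensors, skeleton, mapping)
    motion = capture_demonstration(sensors, skeleton, binding)
    features = extract_features(motion)
    candidates = qualitative_retrieval(index, features)
    return quantitative_retrieval(candidates, features, params)

# Example: five hypothetical sensors mapped to the five upper-body nodes.
results = retrieve(database=[], sensors=[0, 1, 2, 3, 4],
                   mapping={0: "chest", 1: "left_elbow", 2: "right_elbow",
                            3: "left_wrist", 4: "right_wrist"},
                   params={"tolerance": 0.1})

The two-stage design is the point of steps 8 and 9: the qualitative pass narrows a massive database to candidates cheaply, and the quantitative pass ranks only those candidates under user-tuned parameters.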

Description

Technical field

[0001] The invention relates to a retrieval method for three-dimensional human motion data, and in particular to an intuitive and efficient three-dimensional human motion data retrieval method based on demonstration performance.

Background

[0002] Depending on the input information, there are four main approaches to motion retrieval: text, notation language, hand-drawn sketches, and motion samples. Traditional text retrieval uses keyword descriptions as input (such as "first punch and then kick") to search for motion clips carrying these label attributes, but text descriptions are vague and incomplete and suffer from inconsistent subjective understanding, making them unsuitable for searching large-capacity motion databases. Notation language focuses on describing the low-level position and orientation of each joint and can express the content and meaning of motion data precisely [1]. However, this method r...

Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F17/30; G06T7/20; G06T7/246
Inventors: 耿卫东, 梁秀波, 张顺, 李启雷
Owner: ZHEJIANG UNIV