
A Spatial Trajectory Retrieval Method for Human Motion Data

A technology for human motion retrieval, applied in electrical digital data processing, special data processing applications, instruments, etc., addressing problems such as reduced retrieval accuracy, interference from irrelevant upper-limb movements, and loss of physical meaning in low-dimensional feature data.

Inactive Publication Date: 2011-12-28
NANJING UNIV

AI Technical Summary

Problems solved by technology

Methods that apply dimensionality reduction to obtain low-dimensional spatial features for action retrieval face two problems. First, each frame of human motion has a clear physical meaning, and the low-dimensional feature data obtained after dimensionality reduction may lose that meaning. Second, each dimensionality-reduction method is suited to a different low-dimensional spatial structure (manifold), so choosing an appropriate method for a given action type is the main difficulty of this class of methods.
The fourth type of method is the keyframe-based retrieval method described in reference 4 (Liu F, Zhuang Y T, Wu F, et al. 3D motion retrieval with motion index tree). Its effectiveness depends on an effective, unified keyframe extraction algorithm and appropriate parameter settings. Complex movements such as dance contain both overall movements and local details, which makes efficient keyframe extraction difficult.
[0005] In practical applications, a notable feature of human motion data retrieval is that users focus on particular joints. For boxing retrieval, users attend mostly to upper-limb movements; for tap-dance retrieval, attention concentrates on the lower limbs and footwork. That is, for different types of movement, the joints of interest differ, and the motion characteristics of those joints strongly affect retrieval accuracy. For example, in a walking motion the upper limbs may be swinging, stationary, or waving; if similarity is computed over the action features of all joints, these varied upper-body actions will distort the results the user expects.
Among the above methods, those of Kovar L and Liu F use the motion features of all joints of the human body for retrieval, so the user cannot specify joints of interest. Müller M noticed that different motion types correspond to different effective features, but requiring the user to select the appropriate features for action retrieval is an unfriendly interaction that increases the complexity of the user's operation. A more reasonable approach is for the user to submit the joints of interest (retrieval joints) together with the retrieval example.
Forbes K also recognized that different joints affect the retrieval result to different degrees, but that method must fix each joint's influence coefficient before low-dimensional feature extraction and cannot change it afterwards, which is neither flexible nor efficient.
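The joint-of-interest idea above can be illustrated with a small sketch: similarity is computed only over the joints the user designates, so irrelevant limbs (the walking example's swinging or waving arms) no longer distort the result. This is an illustrative reconstruction, not the patent's algorithm; the array layout and the per-frame Euclidean distance are assumptions.

```python
import numpy as np

def joint_masked_distance(query, candidate, joint_ids):
    """Distance between two motions using only the user-selected joints.

    query, candidate: arrays of shape (frames, joints, 3) with equal
    frame counts, holding per-frame 3D joint positions.
    joint_ids: indices of the retrieval joints the user cares about.
    """
    q = query[:, joint_ids, :]
    c = candidate[:, joint_ids, :]
    # Mean per-frame Euclidean distance over the selected joints only;
    # unselected joints contribute nothing to the score.
    return float(np.mean(np.linalg.norm(q - c, axis=-1)))
```

With this masking, two walking clips that differ only in arm motion have distance zero when the user designates the leg joints, but a positive distance when the arm joints are included.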
[0006] In short, the difficulty of implementing example-based human action data retrieval is that similar human actions deform in both time and space. Temporally, different performers execute the same action with different timing, which may not be strictly consistent; spatially, similar actions also deform because of differences in the performer's orientation, bone lengths, and manner of execution.
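The temporal deformation described above is the classic motivation for dynamic time warping, which aligns two feature sequences that perform the same action at different speeds. The sketch below is a standard textbook DTW, offered for context rather than taken from the patent.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two per-frame feature
    sequences, tolerant of the timing differences between performers."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            # Extend the cheapest of: match, delete, insert.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])
```

A sequence and a time-stretched copy of it (frames repeated) receive distance zero, which a frame-by-frame comparison could not achieve.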



Examples


Embodiment

[0091] The processing flow of this embodiment is shown in figure 1. The whole method is divided into three main steps: definition of the human motion structure model, generation of the offline motion feature tree, and online motion data retrieval. The main process of each step is described below.

[0092] 1. Definition of human action structure model

[0093] Because the human joint model is hierarchical, the trajectory curve of a joint in three-dimensional space depends on the movement of the joint itself and of all its predecessor joints. For example, the spatial trajectory curve of the left-hand end relative to the chest joint depends on the movements of the left sternoclavicular joint, left shoulder, left elbow, and left wrist as well as of the joint itself. If the spatial trajectory curves of the left-hand ends of two actions are similar, then, since the trajectory of the left-hand end is the cumulative effect of its own and predecess...
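The trajectory "relative to the chest joint" in the paragraph above amounts to expressing a joint's world-space curve in the frame of its substructure's root joint. A minimal sketch, assuming a simple (frames, joints, 3) position array; the function name and layout are illustrative, not from the patent.

```python
import numpy as np

def relative_trajectory(global_positions, joint, root):
    """Spatial trajectory of `joint` expressed relative to the root-node
    joint `root` of its substructure.

    global_positions: (frames, joints, 3) world-space joint positions.
    Subtracting the root per frame removes whole-body translation, so the
    resulting curve reflects only the accumulated motion of the joint
    chain between root and joint.
    """
    return global_positions[:, joint, :] - global_positions[:, root, :]
```

Because the root is subtracted frame by frame, translating the whole skeleton through space leaves the relative trajectory unchanged, which is what makes it usable as a location-invariant motion feature.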



Abstract

The invention discloses a spatial trajectory retrieval method for human motion data, comprising the following steps. Definition of a human motion structure model: the human joint model is hierarchically decomposed by motion correlation to form a human motion structure model comprising five substructures. Generation of an offline motion feature tree: for each motion in a human motion database, the spatial trajectory curve of each joint relative to the root-node joint of its substructure is calculated, and features of that curve are extracted as the joint's motion features; a motion feature tree comprising five motion-feature subtrees is then generated according to the hierarchical definitions of the five substructures in the human motion structure model. Online motion data retrieval: the user submits a retrieval example and designates retrieval joints; a retrieval tree is obtained from the designated retrieval joints and the human motion structure model; feature similarity between the retrieval tree and each motion feature tree is computed level by level, from top to bottom, according to the hierarchical relationship; and the result motions are fed back sorted by final similarity from high to low.
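The abstract's top-down comparison between the retrieval tree and a motion feature tree can be sketched as a recursive traversal. This is an illustrative reconstruction, not the patent's actual scoring rule: the dictionary node layout, the `sim` callback, and the equal 0.5/0.5 weighting of a node against its children are all assumptions.

```python
def tree_similarity(q, m, sim):
    """Top-down similarity between a retrieval tree `q` and a motion
    feature tree `m` with the same shape.

    Each node is {"feature": ..., "children": [...]}; `sim` compares two
    joint features and returns a score in [0, 1]. A node's score blends
    its own feature similarity with the mean score of its children,
    so higher levels of the hierarchy are compared first.
    """
    score = sim(q["feature"], m["feature"])
    if not q["children"]:
        return score
    child = sum(tree_similarity(qc, mc, sim)
                for qc, mc in zip(q["children"], m["children"])) / len(q["children"])
    # Assumed weighting: half the node itself, half its subtree.
    return 0.5 * score + 0.5 * child
```

Ranking the database then reduces to evaluating `tree_similarity` against every motion's feature tree and sorting the scores from high to low.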

Description

Technical field

[0001] The invention relates to a method for retrieving human body motion data, belonging to the fields of computer three-dimensional animation and multimedia data processing, and specifically to a spatial trajectory retrieval method for human body motion data.

Background technique

[0002] Motion capture technology can accurately measure, track, and record the trajectories of objects in three-dimensional space. The technology originated in the late 1970s and, after decades of development, has become increasingly mature. Motion capture is now an important data-acquisition method for computer animation, virtual reality, computer vision, and biomedicine. Driven by the urgent needs of these applications and the widespread availability of commercial capture equipment, large-scale 3D human motion libraries have proliferated, such as the human motion library of Carnegie Mellon University (http://mocap....

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30
Inventor: 孙正兴 (Sun Zhengxing), 陈松乐 (Chen Songle), 周杰 (Zhou Jie), 项建华 (Xiang Jianhua)
Owner: NANJING UNIV