Behavior segmentation method based on human body motion capture data character string representation

A technology in the field of human motion capture data processing that addresses the inability of existing methods to identify identical behaviors within a sequence to be segmented, achieving good accuracy, superior effectiveness, and applicability that meets practical needs.

Active Publication Date: 2015-11-11
北交智轨(北京)科技有限公司

AI Technical Summary

Problems solved by technology

However, this method cannot identify identical behaviors contained in the sequence to be segmented, which causes considerable inconvenience in practical applications.
Zhou et al. of Carnegie Mellon University (F. Zhou, F. Torre and J. K. Hodgins. Hierarchical aligned cluster analysis for temporal clustering of human motion. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, pp. 582-596) achieve behavior segmentation of human motion capture data through hierarchical aligned cluster analysis (HACA). Their method converts behavior segmentation into an energy minimization problem and solves it with a dynamic programming algorithm. It achieves high segmentation accuracy, but it requires the user to specify in advance the number of behaviors in the sequence and the number of clusters used in the temporal reduction.
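The specifics of HACA are not given here; purely as an illustration of the general idea of casting segmentation as energy minimization solved by dynamic programming, the following sketch splits a sequence into a user-specified number of contiguous segments by minimizing total within-segment variance. The names and the cost function are ours, not Zhou et al.'s.

```python
import numpy as np

def dp_segment(frames, k):
    """Split a sequence into k contiguous segments minimizing total
    within-segment variance via dynamic programming.

    Illustrative only: a generic energy-minimization segmentation in the
    spirit described above, not the HACA algorithm itself.
    """
    frames = np.asarray(frames, dtype=float)
    n = len(frames)

    def cost(i, j):
        # within-segment energy of frames[i:j]
        seg = frames[i:j]
        return float(np.sum((seg - seg.mean(axis=0)) ** 2))

    INF = float("inf")
    dp = np.full((k + 1, n + 1), INF)      # dp[m][j]: best energy for first j frames in m segments
    cut = np.zeros((k + 1, n + 1), dtype=int)
    dp[0][0] = 0.0
    for m in range(1, k + 1):
        for j in range(m, n + 1):
            for i in range(m - 1, j):
                e = dp[m - 1][i] + cost(i, j)
                if e < dp[m][j]:
                    dp[m][j], cut[m][j] = e, i

    # backtrack the segment boundaries
    bounds, j = [], n
    for m in range(k, 0, -1):
        i = cut[m][j]
        bounds.append((i, j))
        j = i
    return bounds[::-1]
```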

Method used



Examples


Embodiment Construction

[0052] In order to illustrate the present invention more clearly, the present invention will be further described below in conjunction with preferred embodiments and accompanying drawings. Similar parts in the figures are denoted by the same reference numerals. Those skilled in the art should understand that the content specifically described below is illustrative rather than restrictive, and should not limit the protection scope of the present invention.

[0053] As shown in Figure 1, the behavior segmentation method based on the character string representation of human body motion capture data provided by this embodiment includes the following steps:

[0054] S1. Treat the human body motion capture data as multiple high-dimensional discrete data points, and calculate the Euclidean distance between each pair of data points;
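A minimal sketch of step S1, assuming each motion-capture frame has been flattened into one row of a NumPy array; the names `frames` and `pairwise_euclidean` are illustrative, not from the patent.

```python
import numpy as np

def pairwise_euclidean(frames):
    """Compute the Euclidean distance between every pair of frames.

    `frames` is an (n, d) array: each frame is treated as one high-dimensional
    point (e.g. flattened joint positions or angles). Returns an (n, n)
    symmetric distance matrix.
    """
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, computed for all pairs at once
    sq_norms = np.sum(frames ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * frames @ frames.T
    np.maximum(sq_dists, 0.0, out=sq_dists)  # guard against tiny negative round-off
    return np.sqrt(sq_dists)
```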

[0055] S2. Perform clustering based on the local density and relative distance of each data point to obtain the class to which each data point belongs, and use different characters to represent different classes;
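Step S2's combination of local density and relative distance matches the density-peaks clustering idea of Rodriguez and Laio; the sketch below is one plausible reading of the step, not the patent's exact procedure. The cutoff distance `dc`, the cluster count `n_clusters`, and the character mapping are assumed inputs.

```python
import string
import numpy as np

def density_peak_labels(dist, dc, n_clusters):
    """Cluster points from a distance matrix using local density (rho) and
    relative distance (delta), then map cluster ids to characters.

    Hedged sketch: centers are the n_clusters points with the largest
    rho * delta product; every other point inherits the label of its
    nearest neighbor of higher density.
    """
    n = dist.shape[0]
    rho = np.sum(np.exp(-(dist / dc) ** 2), axis=1) - 1.0   # Gaussian-kernel local density
    order = np.argsort(-rho)                                # indices from high to low density

    delta = np.full(n, np.inf)
    nearest_higher = np.full(n, -1)
    for rank, i in enumerate(order):
        if rank == 0:
            delta[i] = dist[i].max()                        # densest point: use max distance
            continue
        higher = order[:rank]                               # all points denser than i
        j = higher[np.argmin(dist[i, higher])]
        delta[i] = dist[i, j]
        nearest_higher[i] = j

    centers = np.argsort(-(rho * delta))[:n_clusters]       # density peaks
    labels = np.full(n, -1, dtype=int)
    labels[centers] = np.arange(n_clusters)
    if labels[order[0]] == -1:                              # densest point not chosen as a center
        labels[order[0]] = 0                                # fall back to the first cluster
    for i in order:                                         # assign in decreasing density order
        if labels[i] == -1:
            labels[i] = labels[nearest_higher[i]]
    return [string.ascii_uppercase[l % 26] for l in labels]
```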



Abstract

The invention discloses a behavior segmentation method based on a character string representation of human body motion capture data. The method comprises the following steps: S1, treating the human body motion capture data as multiple high-dimensional discrete data points and calculating the Euclidean distance between each pair of data points; S2, clustering the data points based on their local density and relative distance to obtain the class to which each data point belongs, and representing different classes with different characters; S3, ordering the characters according to the time sequence of their corresponding data to obtain a character string, and merging identical characters that are adjacent in time into character groups, each character group forming a behavior string; and S4, segmenting the overall behavior formed by the human body motion capture data according to the behavior strings, and extracting the motion period of each segmented single behavior. The technical scheme of the invention has good accuracy, excellent applicability and effectiveness, and is unsupervised.
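As one way to picture steps S3 and S4, the hedged sketch below merges temporally adjacent identical characters into behavior strings and returns the frame range of each resulting single behavior; period extraction is not shown, and the function name is illustrative rather than taken from the patent.

```python
from itertools import groupby

def to_behavior_segments(labels):
    """Merge temporally adjacent identical characters into behavior strings.

    `labels` is the per-frame character sequence in time order (step S3).
    Returns a list of (character, start_frame, end_frame) segments, i.e. the
    cuts of the overall motion into single behaviors (step S4).
    """
    segments, start = [], 0
    for char, run in groupby(labels):
        length = len(list(run))
        segments.append((char, start, start + length - 1))
        start += length
    return segments

# Example: frames labelled A,A,A,B,B,A,A -> [('A', 0, 2), ('B', 3, 4), ('A', 5, 6)]
print(to_behavior_segments(list("AAABBAA")))
```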

Description

Technical Field

[0001] The invention relates to the processing of human body motion capture data in computer animation, and more specifically to a behavior segmentation method based on a character string representation of human motion capture data.

Background Technique

[0002] Computer animation is the product of the combination of computer graphics and art. With the rapid development of computer graphics technology and computer software and hardware, computer animation has been widely used in many fields such as film and television special effects, 3D games, commercial advertising, and computer simulation.

[0003] In recent years, with the continuous development of hardware technology and the reduction of cost, motion capture systems have gradually become popular, and optical three-dimensional human motion capture has developed into an important means of acquiring human motion information, with large-scale human body moti...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20; G06F17/30
CPC: G06F16/5846
Inventors: 刘渭滨, 魏汝翔, 邢薇薇
Owner: 北交智轨(北京)科技有限公司