
Motion synthesizing and editing method based on motion capture data in computer bone animation

A technology involving motion capture and skeletal animation, applied in computing, animation production, image data processing, etc. It addresses problems such as the difficulty of extending existing models to new data sets, and achieves strong expressive power.

Inactive Publication Date: 2013-02-27
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

However, these models have their own limitations: they are either difficult to extend to other data sets, or hard to adjust to handle similar problems.



Examples


Embodiment Construction

[0034] The present invention is described in further detail below with reference to the accompanying drawings and an example:

[0035] The implementation of the present invention comprises four main steps: (1) motion data preprocessing and prior-information labeling; (2) defining a random process and specifying the kernel function of each factor; (3) constructing an objective function and solving for the unknown parameters to obtain a generative model; and (4) using the generative model to perform motion synthesis and editing. Figure 1 shows a schematic diagram of the overall process of the present invention.
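The four steps above can be sketched as a minimal, runnable pipeline. Every function here is a hypothetical stand-in for a stage of the method, not the patent's actual implementation; in particular, the trivial per-label mean model replaces the multi-factor Gaussian process that the patent fits by MAP estimation with scaled conjugate gradient.

```python
import numpy as np

def preprocess(clips):
    # Step 1a: compute a feature vector per frame (here: the raw pose)
    return [np.asarray(c, dtype=float) for c in clips]

def label_priors(clips):
    # Step 1b: attach prior labels (e.g. a style index per clip)
    return list(range(len(clips)))

def fit_generative_model(features, labels):
    # Steps 2-3: the patent fits a multi-factor Gaussian process by
    # maximizing a MAP objective with scaled conjugate gradient; here
    # we just store the mean pose per label as a trivial "model".
    return {lab: f.mean(axis=0) for f, lab in zip(features, labels)}

def synthesize(model, style_a, style_b, alpha):
    # Step 4: edit latent factors -- e.g. interpolate between two
    # learned styles with weight alpha.
    return (1 - alpha) * model[style_a] + alpha * model[style_b]

# Toy data set: two short clips, 3 degrees of freedom per frame
clips = [np.ones((5, 3)), 3 * np.ones((4, 3))]
model = fit_generative_model(preprocess(clips), label_priors(clips))
print(synthesize(model, 0, 1, 0.5))  # midway between the two styles
```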

[0036] Step 1: Motion data preprocessing and prior information labeling:

[0037] The first stage: motion data preprocessing:

[0038] The main task of the motion data preprocessing stage is to compute the feature vector corresponding to each frame of the motion data. Suppose the given training motion data set is Q = {Q_j | j = 1, 2, ..., J}, where J is the total numb...
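The per-frame feature computation might look like the following sketch. The patent does not specify the exact features, so the choice here (the frame's joint-angle vector stacked with its finite-difference velocity) is purely illustrative.

```python
import numpy as np

def frame_features(motions):
    """Compute a feature vector for each frame of each motion clip.

    `motions` is a list of J clips Q_j, each an array of shape
    (num_frames, num_dof) holding joint angles per frame. The feature
    vector here is the pose stacked with its per-frame velocity; the
    actual features used by the method are an assumption.
    """
    features = []
    for clip in motions:
        # Finite-difference velocity; prepending the first frame makes
        # the first velocity zero and keeps the frame count unchanged.
        velocity = np.diff(clip, axis=0, prepend=clip[:1])
        features.append(np.hstack([clip, velocity]))  # (frames, 2*dof)
    return features

# Toy data set Q = {Q_j | j = 1, ..., J} with J = 2 clips
Q = [np.random.rand(10, 3), np.random.rand(8, 3)]
F = frame_features(Q)
print([f.shape for f in F])  # [(10, 6), (8, 6)]
```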



Abstract

The invention discloses a motion synthesis and editing method based on motion capture data in computer skeletal animation. The method comprises the following steps: preprocessing the given motion data set and labeling the prior information required to construct the generative model; defining a multi-factor Gaussian process over the labeled information to model the motion data set; constructing an objective function based on maximum a posteriori (MAP) estimation for the model and solving for the unknown parameters with the scaled conjugate gradient (SCG) optimization algorithm to obtain the generative model; and finally, editing the latent-variable factors and using the generative model to realize various motion synthesis and editing operations, such as style transfer, style interpolation, and motion retargeting. The method models a small set of motion data with a multi-factor Gaussian process to obtain a generative model, which then supports a series of interactive and intuitive motion editing methods.
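The core modeling idea in the abstract, a "multi-factor Gaussian process", is commonly realized as a covariance that multiplies one kernel per latent factor, so two frames are similar only when all of their factors agree. The sketch below illustrates that construction; the kernel choice (RBF) and the factor names are assumptions, not details from the patent.

```python
import numpy as np

def rbf(x, y, length_scale=1.0):
    """Squared-exponential (RBF) kernel between two factor vectors."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-0.5 * np.dot(d, d) / length_scale**2)

def multifactor_kernel(factors_i, factors_j):
    """Covariance between two frames as a product of per-factor kernels.

    Each frame is described by several latent factors (e.g. style and
    pose/content); multiplying one kernel per factor means the combined
    covariance is high only when every factor matches.
    """
    k = 1.0
    for f_i, f_j in zip(factors_i, factors_j):
        k *= rbf(f_i, f_j)
    return k

# Two frames sharing the same style factor but different content factors
frame_a = [np.array([1.0, 0.0]), np.array([0.2])]  # [style, content]
frame_b = [np.array([1.0, 0.0]), np.array([0.9])]
print(multifactor_kernel(frame_a, frame_b))  # less than 1: content differs
```

Editing a latent factor (e.g. swapping the style vector while keeping the content vector) then changes which training frames the generative model draws on, which is what enables operations like style transfer and interpolation.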

Description

Technical field

[0001] The invention belongs to the technical field of computer virtual reality, and in particular relates to a motion synthesis and editing method based on motion capture data in computer skeletal animation.

Background

[0002] The movement of virtual characters greatly increases the sense of realism and immersion of a virtual scene, and is a classic research topic in virtual reality and computer animation. Methods based on keyframes, kinematics, and dynamics have been used to synthesize the motion of virtual characters. In recent years, as motion capture equipment has matured and become practical, it has become possible to obtain large amounts of realistic motion data. However, due to the complexity and variability of human motion, capturing all human motion is unrealistic. In addition, the high cost and poor portability of motion capture equipment also limit th...

Claims


Application Information

IPC IPC(8): G06T13/00G06T7/20
Inventor 梁晓辉 (Liang Xiaohui), 王剑 (Wang Jian), 郭承禹 (Guo Chengyu)
Owner BEIHANG UNIV