
Human motion synthesis method based on a convolutional neural network

A convolutional-neural-network-based motion synthesis method in the field of computer animation, which addresses the inflexibility of captured motion data and achieves the effect of avoiding overfitting.

Pending Publication Date: 2019-04-05
DALIAN UNIV


Problems solved by technology

However, because captured motion data lacks flexibility, synthesizing a new motion from it in order to adapt to a new environment is difficult.




Embodiment Construction

[0036] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0037] As shown in Figure 2, a human motion synthesis method based on a convolutional neural network comprises two stages: training the model and testing the model.

[0038] Steps to train the model:

[0039] S1: The collected motion capture data is input into the network for training;

[0040] S2: Establish a three-layer convolutional neural network model to realize the regression between high-level parameters and character movement, superimpose an autoencoder network model on the three-layer convolutional neural network, and encode, decode, and train the input motion data;

[0041] S3: Add position constraints, bone length constraints, and trajectory constraints to the hidden unit of the convolutional autoencoder;

[0042] S4: Before the output of the three-layer convolutional neural network, add a character style constraint network, and train the position constraint, bone length constraint, and trajectory constraint together with the style network extraction.
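The encode/decode data flow of the convolutional autoencoder in step S2 can be illustrated with a minimal, untrained sketch. Everything below (layer sizes, kernel width, the random toy motion clip, and the plain-NumPy convolution) is an illustrative assumption and is not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """Valid 1D convolution along the time axis.
    x: (channels_in, frames); w: (channels_out, channels_in, kernel)."""
    c_out, c_in, k = w.shape
    frames = x.shape[1] - k + 1
    out = np.zeros((c_out, frames))
    for t in range(frames):
        # Contract the (channels_in, kernel) window against every filter.
        out[:, t] = np.tensordot(w, x[:, t:t + k], axes=([1, 2], [0, 1]))
    return out

def relu(x):
    return np.maximum(x, 0.0)

# Toy motion clip: 66 channels (e.g. 22 joints x 3 coordinates), 64 frames.
motion = rng.standard_normal((66, 64))

# Encoder: one convolutional layer mapping 66 channels to 32 hidden units.
w_enc = rng.standard_normal((32, 66, 5)) * 0.01
hidden = relu(conv1d(motion, w_enc))   # shape (32, 60)

# Decoder: map the hidden representation back toward the motion space.
w_dec = rng.standard_normal((66, 32, 5)) * 0.01
recon = conv1d(hidden, w_dec)          # shape (66, 56)

# Reconstruction loss against the correspondingly cropped input window.
crop = motion[:, 4:4 + recon.shape[1]]
loss = float(np.mean((recon - crop) ** 2))
print(hidden.shape, recon.shape)
```

In the actual method the weights would be learned by minimizing this reconstruction loss jointly with the constraints of steps S3 and S4; this sketch only shows the shape of the encode/decode pipeline.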


Abstract

The invention relates to a human motion synthesis method based on a convolutional neural network. The method comprises training a model and testing the model. Training the model comprises: inputting collected motion capture data into the network for training; establishing a three-layer convolutional neural network model, superimposing an autoencoder network model on it, and encoding, decoding, and training the input motion data; adding a position constraint, a bone length constraint, and a trajectory constraint to the hidden unit of the convolutional autoencoder; and, before the output of the three-layer convolutional neural network, adding a character style constraint network and training the position, bone length, and trajectory constraints together with the style network extraction. Testing the model comprises inputting test motion data into the trained model to verify whether it can synthesize motion. The method places no requirements on the data set and can be extended to large data sets for parallel processing.
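The bone length constraint described above can be sketched as a penalty on how far each bone's per-frame length drifts from its rest length. The joint layout, edge list, and numeric values below are hypothetical illustrations, not values from the patent:

```python
import numpy as np

def bone_length_loss(positions, bones, rest_lengths):
    """Mean squared deviation of bone lengths from their rest lengths.
    positions: (frames, joints, 3) joint positions;
    bones: list of (parent, child) joint-index pairs;
    rest_lengths: reference length for each bone."""
    losses = []
    for (p, c), ref in zip(bones, rest_lengths):
        # Per-frame Euclidean length of this bone.
        lengths = np.linalg.norm(positions[:, c] - positions[:, p], axis=1)
        losses.append(np.mean((lengths - ref) ** 2))
    return float(np.mean(losses))

# Toy 2-frame, 3-joint chain whose rest bones have unit length; the second
# frame stretches the first bone slightly, which the penalty picks up.
pos = np.array([
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]],
    [[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [2.1, 0.0, 0.0]],
])
bones = [(0, 1), (1, 2)]
rest = np.array([1.0, 1.0])

print(bone_length_loss(pos, bones, rest))
```

The position and trajectory constraints would take a similar form: differentiable penalties on the decoded motion that are added to the reconstruction loss during training.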

Description

Technical field

[0001] The invention belongs to the field of computer animation, and in particular relates to a human motion synthesis method based on a convolutional neural network.

Background technique

[0002] Human motion synthesis technology integrates science and art, reality and abstraction; it is a comprehensive, challenging frontier subject. At present, due to the complexity of the characters themselves and the constraints imposed by the development of related disciplines, many problems in human motion synthesis remain to be solved. At the same time, with the continuous development of motion capture technology in recent years, many new methods have emerged.

[0003] Motion capture technology records the movement trajectories of characters in three-dimensional space, chiefly their skeletal movements; these characters are then displayed on the computer screen by analyzing the motion data. Capturing the real motion d...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T13/40; G06N3/04
CPC: G06T13/40; G06N3/045; Y02T10/40
Inventor: 周东生, 封心珠, 刘瑞, 易鹏飞, 张强, 杨鑫, 魏小鹏
Owner: DALIAN UNIV