
Automatic skin covering method and device for character grid model based on neural network

A technology combining neural networks and mesh models, applied to biological neural network models, neural learning methods, neural architectures, etc. It addresses the problem of unsatisfactory deformation in joint areas and achieves the effect of improved deformation quality.

Active Publication Date: 2021-08-10
PEKING UNIV

AI Technical Summary

Problems solved by technology

[0003] However, animating 3D characters from motion capture data is a complex and difficult skill that animators spend years trying to master.
In addition, deformations for different movements, such as elbow bending and squatting, are subject to the inherent limitations of the commonly used skinning technique (Linear Blend Skinning), and the resulting deformation in the joint areas is very unsatisfactory.
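
For context, Linear Blend Skinning deforms every vertex with a weighted average of bone transforms; the minimal numpy sketch below (names and shapes are illustrative, not taken from the patent) shows where the joint artifacts come from:

    import numpy as np

    def linear_blend_skinning(vertices, weights, bone_transforms):
        """Vanilla LBS: v'_i = sum_j w_ij * T_j * v_i in homogeneous coords.

        vertices:        (V, 3) rest-pose vertex positions
        weights:         (V, J) per-vertex skinning weights, rows sum to 1
        bone_transforms: (J, 4, 4) current bone transforms
        """
        V = vertices.shape[0]
        v_h = np.concatenate([vertices, np.ones((V, 1))], axis=1)     # (V, 4)
        # Linearly averaging rigid transforms per vertex is exactly what
        # causes the well-known volume collapse and "candy-wrapper"
        # artifacts near bent joints such as elbows and knees.
        blended = np.einsum('vj,jab->vab', weights, bone_transforms)  # (V, 4, 4)
        deformed = np.einsum('vab,vb->va', blended, v_h)              # (V, 4)
        return deformed[:, :3]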



Examples


Embodiment 1

[0039] This embodiment implements a neural network-based automatic skinning method for a character mesh model, as shown in figure 1, comprising the following steps:

[0040] S1. Train the neural network using the 3D character mesh and its corresponding motion parameters;

[0041] S2. Input the 3D character mesh to be animated into the trained neural network;

[0042] S3. The trained neural network generates the skin binding weights, the skeleton, and the pose-dependent deformation correction bases;

[0043] S4. Process the pose-dependent deformation correction bases with fusion coefficients to generate the deformation animation.

[0044] Preferably, the motion parameters comprise joint rotations.

[0045] Specifically, the neural network comprises a mesh convolutional neural network, a bone-aware convolutional neural network, and a multi-layer perceptron.

[0046] Preferably, the 3D character mesh to be animated is in a T-pose.

[0047] Preferably, there are nine posture-...
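
Putting steps S1 to S4 together, a hedged PyTorch-style sketch of the rigging pass follows; every module and name is invented for illustration (the patent discloses no code), with K = 9 correction bases assumed from the "nine posture-" hint in [0047]:

    import torch
    import torch.nn as nn

    J, K, V = 24, 9, 1024    # assumed joint count, correction bases, vertex count

    # Placeholder stand-ins for the patent's networks; real mesh convolutions
    # would exploit the mesh connectivity, which these plain MLPs ignore.
    weight_net = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, J))
    bone_net   = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3 * J))
    basis_net  = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3 * K))

    def rig(t_pose_verts):                                   # steps S2 + S3
        w = torch.softmax(weight_net(t_pose_verts), dim=-1)  # (V, J) skin weights
        joints = bone_net(t_pose_verts).mean(0).view(J, 3)   # (J, 3) skeleton
        bases = basis_net(t_pose_verts).view(-1, K, 3)       # (V, K, 3) corrections
        return w, joints, bases

    def correct(t_pose_verts, bases, fusion_coeffs):         # step S4, before LBS
        delta = torch.einsum('k,vkc->vc', fusion_coeffs, bases)
        return t_pose_verts + delta                          # corrected rest shape

    verts = torch.randn(V, 3)
    w, joints, bases = rig(verts)
    corrected = correct(verts, bases, torch.rand(K))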

Embodiment 2

[0057] This embodiment implements a neural network-based automatic skinning method for a character mesh model, including the following steps:

[0058] Train the neural network using the 3D character mesh and its corresponding motion parameters;

[0059] Input the 3D character mesh that needs to be animated into the trained neural network;

[0060] The trained neural network generates skin binding weights, bones, and pose-dependent deformation correction bases;

[0061] The pose-dependent deformation correction bases are processed with fusion coefficients to generate the deformation animation.

[0062] Training the neural network involves two branches, as shown in figure 2: a wrap deformation branch and a residual deformation branch. The wrap deformation branch predicts the corresponding skin rig weights and bones and deforms the input model using a differentiable deformation. At the same time, the residual deformation br...
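
To make the two branches concrete, here is a self-contained sketch under the same caveats (placeholder networks and invented names; only the branch structure follows the paragraph above):

    import torch
    import torch.nn as nn

    def differentiable_lbs(verts, weights, transforms):
        """Torch LBS so gradients flow back into the weight network."""
        v_h = torch.cat([verts, torch.ones(len(verts), 1)], dim=1)   # (V, 4)
        blended = torch.einsum('vj,jab->vab', weights, transforms)   # (V, 4, 4)
        return torch.einsum('vab,vb->va', blended, v_h)[:, :3]

    V, J = 1024, 24                      # illustrative sizes
    weight_net   = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, J))
    residual_net = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3))

    def two_branch_forward(t_pose_verts, pose_transforms):
        # Wrap deformation branch: predicted weights drive a differentiable LBS.
        w = torch.softmax(weight_net(t_pose_verts), dim=-1)
        wrapped = differentiable_lbs(t_pose_verts, w, pose_transforms)
        # Residual deformation branch: per-vertex correction on top of the wrap.
        return wrapped + residual_net(t_pose_verts)

    verts = torch.randn(V, 3)
    transforms = torch.eye(4).expand(J, 4, 4)
    pred = two_branch_forward(verts, transforms)
    # Training would minimize e.g. ((pred - ground_truth_verts) ** 2).mean()
    # against the captured posed meshes.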

Embodiment 3

[0065] This embodiment implements a neural network-based automatic skinning method for a character mesh model, including the following steps:

[0066] Train the neural network using the 3D character mesh and its corresponding motion parameters;

[0067] Input the 3D character mesh that needs to be animated into the trained neural network;

[0068] The trained neural network generates skin binding weights, bones, and pose-dependent deformation correction bases;

[0069] The pose-dependent deformation correction bases are processed with fusion coefficients to generate the deformation animation.

[0070] The 3D character mesh to be animated is in a T-pose. Specifically, the T-pose character mesh is input into both a first mesh convolutional neural network and a second mesh convolutional neural network; the first mesh convolutional neural network generates the skin binding weights, and the second mesh convolutional neural network calculates ...
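
A "mesh convolutional" layer can be pictured as a graph convolution over the mesh's vertex adjacency. The sketch below is a generic graph-convolution stand-in, not the patent's exact operator; the softmax head mirrors the usual requirement (an assumption here) that each vertex's skin binding weights sum to one:

    import torch
    import torch.nn as nn

    class MeshConv(nn.Module):
        """Minimal mesh convolution: average 1-ring neighbor features,
        concatenate with the vertex's own features, apply a linear map.
        A generic stand-in; the patent's actual operator is not specified."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.lin = nn.Linear(2 * in_dim, out_dim)

        def forward(self, x, adj):
            # x: (V, in_dim) vertex features; adj: (V, V) 0/1 adjacency matrix.
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
            neighbor_avg = adj @ x / deg              # mean over neighbors
            return self.lin(torch.cat([x, neighbor_avg], dim=1))

    # First network's head over a tiny random mesh: per-vertex weights over J joints.
    V, J = 6, 4
    x = torch.randn(V, 3)
    adj = (torch.rand(V, V) > 0.5).float()
    adj = ((adj + adj.T) > 0).float()                 # symmetrize the adjacency
    weights = torch.softmax(MeshConv(3, J)(x, adj), dim=-1)   # rows sum to 1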



Abstract

The invention relates to the technical field of three-dimensional animation, and in particular to a neural network-based automatic skinning method and device for a character mesh model. The method comprises the following steps: training a neural network using a three-dimensional character mesh and its corresponding motion parameters; inputting a three-dimensional character mesh to be animated into the trained neural network; using the trained neural network to generate skin binding weights, a skeleton, and pose-dependent deformation correction bases; and processing the pose-dependent deformation correction bases with fusion coefficients to generate a deformation animation. Because the neural network is trained on motion capture data to generate the skin binding weights and the skeleton, the topology of the generated skeleton can be controlled and the deformation quality of the three-dimensional character mesh is improved, with the greatest gains in the joint areas of the mesh.

Description

technical field

[0001] The present application relates to the technical field of 3D animation, and more specifically to a neural network-based automatic skinning method and device for a character mesh model.

Background technique

[0002] In recent years, in the field of computer graphics, deforming and animating the 3D mesh of a biped character driven by motion capture data has usually required tedious rigging and skin weight binding (skinning).

[0003] However, animating 3D characters from motion capture data is a complex and difficult skill that animators spend years trying to master. In addition, deformations for different movements, such as elbow bending and squatting, are subject to the inherent limitations of the commonly used skinning technique (Linear Blend Skinning), and the resulting deformation in joint areas is unsatisfactory. The details of many deformations need to be further refined, and corrective deformations for specific ...
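
In the generic notation used for this family of methods (the symbols here are mine, not the patent's), the final deformed vertex combines LBS with pose-dependent corrections:

    v_i' = \sum_{j=1}^{J} w_{ij}\, T_j(\theta) \Big( v_i + \sum_{k=1}^{K} c_k(\theta)\, b_{ik} \Big)

where w_ij are the skin binding weights, T_j(θ) the bone transforms for pose θ, b_ik the pose-dependent deformation correction bases, and c_k(θ) the fusion coefficients that blend them.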


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/20; G06T13/20; G06N3/04; G06N3/08
CPC: G06T17/20; G06T13/20; G06N3/08; G06N3/045
Inventors: 李沛卓, 刘利斌, 陈宝权
Owner: PEKING UNIV