
Animation character facial expression generation method and system based on facial expression recognition

A facial expression recognition technology, applied to the acquisition/recognition of facial features, animation production, and neural learning methods; it addresses problems such as limited capture resolution, achieving the effects of improving the user experience, reducing production cost, and improving the capture effect.

Pending Publication Date: 2021-08-13
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

In practical use, however, the limited resolution makes it difficult to capture subtle facial geometry and motion changes, and changes in facial expression are likewise hard to capture.

Examples


Embodiment 1

[0045] A method for generating facial expressions of animated characters based on facial expression recognition, comprising the following steps:

[0046] S1, recognizing facial expressions and animated-character expressions in the face data set and the animation data set through an emotion recognition network, and matching the face pictures with the animation pictures;
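The patent does not publish code for this matching step; the following is a minimal sketch of one way it could be implemented, assuming a pre-trained emotion recognition network exposed here as a hypothetical emotion_net.predict() that returns an emotion class index. The helper names and label set are illustrative, not part of the original disclosure.

```python
import random
from collections import defaultdict

# Hypothetical sketch of step S1: label every face picture and every animation
# picture with an emotion-recognition network, then pair pictures that received
# the same emotion label. `emotion_net` is an assumed, pre-trained classifier.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def label_images(emotion_net, images):
    """Group images by the emotion label predicted by the recognition network."""
    groups = defaultdict(list)
    for img in images:
        label = EMOTIONS[emotion_net.predict(img)]  # assumed: returns a class index
        groups[label].append(img)
    return groups

def match_pairs(face_groups, anim_groups, seed=0):
    """Build (face, animation) training pairs that share the same emotion label."""
    rng = random.Random(seed)
    pairs = []
    for label in EMOTIONS:
        faces, anims = face_groups.get(label, []), anim_groups.get(label, [])
        for face in faces:
            if anims:
                pairs.append((face, rng.choice(anims)))  # random same-label partner
    return pairs
```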

[0047] The face data sets used in this embodiment come from CK+, DISFA, KDEF and MMI. From each data set, frontal face pictures covering the six basic emotions and the neutral label (seven labels in total) are selected. Rotation, zooming and other operations are applied to the pictures under each label for data augmentation, so that the number of pictures under each label is equal; about 10,000 pictures are obtained in total. The illustrations in this embodiment and the pictures showing the experimental results all come from the above data sets.
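As an illustration of the augmentation just described (not the patented implementation), the sketch below uses Pillow to rotate and zoom copies of existing pictures until every emotion label holds the same number of images. The rotation and zoom ranges and the target count are assumed values.

```python
import random
from PIL import Image

def random_augment(img: Image.Image) -> Image.Image:
    """Apply a small random rotation and a random center zoom to one picture."""
    out = img.rotate(random.uniform(-15, 15))        # assumed +/-15 degree range
    zoom = random.uniform(1.0, 1.2)                  # assumed zoom range
    w, h = out.size
    cw, ch = int(w / zoom), int(h / zoom)
    left, top = (w - cw) // 2, (h - ch) // 2
    return out.crop((left, top, left + cw, top + ch)).resize((w, h))

def balance_label(images: list, target: int) -> list:
    """Augment one label's image list until it reaches the target size."""
    balanced = list(images)
    while len(balanced) < target:
        balanced.append(random_augment(random.choice(images)))
    return balanced
```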

[0048] The 3D animation data set used in this embodiment comes from the FERG-3D-DB data set, including four characters of...

Embodiment 2

[0076] Referring to Figure 1 to Figure 5, a facial expression generation system for animated characters based on facial expression recognition includes:

[0077] The data preprocessing module recognizes facial expressions and animated-character expressions in the face data set and the animation data set through a deep convolutional network, and matches the face pictures with the animation pictures;

[0078] The offline training module learns, through deep learning, the mapping relationship between face pictures with the same expression and the character's skeleton parameters;
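The patent does not disclose the network architecture of the offline training module; the sketch below shows one plausible form of it, a small convolutional regressor in PyTorch that maps an expression-matched face picture to a vector of skeleton parameters. The layer sizes and SKELETON_DIM are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

SKELETON_DIM = 50  # assumed number of skeleton parameters per character

class FaceToSkeletonNet(nn.Module):
    """Hypothetical regressor: face image -> character skeleton parameters."""
    def __init__(self, out_dim: int = SKELETON_DIM):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, out_dim)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_step(model, optimizer, faces, skeleton_params):
    """One supervised step: regress skeleton parameters from matched face images."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(faces), skeleton_params)
    loss.backward()
    optimizer.step()
    return loss.item()
```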

[0079] The online generation module first inputs the key frames of the human face into the offline training module to obtain the character's skeleton parameters; then interpolates the obtained skeleton parameters using the relationship between adjacent key frames; then performs three-dimensional reconstruction to obtain the character's motion parameters; and finally combines the geometric information of the face pictures to optimize the skeleton parameters.
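The interpolation between adjacent key frames could, for example, be carried out as in the sketch below, which linearly blends the skeleton parameter vectors of two key frames. The patent states only that adjacent key frames are used for interpolation, so linear blending is an assumption.

```python
import numpy as np

def interpolate_skeleton(params_a: np.ndarray,
                         params_b: np.ndarray,
                         num_inbetween: int) -> np.ndarray:
    """Skeleton parameters for the frames between two key frames (linear blend)."""
    ts = np.linspace(0.0, 1.0, num_inbetween + 2)[1:-1]  # exclude the key frames
    return np.stack([(1.0 - t) * params_a + t * params_b for t in ts])

# Example: 3 in-between frames between key frames with 50 parameters each.
frames = interpolate_skeleton(np.zeros(50), np.ones(50), num_inbetween=3)
print(frames.shape)  # (3, 50)
```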



Abstract

The invention belongs to the technical field of animation production, and discloses an animation character facial expression generation method based on facial expression recognition. The method comprises the following steps: S1, using a face data set and an animation data set, recognizing facial expressions and animation character expressions through an emotion recognition network, and matching the face data pictures with the animation data pictures; S2, obtaining an animation training network by deep learning of the mapping relation between face pictures of the same expression and the character skeleton parameters; S3, for each input frame of the video, using the animation training network to output a skeleton parameter result; and S5, performing three-dimensional reconstruction on the input picture to obtain the motion parameters of a character, and optimizing the skeleton parameters in combination with the geometric information of the face pictures. Correspondingly, the invention also discloses an animation character facial expression generation system based on facial expression recognition. According to the present invention, the geometric features of the human face, such as the mouth and the eyes, are controlled more finely, so that the audience's perception of the character's emotional changes is improved.

Description

Technical Field

[0001] The invention belongs to the technical field of animation production, and in particular relates to a method and system for generating facial expressions of animation characters based on facial expression recognition.

Background Technique

[0002] In facial motion capture, traditional methods such as ARKit use the camera to extract facial geometric information and map it to a 3D model; by learning the mapping relationship between 2D video and 3D model parameters, the parameter information of the 3D model is obtained. In addition, some commercial software such as Faceware can also obtain the parameter information of the 3D model by reconstructing the 2D input image. These methods can effectively extract the geometric information of the face, but it is difficult for the audience to feel the changes in the character's expression. Models such as ExprGen and DeepExpr try to improve on this, but while optimizing the expression information, important facial geometri...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00; G06K 9/62; G06T 13/40; G06T 15/02; G06T 17/00; G06N 3/04; G06N 3/08
CPC: G06T 13/40; G06T 15/02; G06T 17/00; G06N 3/04; G06N 3/08; G06V 40/174; G06V 40/168; G06V 40/172; G06F 18/22; G06F 18/214
Inventor: 潘烨; 张睿思
Owner: SHANGHAI JIAO TONG UNIV