
Interactive three-dimensional human face expression animation editing method and system and extension method

A technology for 3D face and animation editing, applied in animation production, instruments, computing, etc. It addresses the problems of inappropriate interactive editing of face-model control elements, the large amount of offline processing and computation required by existing methods, and unnatural generated expressions.

Active Publication Date: 2017-05-24
SHANDONG UNIV OF FINANCE & ECONOMICS

AI Technical Summary

Problems solved by technology

However, it is challenging to implement such an interactive expression editing tool because: 1) The face model edited by the user often has thousands or even tens of thousands of vertices, while the control vertices and curves manipulated by the user number only a few to at most a dozen; using such low-dimensional control elements to drive the deformation of a high-dimensional model leads to an under-constrained problem, making the generated expressions unnatural or even wrong.
2) Due to lack of experience or limited professional skill, some users' interactive edits to the control elements of the face model may be inappropriate or even wrong. Such inappropriate or wrong input often directly produces expressions that are unnatural or even bizarre, so the editing effect the user expects cannot be achieved.
However, existing methods require a large amount of offline processing and computation.

Method used




Embodiment Construction

[0068] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention.

[0069] Figure 1 is a flowchart of the interactive three-dimensional facial expression animation editing method of the present invention. As shown in the figure, the interactive three-dimensional facial expression animation editing method is performed on the server and specifically includes the following steps:

[0070] Step 1: Map the pixels of the two-dimensional control points, freely designated on the face model by the user and moved to the desired positions, into three-dimensional space to obtain the corresponding three-dimensional control points on the face model.

[0071] Specifically, in order to provide users with an intuitive and convenient interaction method, the present ...
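
The remainder of paragraph [0071] is not shown, but Step 1 is essentially a picking operation: the pixel chosen by the user is unprojected into a viewing ray and intersected with the face mesh. The sketch below is a minimal illustration of one such mapping, assuming a triangle-mesh face model and known view and projection matrices; the function names and the brute-force traversal are illustrative assumptions, not the patented procedure.

```python
# Hypothetical sketch: map a user-clicked 2D pixel onto the 3D face mesh by
# casting a viewing ray and intersecting it with the mesh triangles.
# Camera matrices and mesh layout are illustrative assumptions, not from the patent.
import numpy as np

def unproject_pixel(px, py, depth, view, proj, width, height):
    """Convert a pixel (px, py) at normalized depth to a world-space point."""
    ndc = np.array([2.0 * px / width - 1.0,
                    1.0 - 2.0 * py / height,   # flip y: screen to NDC
                    depth, 1.0])
    world = np.linalg.inv(proj @ view) @ ndc
    return world[:3] / world[3]

def ray_triangle(orig, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection; returns hit distance or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1 @ p
    if abs(det) < eps:
        return None
    inv = 1.0 / det
    s = orig - v0
    u = (s @ p) * inv
    if u < 0 or u > 1:
        return None
    q = np.cross(s, e1)
    v = (direction @ q) * inv
    if v < 0 or u + v > 1:
        return None
    t = (e2 @ q) * inv
    return t if t > eps else None

def pick_control_point(px, py, vertices, faces, view, proj, width, height):
    """Return the 3D point on the face mesh under pixel (px, py), or None."""
    near = unproject_pixel(px, py, -1.0, view, proj, width, height)
    far = unproject_pixel(px, py, 1.0, view, proj, width, height)
    direction = far - near
    direction /= np.linalg.norm(direction)
    best_t = None
    for f in faces:  # brute force; a spatial structure would be used in practice
        t = ray_triangle(near, direction, *vertices[f])
        if t is not None and (best_t is None or t < best_t):
            best_t = t
    return None if best_t is None else near + best_t * direction
```

A real editor would accelerate the intersection test with a spatial data structure (e.g., a BVH) and could snap the hit point to the nearest mesh vertex if vertex-level control points are required.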


Abstract

The invention discloses an interactive three-dimensional human face expression animation editing method and system and an extension method. The three-dimensional human face expression animation editing method comprises the following steps: mapping the pixels of two-dimensional control points on the face model, freely designated by the user and moved to the desired positions, into three-dimensional space to obtain the corresponding three-dimensional control points on the face model; establishing a deformation model of the deformation of the three-dimensional control points caused by the user's editing operations; calculating a correlation coefficient between each vertex of the face model and each three-dimensional control point; dividing the face model into different dynamic regions according to the correlation coefficients, each dynamic region containing one three-dimensional control point; constructing a contribution map of the three-dimensional control points; fusing the deformations of the dynamic regions according to the contribution map and the deformation model; and finally obtaining the overall deformation of the face model. The method has the advantages that the computation is simple, and the deformation of each dynamic region both meets the user's requirements and is accurate and natural.
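
As a rough, non-authoritative illustration of the pipeline the abstract enumerates, the sketch below assumes a Gaussian correlation coefficient based on Euclidean distance, a nearest-control-point partition into dynamic regions, a row-normalized contribution map, and linear fusion of the control-point displacements; the kernel and the blending rule are assumptions chosen for simplicity and are not the patent's exact formulas.

```python
# Illustrative sketch of the abstract's pipeline, under assumed formulas:
# Gaussian correlation between vertices and control points, nearest-control
# region partition, normalized contribution map, and blended deformation.
import numpy as np

def correlation(vertices, controls, sigma=0.1):
    """Correlation coefficient between every vertex and every 3D control point
    (assumed here to be a Gaussian of Euclidean distance)."""
    d = np.linalg.norm(vertices[:, None, :] - controls[None, :, :], axis=-1)
    return np.exp(-(d ** 2) / (2.0 * sigma ** 2))   # shape: (n_vertices, n_controls)

def dynamic_regions(corr):
    """Assign each vertex to the dynamic region of its most correlated control point."""
    return np.argmax(corr, axis=1)

def contribution_map(corr):
    """Normalize correlations so each vertex's contributions over controls sum to 1."""
    return corr / np.clip(corr.sum(axis=1, keepdims=True), 1e-12, None)

def fuse_deformation(vertices, corr, control_displacements):
    """Blend the control-point displacements into the overall face deformation."""
    w = contribution_map(corr)                       # (n_vertices, n_controls)
    return vertices + w @ control_displacements      # (n_vertices, 3)

# Toy usage: three hypothetical control points dragged by the user deform a point set.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    verts = rng.uniform(-1, 1, size=(500, 3))
    ctrls = verts[[10, 200, 400]]
    moves = np.array([[0.05, 0.0, 0.0], [0.0, 0.08, 0.0], [0.0, 0.0, -0.03]])
    corr = correlation(verts, ctrls)
    regions = dynamic_regions(corr)
    deformed = fuse_deformation(verts, corr, moves)
    print(regions[:10], deformed.shape)
```

The toy usage drags three hypothetical control points and prints the resulting region labels and the shape of the deformed vertex array.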

Description

Technical Field

[0001] The invention belongs to the field of computer graphics, and in particular relates to an interactive three-dimensional facial expression animation editing method, system and extension method.

Background Technique

[0002] In recent years, 3D facial expression animation has been widely used in many fields, such as expression generation for virtual characters in animation, games and movies; virtual character animation in remote network conferences and virtual reality; expression simulation in medical cosmetology and face recognition; and virtual teachers in assisted education and virtual hosts and virtual idols in entertainment programs.

[0003] With the wide application of 3D facial expression animation, how to provide users with a simple and convenient interactive editing tool, and to make the facial expression animation generated by editing realistic and natural, has become a research hotspot and a difficult problem of general concern to the academic and indus...

Claims


Application Information

IPC (8): G06T13/40
CPC: G06T13/40
Inventor: 迟静, 张彩明, 高珊珊, 刘慧, 张云峰
Owner: SHANDONG UNIV OF FINANCE & ECONOMICS