Single-photo-based human face animating method

A face animation technology applied in the field of image-based 3D face modeling and animation. It addresses problems such as the inability to process in real time, the large amount of computation, and unrealistic expressions, and achieves realistic eye animation, a strong sense of realism, and good generality.

Status: Inactive
Publication Date: 2012-03-14
Applicant: 北京盛开智联科技有限公司
Cites: 2 | Cited by: 57

AI Technical Summary

Problems solved by technology

The first category comprises expression animation methods based on the MPEG-4 standard, which define the movement paths of feature points for several common expressions. Although simple, their defect is that the generated expressions are not realistic enough.
In order to generate realistic expressions, the second category is based on expression cloning: a 3D face model with expressions is obtained by scanning, and the expressions are mapped onto the target face model. This type of method can generate subtle expressions, but its disadvantage is that every vertex needs to be mapped, and the large amount of computation makes real-time processing impossible.
The third category is based on key-frame interpolation, which is common in film and television production: three-dimensional facial expressions are designed in advance, and transition frames are obtained through interpolation. For each face model, the key-frame expressions need to be redesigned, which requires the cooperation of professional artists.
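
As an illustration of the key-frame interpolation idea described in the third category, a minimal sketch in Python; the linear vertex blend and all names are assumptions for illustration, not the patent's method.

# Hypothetical sketch of key-frame interpolation: transition frames between two
# pre-designed 3D expressions are obtained by linearly blending vertex positions.
import numpy as np

def interpolate_keyframes(expr_a: np.ndarray, expr_b: np.ndarray, n_frames: int) -> list:
    """Blend two (V, 3) vertex arrays into n_frames transition frames."""
    frames = []
    for t in np.linspace(0.0, 1.0, n_frames):
        frames.append((1.0 - t) * expr_a + t * expr_b)  # simple linear blend
    return frames

# Usage: two key-frame expressions of a 5000-vertex face model (dummy data)
neutral = np.zeros((5000, 3))
smile = np.random.default_rng(0).normal(scale=0.01, size=(5000, 3))
transition = interpolate_keyframes(neutral, smile, n_frames=30)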




Embodiment Construction

[0024] The present invention will be described in detail below in conjunction with the accompanying drawings. It should be noted that the described embodiments are only intended to facilitate the understanding of the present invention, rather than limiting it in any way. The present invention is illustrated by the following examples:

[0025] A single frontal face photo is taken as input. The reconstruction result is obtained through face detection, facial key point positioning, face geometric reconstruction, model expansion and related steps; personalized facial animation is then obtained through animation data production, animation data mapping, interpolation based on spherical parameterization, eye motion processing and related steps. The specific implementation process is as follows:
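
A minimal sketch of the overall pipeline described in paragraph [0025]; every function name below is a hypothetical placeholder for the step named in the text, with stub bodies, so only the control flow is shown, not the patent's algorithms.

# Hypothetical pipeline skeleton; each stub stands in for a step from [0025].
def detect_face(photo): ...               # face detection
def locate_keypoints(face): ...           # facial key point positioning (e.g. AAM)
def reconstruct_geometry(keypoints): ...  # face geometric reconstruction
def expand_model(mesh): ...               # model expansion (eyes, teeth, texture)
def map_animation_data(mesh, anim): ...   # animation data mapping
def interpolate_spherical(frames): ...    # interpolation via spherical parameterization
def process_eye_motion(frames): ...       # eye action processing

def animate_from_photo(photo, anim_data):
    face = detect_face(photo)
    keypoints = locate_keypoints(face)
    mesh = reconstruct_geometry(keypoints)
    mesh = expand_model(mesh)
    frames = map_animation_data(mesh, anim_data)
    frames = interpolate_spherical(frames)
    frames = process_eye_motion(frames)
    return frames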

[0026] 1. Establishment of deformation model.

[0027] Use a 3D scanner to collect real 3D face data, perform regularization processing, and perform principal component analysis on the regularized model shape to obtain ...
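
The paragraph above is truncated in the source, but the step it describes (principal component analysis over regularized 3D face shapes) matches standard deformation-model construction. A minimal sketch follows, assuming each scan has been regularized to the same vertex count and ordering; the array names and SVD-based formulation are illustrative, not the patent's notation.

# Sketch: build a PCA deformation model from (N, V, 3) regularized face scans.
import numpy as np

def build_deformation_model(scans: np.ndarray, n_components: int = 50):
    """Return mean shape, PCA shape basis, and per-component standard deviations."""
    n, v, _ = scans.shape
    X = scans.reshape(n, v * 3)              # flatten each mesh to one row
    mean = X.mean(axis=0)                    # mean face shape
    Xc = X - mean
    # SVD of the centered data gives the principal components
    _, singular_values, vt = np.linalg.svd(Xc, full_matrices=False)
    basis = vt[:n_components]                # (n_components, 3V) shape basis
    stdev = singular_values[:n_components] / np.sqrt(max(n - 1, 1))
    return mean, basis, stdev

def synthesize_shape(mean, basis, stdev, coeffs):
    """New face = mean + sum_i coeffs[i] * stdev[i] * basis[i], reshaped to (V, 3)."""
    shape = mean + (coeffs * stdev) @ basis
    return shape.reshape(-1, 3)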



Abstract

The invention discloses a single-photo-based human face animating method, which belongs to the fields of graphics and image processing and computer vision. The method automatically reconstructs a three-dimensional model of a human face from a single frontal face photo and then drives the reconstructed model to produce personalized facial animation. The method uses a human face three-dimensional reconstruction unit and a human face animation unit. The reconstruction unit carries out the following steps: generating a shape deformation model off-line; automatically positioning the key points on the face using an active appearance model; adding eye and tooth meshes to form a complete face model; and obtaining the reconstruction result by texture mapping. The animation unit carries out the following steps: producing animation data for sparsely spaced key points; mapping the animation data onto the target face model using a radial basis function; realizing motion data interpolation using spherical parameterization; and generating eye motion. The method is highly automated, robust and realistic, and is suitable for fields such as film and television production and three-dimensional games.
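
As an illustration of the radial-basis-function mapping step named in the abstract, here is a minimal sketch that spreads displacements defined at sparse key points to all vertices of the target mesh; the Gaussian kernel, the sigma value, and the variable names are assumptions for illustration, not the patent's exact formulation.

# Sketch: RBF mapping of key-point displacements onto a whole face mesh.
import numpy as np

def rbf_map(key_points: np.ndarray, key_displacements: np.ndarray,
            vertices: np.ndarray, sigma: float = 0.05) -> np.ndarray:
    """key_points: (K, 3); key_displacements: (K, 3); vertices: (V, 3)."""
    def gaussian(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    phi = gaussian(key_points, key_points)             # (K, K) kernel matrix
    weights = np.linalg.solve(phi, key_displacements)  # (K, 3) RBF weights
    return gaussian(vertices, key_points) @ weights    # (V, 3) vertex displacements

# Usage: displace every vertex of a target mesh according to key-point motion
rng = np.random.default_rng(1)
keys = rng.uniform(size=(30, 3))
disp = rng.normal(scale=0.01, size=(30, 3))
mesh = rng.uniform(size=(2000, 3))
new_mesh = mesh + rbf_map(keys, disp, mesh)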

Description

Technical field

[0001] The invention relates to the fields of graphics and image processing and computer vision, and in particular to image-based three-dimensional human face modeling and animation methods.

Background technique

[0002] Facial animation based on a single photo refers to synthesizing a person's facial expression animation from a single two-dimensional facial image. It is a research hotspot and a difficult problem in computer graphics, image processing, computer vision and related fields, and has attracted a large number of researchers. Facial animation has broad application prospects, mainly including 3D games, film and television production, human-computer interaction interfaces, telepresence, and education. Methods for facial expression animation based on a single image can be divided into 2D expression animation based on image processing and 3D expression animation based on face modeling. Among them, 2D expression animation based on image processing is suitable for ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T13/40
Inventors: 杜志军, 姚健, 曾祥永, 王阳生
Owner: 北京盛开智联科技有限公司