Method for generating animation based on human face image, and calculating equipment

A face-image animation generation technology in the field of image processing. It addresses problems such as the difficulty of achieving a natural animation effect and the poor blending of the non-face background region with the face region, and it avoids cumbersome operations and image deformation.

Active Publication Date: 2017-11-24
厦门美图宜肤科技有限公司

AI Technical Summary

Problems solved by technology

However, existing 3D face model reconstruction technology mainly considers the face region and ignores non-face regions such as the hair, neck, shoulders and other body parts: typically only the face region is cut out and processed, while the non-face background region is left unchanged. As a result, when the dynamically processed image is composited, the non-face background region and the face region do not blend well.
In particular, when multiple such dynamically processed images are assembled into a video, it is difficult to achieve an overall natural and smooth animation effect.




Embodiment Construction

[0026] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope will be fully conveyed to those skilled in the art.

[0027] Figure 1 is a block diagram of an example computing device 100. In a basic configuration 102, computing device 100 typically includes system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.

[0028] Depending on the desired configuration, processor 104 may be any type of processor, including, but not limited to, a microprocess...



Abstract

The invention discloses a method for generating an animation based on a face image. The method comprises the steps of: extracting face feature points from the face image; generating a three-dimensional face model from the face feature points and a first projection matrix corresponding to the three-dimensional face model; computing a full-image three-dimensional mesh model of the face image from the three-dimensional face model; computing full-image texture coordinates from the full-image three-dimensional mesh model and the first projection matrix; generating a second projection matrix by modifying the parameters of the first projection matrix; projecting the full-image three-dimensional mesh model and the three-dimensional face model with the first and second projection matrices to generate a three-dimensionally reconstructed image; texture-mapping the reconstructed image with the full-image texture coordinates to generate a processed image; repeating the parameter modification, projection and texture mapping steps to obtain a plurality of processed images; and generating an animation from the plurality of processed images. The invention also provides a computing device that executes the above method.
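To make the sequence of steps in the abstract easier to follow, here is a minimal Python sketch of the overall pipeline. All of the helper functions (detect_landmarks, fit_face_model, build_full_image_mesh, perturb_projection, project, texture_map) are hypothetical placeholders supplied by the caller; the patent text does not disclose how these steps are realised, and the way the projection parameters are modified per frame is only one possible choice.

```python
import numpy as np

def generate_face_animation(image, detect_landmarks, fit_face_model,
                            build_full_image_mesh, perturb_projection,
                            project, texture_map, num_frames=30):
    """Sketch of the pipeline in the abstract. Every helper is a
    caller-supplied, hypothetical implementation."""
    # 1. Extract face feature points from the face image.
    landmarks = detect_landmarks(image)                       # (N, 2) points

    # 2. Fit a 3D face model, which also yields the first projection matrix P1.
    face_mesh, P1 = fit_face_model(landmarks)

    # 3. Extend the face model to a full-image 3D mesh that also covers
    #    the non-face background (hair, neck, shoulders, backdrop).
    full_mesh = build_full_image_mesh(image, face_mesh)

    # 4. Full-image texture coordinates: project the full mesh with P1 and
    #    normalise by the image size, so the original image is the texture.
    h, w = image.shape[:2]
    uv = project(full_mesh, P1) / np.array([w, h], dtype=np.float64)

    frames = []
    for t in np.linspace(0.0, 1.0, num_frames):
        # 5. Second projection matrix: modify the parameters of P1 for
        #    this frame (how they are modified is not specified here).
        P2 = perturb_projection(P1, t)

        # 6. Project the full-image mesh and the face model to obtain the
        #    3D-reconstructed geometry. The abstract uses both P1 and P2 at
        #    this step; applying P2 to both meshes is an assumption.
        verts_full = project(full_mesh, P2)
        verts_face = project(face_mesh, P2)

        # 7. Texture-map the reconstruction with the full-image texture
        #    coordinates to produce one processed image.
        frames.append(texture_map(image, verts_full, verts_face, uv))

    # 8. The sequence of processed images is the animation.
    return frames
```

Passing the helpers in as arguments keeps the sketch self-contained and runnable once concrete implementations are supplied, without implying any particular landmark detector, mesh builder or renderer.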

Description

Technical Field

[0001] The invention relates to the technical field of image processing, and in particular to a method and computing device for generating an animation based on a face image.

Background Technique

[0002] In daily photography and social networking, users want to make dynamic adjustments to the faces in an image to make it more fun.

[0003] Currently, techniques for dynamically adjusting face images are mainly based on two-dimensional image processing transformations and on three-dimensional face model reconstruction. Processing only the two-dimensional image often causes problems such as facial distortion and a strong sense of flatness, so more and more techniques adopt methods based on three-dimensional face model reconstruction. Existing 3D face model reconstruction techniques are mainly based on a linear combination of 3D face model data, with the combination parameters obtained by matching the projection of the 2D face to the corresponding 3D ...
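The linear-combination reconstruction mentioned in the background can be illustrated with a small least-squares fit: the 3D face landmarks are expressed as a mean shape plus a weighted sum of basis shapes, and the weights are chosen so that the projected 3D landmarks match the detected 2D landmarks. The orthographic 2x3 projection matrix, the ridge regularisation and the function name below are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def fit_linear_face_model(landmarks_2d, mean_shape, basis, P, reg=1e-3):
    """Solve for coefficients alpha such that P @ (mean_shape + basis @ alpha)
    approximates the detected 2D landmarks.

    landmarks_2d : (N, 2)  detected 2D face feature points
    mean_shape   : (3N,)   mean 3D landmark positions, flattened (x, y, z per point)
    basis        : (3N, K) linear shape basis (e.g. PCA modes)
    P            : (2, 3)  projection matrix applied to each 3D point
    reg          : ridge regularisation keeping the coefficients small
    """
    n = landmarks_2d.shape[0]
    # Apply P to every 3D landmark at once via a block-diagonal matrix.
    P_big = np.kron(np.eye(n), P)                 # (2N, 3N)
    target = landmarks_2d.reshape(-1)             # (2N,)
    A = P_big @ basis                             # (2N, K)
    b = target - P_big @ mean_shape               # (2N,)
    # Regularised least squares for the combination coefficients.
    alpha = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ b)
    shape_3d = (mean_shape + basis @ alpha).reshape(n, 3)
    return alpha, shape_3d

# Example with synthetic data: 68 landmarks, a 40-mode basis.
rng = np.random.default_rng(0)
mean = rng.normal(size=68 * 3)
B = rng.normal(size=(68 * 3, 40))
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])                   # orthographic projection (assumed)
pts2d = (np.kron(np.eye(68), P) @ (mean + B @ rng.normal(size=40))).reshape(68, 2)
alpha, shape = fit_linear_face_model(pts2d, mean, B, P)
```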


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T13/40G06T17/20
CPCG06T13/40G06T17/20
Inventor 戴吟臻李志阳吕仰铭张伟李启东洪炜冬
Owner 厦门美图宜肤科技有限公司