Facial expression synthetic method based on feature points

A facial expression synthesis method, applied in the field of image processing, which solves the problem of existing approaches not fully considering the special structure of the face mesh, and achieves a small amount of calculation, cost savings, and satisfaction of the needs of real-time animation.

Active Publication Date: 2013-04-10
DALIAN UNIV

AI Technical Summary

Problems solved by technology

Since the face is a surface structure with an open area, many problems involving the distance between two points on the face are approximated with the straight-line (Euclidean) distance, which does not fully consider the special structure of the face mesh.
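Because the method instead relies on geodesic distances over the face mesh, it helps to see how such distances can be obtained in practice. The sketch below is an illustrative assumption, not the patented procedure: it approximates geodesic distances by running Dijkstra's shortest-path algorithm over the triangle-mesh edge graph, using Euclidean edge lengths as weights.

```python
# Assumption: the excerpt does not specify how geodesic distances are computed;
# a common approximation is the shortest-path distance over the mesh edge graph.
import heapq
import numpy as np

def approx_geodesic(vertices, edges, src):
    """Approximate geodesic distances (same units as `vertices`, e.g. mm)
    from vertex index `src` to every vertex, via Dijkstra on the edge graph."""
    n = len(vertices)
    adj = [[] for _ in range(n)]
    for a, b in edges:                                    # undirected mesh edges
        w = float(np.linalg.norm(vertices[a] - vertices[b]))
        adj[a].append((b, w))
        adj[b].append((a, w))
    dist = np.full(n, np.inf)
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                                      # stale heap entry
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist
```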




Embodiment Construction

[0025] Figure 1 shows the algorithm flow chart of the present invention, which specifically comprises the following steps:

[0026] The first step: conversion of expression space

[0027] Establish the mapping relationship between the markers of the first frame of motion capture data and the markers of the target face model; the mapping relationship can be expressed as follows:

[0028] \hat{m}_i = \sum_{j=1}^{n} w_j \, \varphi\!\left(g_{ij}\right), \quad i = 1, \ldots, n

[0029] Here m_i = (x_i, y_i, z_i) is the spatial coordinate of the i-th marker point in the first frame of the motion capture sequence, with x_i, y_i and z_i given in millimeters; g_{ij} is the geodesic distance, also in millimeters, between the i-th and j-th marker points in the first frame; \varphi is the radial basis function; w_j is the weight coefficient to be solved; n is the number of markers, an integer whose value is 60 according to the number of markers initially set; \hat{m}_i is the spatial coordinate (x_i, y_i, z_i) of the i-th marker point on the target face model; the x_i, ...
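As a rough sketch of how the weight coefficients w_j in the relationship above could be solved, the code below assembles the basis matrix from the geodesic distances and solves the resulting linear system. The choice of a Gaussian basis function and the parameter sigma are assumptions for illustration; the excerpt does not state which radial basis function the patent uses.

```python
# Assumptions: Gaussian basis function and vector-valued weights solved per
# coordinate with a dense linear solve; illustrative only, apart from the
# general RBF mapping described above.
import numpy as np

def solve_rbf_weights(geodesic, target_markers, sigma=30.0):
    """geodesic:       (n, n) geodesic distances g_ij among the first-frame markers, in mm
       target_markers: (n, 3) marker coordinates on the target face model
       returns:        (n, 3) weight coefficients w_j (one 3-vector per marker)"""
    phi = np.exp(-(geodesic ** 2) / (2.0 * sigma ** 2))   # basis matrix Phi[i, j] = phi(g_ij)
    return np.linalg.solve(phi, target_markers)           # solve Phi @ W = M_target for W

def map_markers(geodesic_to_markers, weights, sigma=30.0):
    """Evaluate the mapping for points described by their geodesic distances
       to the n markers: input of shape (m, n) -> mapped coordinates (m, 3)."""
    phi = np.exp(-(geodesic_to_markers ** 2) / (2.0 * sigma ** 2))
    return phi @ weights
```

With n = 60 markers, as stated above, the linear system is only 60 x 60, which is consistent with the small calculation amount the method aims for.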



Abstract

The invention discloses a facial expression synthesis algorithm based on a hybrid transformation strategy. The transition from the motion capture data space to the motion space of the target face model is achieved by building a radial-basis-function expression retargeting model based on geodesic distance. In the facial expression animation stage, the result of this space transformation is used: the local motion of a vertex is calculated through a weighted local transformation method based on adjacent feature points, while the global displacement of the vertex is calculated through a global transformation method based on radial basis function interpolation. Finally, the final displacement of the vertex is obtained by mixing the local displacement and the global displacement. With this algorithm, the same capture sequence can be applied to different face models, making it easy to switch models, and different capture sequences can be used on the same target model, so motion capture data can be reused while the resulting animation remains highly realistic. The algorithm flow chart of the facial expression synthesis algorithm based on the hybrid transformation strategy is shown in accompanying Figure 1.
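The sketch below illustrates the hybrid strategy described in the abstract: a local vertex displacement computed as a weighted combination of adjacent feature-point displacements, a global displacement computed by radial basis function interpolation, and a blend of the two. The inverse-distance local weights, the Gaussian basis, the neighbour count k, and the blend factor alpha are assumptions for illustration; the patent's actual weighting and blending rules are not given in this excerpt.

```python
# Assumptions: inverse-distance local weights over the k nearest feature points,
# Gaussian RBF for the global term, and a fixed blend factor alpha.
import numpy as np

def local_displacement(vertex_to_feature, feature_disps, k=4, eps=1e-8):
    """Weighted local transformation based on adjacent feature points.
       vertex_to_feature: (V, n) distances from each mesh vertex to the n feature points
       feature_disps:     (n, 3) feature-point displacements in the current frame"""
    idx = np.argsort(vertex_to_feature, axis=1)[:, :k]        # k nearest feature points per vertex
    d = np.take_along_axis(vertex_to_feature, idx, axis=1)    # (V, k) distances to those points
    w = 1.0 / (d + eps)
    w /= w.sum(axis=1, keepdims=True)                         # normalized local weights
    return np.einsum('vk,vkc->vc', w, feature_disps[idx])     # (V, 3) local displacement

def global_displacement(vertex_to_feature, feature_dists, feature_disps, sigma=50.0):
    """Global transformation based on radial basis function interpolation.
       feature_dists: (n, n) pairwise distances among the feature points"""
    phi_ff = np.exp(-(feature_dists ** 2) / (2.0 * sigma ** 2))
    weights = np.linalg.solve(phi_ff, feature_disps)          # (n, 3) RBF weights
    phi_vf = np.exp(-(vertex_to_feature ** 2) / (2.0 * sigma ** 2))
    return phi_vf @ weights                                   # (V, 3) global displacement

def hybrid_displacement(local_disp, global_disp, alpha=0.5):
    """Final vertex displacement: mix of the local and global displacements."""
    return alpha * local_disp + (1.0 - alpha) * global_disp
```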

Description

Technical Field

[0001] The invention relates to a method for synthesizing human facial expressions based on feature points, and belongs to the technical field of image processing.

Background Technology

[0002] The traditional mode of human-computer interaction is based on the mouse and keyboard. With the development of computer graphics and computer hardware technology, realistic facial animation technology provides a more convenient interface for human-computer interaction. In the field of entertainment, more and more realistic virtual humans have been applied in various film and television works. In the field of education, since virtual human faces can produce various realistic facial expressions, it is easier to hold students' attention. Virtual faces can also play a role in customer relationship management. In addition, facial animation technology can also be applied to fields such as medical treatment, news broadcasting, advertising and psychology. In the field o...


Application Information

IPC(8): G06T13/40, G06T15/00
Inventors: 张强, 李蓓蓓, 魏小鹏
Owner: DALIAN UNIV