Expression interaction method based on face tracking and analysis

An expression and face technology, applied in the field of face-tracking-based expression interaction, which addresses problems such as the lack of generality in existing methods and achieves strong versatility, robust face tracking, and fast processing speed.

Inactive Publication Date: 2012-05-30
北京盛开智联科技有限公司


Problems solved by technology

But this method is not universal, and the parameters


Embodiment Construction

[0018] The present invention will be described in detail below with reference to the accompanying drawings. It should be noted that the described embodiments are only intended to facilitate the understanding of the present invention and do not have any limiting effect on it. The present invention is illustrated by the following embodiments:

[0019] The method comprises the following steps: making the 3D models, training the active appearance model, initializing face tracking, setting the energy function for face tracking and expression analysis, detecting whether the eyes are open or closed, and driving the 3D model. The specific implementation process is as follows:
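As an orientation before the step-by-step description, the following is a minimal sketch of the online stage only, assuming the offline artefacts (the 3D face model with its 14 expression states and the trained active appearance model) already exist. OpenCV's Haar-cascade detector is used here only as a stand-in for the patent's active-appearance-model-based tracking, and extract_expression_parameters and drive_model are hypothetical placeholders, not functions named in the patent.

```python
import cv2  # OpenCV, assumed available; used here for capture and detection

def extract_expression_parameters(face_roi):
    """Hypothetical placeholder: would fit the face/expression model to the
    tracked face region and return one weight per basic expression."""
    return [0.0] * 14

def drive_model(weights):
    """Hypothetical placeholder: would apply the expression weights to the
    target 3D face model so it performs the same expression."""
    pass

def main():
    # Haar-cascade face detector, standing in for the patent's AAM tracker.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    capture = cv2.VideoCapture(0)  # camera collecting the performer's face
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            for (x, y, w, h) in faces[:1]:  # process the first detected face
                weights = extract_expression_parameters(gray[y:y + h, x:x + w])
                drive_model(weights)
            cv2.imshow("performer", frame)
            if cv2.waitKey(1) == 27:  # Esc quits
                break
    finally:
        capture.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```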

[0020] 1. Make a 3D model

[0021] The production of the 3D models belongs to the offline preprocessing stage. The purpose is to design a 3D face model together with corresponding models for 14 expression states; some of these models are shown in figure 2. During expression interaction, the model simulates the performer's expression in real time. In the present invention, the 14 basic expressions are divided...
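The paragraph above describes a neutral 3D face model plus models for 14 expression states. One common way to animate such a set, not necessarily the exact formulation used in the patent, is to store each expression model as a vertex offset from the neutral mesh and blend the offsets with the extracted expression weights, as in the minimal numpy sketch below (shapes and toy data are illustrative).

```python
import numpy as np

def blend_expressions(neutral, targets, weights):
    """neutral: (V, 3) vertex array of the neutral face mesh;
    targets: (14, V, 3) vertex arrays of the 14 expression-state models;
    weights: (14,) expression parameters, typically in [0, 1]."""
    deltas = targets - neutral[None, :, :]  # per-expression vertex offsets
    return neutral + np.tensordot(weights, deltas, axes=1)

# Toy usage: 4 vertices, 14 expression targets, only expression 0 active.
neutral = np.zeros((4, 3))
targets = np.random.rand(14, 4, 3)
weights = np.zeros(14)
weights[0] = 1.0
animated = blend_expressions(neutral, targets, weights)
print(animated.shape)  # (4, 3)
```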


Abstract

The invention provides an expression interaction method based on face tracking and analysis, belonging to the fields of graphic and image processing and computer vision. The method comprises the following steps: collecting an expression image of a face with a camera; analyzing the captured face image in real time using face tracking and expression analysis techniques, so as to track the face and extract expression parameters; and then driving a target three-dimensional face model with the extracted expression parameters so that it performs the same expression animation. The expression interaction method provided by the invention has the advantages of strong automation, robustness, and interactivity, and is applicable to fields such as film production, three-dimensional games, and interactive multimedia.

Description

Technical field

[0001] The invention relates to the field of graphic and image processing and computer vision, and in particular to a method of face-based tracking and expression interaction.

Background technique

[0002] Expression interaction refers to the technology of driving virtual characters to make similar expressions by capturing facial expressions in real time. It has a wide range of applications in human-computer interactive virtual games, virtual human broadcasting, 3D film and television production, and so on; for example, the 3D movie "Avatar" used expression interaction technology to create the expression animation of the Na'vi people. The expression interaction based on facial expression tracking and analysis proposed in the present invention refers to using a camera to collect images in real time, using a face tracking algorithm to track the human face in the video, and analyzing the facial expression parameters in each frame; the extracted expression parameters are then used...
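The embodiment overview above lists detecting whether the eyes are open or closed among the per-frame analysis steps, but the criterion itself is not included in this excerpt. The sketch below uses the widely known eye-aspect-ratio heuristic over tracked eye landmarks as one plausible stand-in; the six-landmark layout and the 0.2 threshold are assumptions, not values taken from the patent.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """eye: (6, 2) landmark array ordered as outer corner, two upper-lid
    points, inner corner, two lower-lid points (assumed layout)."""
    vertical = (np.linalg.norm(eye[1] - eye[5]) +
                np.linalg.norm(eye[2] - eye[4]))
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def eye_is_open(eye, threshold=0.2):
    # Threshold is an illustrative value, not one taken from the patent.
    return eye_aspect_ratio(eye) > threshold

# Toy usage with a synthetic wide-open eye (EAR ≈ 0.67).
open_eye = np.array([[0, 0], [1, 1], [2, 1], [3, 0], [2, -1], [1, -1]], dtype=float)
print(eye_is_open(open_eye))  # True
```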

Claims


Application Information

IPC(8): G06T13/20; G06T7/00; G06K9/46
Inventor 姚健, 曾祥永, 杜志军, 王阳生
Owner 北京盛开智联科技有限公司