Facial action unit strength estimation-based expression analysis method

A technology relating to facial movements and action units, applied in the field of expression analysis, which can solve the problem of low recognition accuracy

Publication Date: 2017-11-24 (Inactive)
SHENZHEN WEITESHI TECH


Problems solved by technology

[0004] Aiming at the problem of low recognition accuracy, the object of the present invention is to provide an expression analysis method based on facial action unit strength estimation. The structured deep conditional random field has two settings: in the first setting, given an input facial image, a predefined convolutional neural network layer is applied to the (normalized) input image to generate a feature map; the second setting uses a data augmentation learning method in order to leverage information from multiple data sets. The unary node potentials are defined in the structured deep conditional random field; when the random variables are discrete, a joint distribution can be constructed over the discrete variables, and the objective function of the structured deep conditional random field is generated by augmented learning from the multiple data sets.
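The first setting described above can be sketched as follows. This is a minimal illustration, assuming PyTorch and illustrative layer sizes (the patent does not specify the network architecture): a predefined convolutional layer stack maps the normalized face image to a feature map, which is pooled into a feature vector, and one ordinal projection vector per action unit produces the per-AU scores that the unary potentials later use.

```python
# Minimal sketch (not the patent's exact architecture): a predefined CNN
# maps a normalized face image to a feature map, the feature map is pooled
# into a feature vector x, and each AU q gets an ordinal projection score
# beta_q^T x that the structured CRF's unary potentials operate on.
import torch
import torch.nn as nn

class UnaryFeatureNet(nn.Module):
    def __init__(self, num_aus=5, feat_dim=64):
        super().__init__()
        # "Predefined convolutional neural network layer" from the text;
        # the layer sizes here are illustrative assumptions.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, feat_dim, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),       # pool the feature map to a vector
        )
        # One ordinal projection vector beta_q per action unit q.
        self.beta = nn.Linear(feat_dim, num_aus, bias=False)

    def forward(self, img):                # img: (B, 1, H, W), normalized
        x = self.cnn(img).flatten(1)       # feature vector x per image
        return self.beta(x)                # beta_q^T x for each AU q

net = UnaryFeatureNet()
scores = net(torch.randn(2, 1, 64, 64))   # (2, num_aus) ordinal scores
```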




Embodiment Construction

[0046] It should be noted that the embodiments of the present application and the features in the embodiments may be combined with each other where no conflict arises. The present invention will be further described in detail below in conjunction with the drawings and specific embodiments.

[0047] Figure 1 is a system framework diagram of the expression analysis method based on facial action unit strength estimation of the present invention. The method mainly comprises a structured deep conditional random field (CRF), unary potentials, pairwise potentials, and learning and inference.
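The components listed in paragraph [0047] can be pictured with the following hedged skeleton (an assumed structure for illustration, not the exact model of the invention): unary and pairwise potentials over discrete AU intensity labels are combined into a joint log-potential, and inference is performed here by brute-force enumeration, which is only feasible for a toy number of AUs and levels.

```python
# Illustrative skeleton (assumed structure): a structured CRF over Q discrete
# AU intensity labels combines unary potentials unary[q, l] with pairwise
# potentials pairwise[q, r, l, l'] and infers the MAP labeling by enumeration.
import itertools
import numpy as np

def joint_log_potential(y, unary, pairwise, edges):
    """Log-potential of one joint labeling y = (l_1, ..., l_Q)."""
    score = sum(unary[q, y[q]] for q in range(len(y)))
    score += sum(pairwise[q, r, y[q], y[r]] for q, r in edges)
    return score

def exact_map_inference(unary, pairwise, edges):
    """Brute-force MAP over all labelings of the discrete AU intensities."""
    Q, L = unary.shape
    return max(itertools.product(range(L), repeat=Q),
               key=lambda y: joint_log_potential(y, unary, pairwise, edges))

Q, L = 3, 4                               # toy example: 3 AUs, 4 intensity levels
unary = np.random.randn(Q, L)
pairwise = np.random.randn(Q, Q, L, L)
edges = [(0, 1), (1, 2)]                  # assumed AU co-occurrence edges
print(exact_map_inference(unary, pairwise, edges))
```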

[0048] Unary potentials: let l ∈ {1, ..., L} be the ordinal label of the intensity level of the q-th AU, using the standard threshold model:

[0049] $y_q = l \;\Longleftrightarrow\; \gamma_{q,l-1} < \beta_q^{\top}\mathbf{x} + \varepsilon_q \le \gamma_{q,l}$

[0050] Among them, $\beta_q$ is the ordinal projection vector and $\gamma_{q,l-1}$ is the lower threshold of intensity level $l$.

[0051] Unary node potentials: by assuming the error (noise) term $\varepsilon_q$ has zero mean and variance $(\sigma_q)^2$, its normal cumulative distribution function $\Phi$ gives the probability of the q-th AU taking intensity level $l$ as $P(y_q = l \mid \mathbf{x}) = \Phi\!\left(\frac{\gamma_{q,l} - \beta_q^{\top}\mathbf{x}}{\sigma_q}\right) - \Phi\!\left(\frac{\gamma_{q,l-1} - \beta_q^{\top}\mathbf{x}}{\sigma_q}\right)$, which serves as the unary node potential.
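A short numerical sketch of the ordinal unary potential reconstructed above (the threshold values, σ_q, and score used here are toy assumptions): the probability of intensity level l is the Gaussian mass between the consecutive thresholds γ_{q,l-1} and γ_{q,l}.

```python
# Hedged sketch of the ordinal (threshold-model) unary potential: the
# probability that AU q takes intensity level l is the Gaussian noise mass
# between consecutive thresholds gamma_{q,l-1} and gamma_{q,l}.
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordinal_level_prob(score, gammas, sigma, level):
    """P(y_q = level | x) with score = beta_q^T x and thresholds gammas.

    gammas is assumed to be [-inf, g_1, ..., g_{L-1}, +inf], so that level l
    (1-indexed) lies between gammas[level-1] and gammas[level].
    """
    upper = normal_cdf((gammas[level] - score) / sigma)
    lower = normal_cdf((gammas[level - 1] - score) / sigma)
    return upper - lower

gammas = [-math.inf, -1.0, 0.0, 1.0, math.inf]   # toy thresholds, L = 4
probs = [ordinal_level_prob(0.3, gammas, sigma=1.0, level=l) for l in range(1, 5)]
print(probs, sum(probs))                         # the four probabilities sum to 1
```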


Abstract

The present invention provides an expression analysis method based on facial action unit strength estimation. The method mainly comprises a structured deep conditional random field, unary potentials, pairwise potentials, and learning and inference. The structured deep conditional random field has two settings: in the first setting, given an input facial image, a predefined convolutional neural network layer is applied to the (normalized) input image to generate a feature map; the second setting uses a data augmentation learning method to exploit information coming from a plurality of data sets. The unary node potentials are defined in the structured deep conditional random field; when the random variables are discrete, a joint distribution can be constructed over the discrete variables, and the objective function of the structured deep conditional random field is learned by augmented learning from the plurality of data sets. The expression analysis method of the present invention utilizes deep structured learning, can process high-dimensional input features, substantially improves the image features, significantly improves performance, and improves the accuracy of facial action unit strength estimation.

Description

technical field

[0001] The invention relates to the field of expression analysis, and in particular to an expression analysis method based on facial action unit strength estimation.

Background technique

[0002] The recognition and analysis of facial expressions is an important research direction in the field of artificial intelligence. It not only has universal significance in social life, but also plays an important role in computer affective computing. By automatically recognizing people's facial expressions, people's emotions and psychological activities can then be analyzed. It can be applied in the field of security: in public places such as airports and subway stations, people's expressions and movements can be automatically analyzed through monitoring equipment such as installed cameras, and through these analyses a person's psychology can be further judged, so as to identify suspicious persons or even terrorists and prevent their criminal behavior; it can also he...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/174
Inventor: 夏春秋
Owner: SHENZHEN WEITESHI TECH