
Facial action unit recognition method and device based on adaptive attention and space-time correlation

A computer-vision technology combining spatio-temporal correlation with facial-movement analysis. It addresses the problems that facial action units have no obvious contours or textures, that existing methods neglect the specificity and dynamics of each unit, and that recognition accuracy is low, achieving improved robustness and high recognition accuracy.

Active Publication Date: 2022-08-02
CHINA UNIV OF MINING & TECH

AI Technical Summary

Problems solved by technology

[0004] Although automatic recognition of facial action units has made impressive progress, current AU detection methods based on region learning use only AU labels to supervise a neural network that adaptively learns implicit attention, because AUs have no obvious contours or textures and may vary across people and expressions; this implicit attention often captures irrelevant regions. In AU detection methods based on relational reasoning, on the other hand, all AUs share parameters during inference, ignoring the specificity and dynamics of each AU. As a result, recognition accuracy is not high, and there is room for further improvement.
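The shared-parameter limitation described above can be made concrete with a toy sketch in which each AU keeps its own attention vector over facial regions instead of sharing one set of inference parameters. This is an illustrative NumPy sketch under assumed shapes and names (`region_feats`, `attn_logits`), not the patent's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def per_au_pool(region_feats, attn_logits):
    """Pool region features with a separate attention map per AU.

    region_feats: (R, D) features for R facial regions.
    attn_logits:  (A, R) unshared attention parameters, one row per AU,
                  so each AU can attend to its own (contour-free) regions.
    Returns (A, D): one attended feature vector per AU.
    """
    attn = softmax(attn_logits, axis=1)   # (A, R), each row sums to 1
    return attn @ region_feats            # (A, D)

rng = np.random.default_rng(0)
feats = rng.standard_normal((49, 16))     # e.g. a 7x7 grid of regions
logits = rng.standard_normal((12, 49))    # 12 AUs, independent weights
au_feats = per_au_pool(feats, logits)
print(au_feats.shape)                     # (12, 16)
```

In a real model the attention logits would be regressed by a network per AU (and per frame), which is what keeps each AU's attention specific and dynamic rather than shared.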




Detailed Description of the Embodiments

[0062] The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.

[0063] In the description of the present invention, it should be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "horizontal", "top", "bottom", "front", "rear", "left", "right", "vertical", "inner" and "outer", are based on the orientations or positional relationships shown in the drawings. They are used only to describe the present invention conveniently and to simplify the description; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and should not be construed as indicating or implying relative importance.

...



Abstract

The invention discloses a facial action unit recognition method and device based on adaptive attention and space-time correlation. The method first extracts the original continuous image frames required for model training from video data to form a training data set, then preprocesses the original image frames to obtain an augmented image-frame sequence. A convolutional neural network module I is constructed to extract hierarchical multi-scale region features from the augmented image frames; a convolutional neural network module II is constructed to perform adaptive attention regression learning for the facial action units; an adaptive spatio-temporal graph convolutional neural network module III is constructed to learn the spatio-temporal associations of the facial action units; and finally, a fully connected layer module IV is constructed to perform facial action unit recognition. The method adopts an end-to-end deep learning framework to learn action unit recognition; by exploiting the interdependencies and spatio-temporal correlations between facial action units, it can effectively recognize the motion of facial muscles in two-dimensional images and realize the construction of a facial action unit recognition system.
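The four-module pipeline in the abstract can be sketched end to end. The following NumPy toy (random weights, tiny shapes; all names such as `st_graph_conv` stand-ins are my own, not the patent's) illustrates the data flow: per-frame region features, per-AU attention pooling, graph propagation over an AU relation graph, and a sigmoid head for multi-label AU detection.

```python
import numpy as np

rng = np.random.default_rng(1)
T, R, D, A = 4, 49, 16, 12   # frames, regions, feature dim, number of AUs

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Module I stand-in: hierarchical multi-scale region features per frame.
frames = rng.standard_normal((T, R, D))

# Module II stand-in: adaptive attention regression -> per-AU features.
attn = softmax(rng.standard_normal((A, R)), axis=1)      # (A, R)
au_feats = np.einsum('ar,trd->tad', attn, frames)        # (T, A, D)

# Module III stand-in: graph convolution over the AU relation graph,
# with a symmetrically normalized adjacency D^-1/2 (A + I) D^-1/2.
adj = (rng.random((A, A)) > 0.5).astype(float)
adj = np.maximum(adj, adj.T) + np.eye(A)                 # symmetric, self-loops
d_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
a_hat = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
W = rng.standard_normal((D, D)) * 0.1
spatial = np.maximum(0.0, np.einsum('ij,tjd,de->tie', a_hat, au_feats, W))
temporal = spatial.mean(axis=0)                          # (A, D) crude temporal pooling

# Module IV stand-in: per-AU sigmoid head for multi-label detection.
w_fc = rng.standard_normal((A, D)) * 0.1
probs = 1.0 / (1.0 + np.exp(-np.einsum('ad,ad->a', w_fc, temporal)))
print(probs.shape)                                       # (12,)
```

In the actual method the adjacency and attention would be learned end to end and the temporal dimension handled by the spatio-temporal graph module rather than a mean; this sketch only fixes the shapes and the order of the four stages.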

Description

Technical Field

[0001] The invention relates to a facial action unit recognition method and device based on adaptive attention and space-time correlation, and belongs to the field of computer vision technology.

Background Technique

[0002] In order to study human facial expressions more carefully, the famous American emotional psychologist Ekman first proposed the Facial Action Coding System (FACS) in 1978 and made important improvements to it in 2002. The facial action coding system divides the face, according to its anatomical characteristics, into several facial action units that are both independent and interconnected; a facial expression can be reflected by the action features of these facial action units and the main areas they control.

[0003] With the development of computer technology and information technology, deep learning technology has been widely used. In the field of AU (Facial Action Unit) recognition, research on AU recognition based on deep learning models has beco...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V40/16, G06V10/82, G06V10/774, G06N3/04, G06N3/08
CPC: G06V40/16, G06V40/171, G06V10/82, G06V10/774, G06N3/08, G06N3/047, G06N3/048, G06N3/045, Y02T10/40
Inventor: 邵志文, 周勇, 陈浩, 于清
Owner: CHINA UNIV OF MINING & TECH