Facial expression recognition method based on joint learning of identity information and emotion information

A facial expression and identity information technology, applied in the fields of computer vision and affective computing, achieving improved recognition performance and improved robustness to inter-subject differences

Inactive Publication Date: 2019-02-19
Applicant: DUKE KUNSHAN UNIVERSITY +1

AI Technical Summary

Problems solved by technology

In other words, even with the same facial expression attribute, the facial appearance can differ greatly from one subject to another, which limits the accuracy of expression recognition.

Examples

Embodiment 1

[0037] Embodiment 1: the present invention is tested on the Extended Cohn-Kanade (CK+) dataset.

[0038] Step 1: First, use the CASIA-WebFace face recognition database to train the sub-network that extracts face identity information. CASIA-WebFace contains a total of 494,414 images of 10,757 individuals. At the same time, use the Labeled Faces in the Wild (LFW) dataset to evaluate face recognition accuracy. The sub-network consists of multiple convolutional and pooling layers and finally extracts a 160-dimensional face identity feature vector. After training and tuning, the network achieves 91% accuracy on the LFW dataset. Since our ultimate goal is not face verification, we do not further optimize the face verification performance.
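A minimal sketch of such an identity branch is given below, assuming PyTorch. The layer counts and channel widths are illustrative guesses; only the 160-dimensional output and the 10,757-identity classification head reflect the text.

```python
# Sketch of the Step 1 identity sub-network (PyTorch assumed).
# Layer counts and channel widths are illustrative; the 160-d
# embedding and 10,757 CASIA-WebFace identities come from the text.
import torch
import torch.nn as nn

class IdentityBranch(nn.Module):
    def __init__(self, embedding_dim: int = 160, num_identities: int = 10757):
        super().__init__()
        # Stacked convolution + pooling stages, as described in Step 1.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        # Projection to the 160-dimensional identity feature vector.
        self.embed = nn.Linear(128, embedding_dim)
        # Identity classification head used only during pre-training;
        # this plays the role of the "last fully connected layer" that
        # the abstract says is removed after training.
        self.classifier = nn.Linear(embedding_dim, num_identities)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.embed(h)  # identity feature vector

# After pre-training on CASIA-WebFace, only forward() is used, so the
# classifier head is effectively discarded.
```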

[0039] Step 2: Use the CK+ facial expression database to train the sub-network that extracts facial expression information. The CK+ database contains 327 image sequences with facial expression labels.
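The text does not specify the expression sub-network's structure or training details. The following sketch assumes PyTorch, seven CK+ expression labels, and a cross-entropy objective; a synthetic batch stands in for CK+ frames.

```python
# Hedged sketch of Step 2: train an expression sub-network on CK+.
# The class count (7), feature dimension (160), and optimizer settings
# are assumptions, not details stated in the text.
import torch
import torch.nn as nn

expression_net = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 160),   # emotion feature vector (dimension assumed)
    nn.Linear(160, 7),    # expression head, removed later per the abstract
)
optimizer = torch.optim.Adam(expression_net.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)   # synthetic stand-in for CK+ frames
labels = torch.randint(0, 7, (8,))   # synthetic expression labels

logits = expression_net(images)
loss = criterion(logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```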

Embodiment 2

[0042] Embodiment 2: the technology of the present invention is tested on the FER+ dataset.

[0043] In the first step, consistent with Embodiment 1, the CASIA-WebFace face recognition database is first used to train the sub-network for extracting face identity information; the network structure, the 160-dimensional identity feature vector, and the 91% LFW accuracy are the same as in Embodiment 1.

[0044] In the second step, since the FER+ dataset is significantly larger than the CK+ dataset, the deeper ResNet 18-layer network is used as the facial expression sub-network.
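A sketch of this step follows, assuming torchvision's resnet18 stands in for the "ResNet 18-layer" network named in the text, and assuming the eight-class FER+ label set; both are assumptions rather than details from the text.

```python
# Hedged sketch of the Embodiment 2 expression branch using
# torchvision's ResNet-18 as a stand-in; the 8-class FER+ label
# set and input size are assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18

expr_net = resnet18()                                # 18-layer residual network
expr_net.fc = nn.Linear(expr_net.fc.in_features, 8)  # FER+ emotion classes (assumed)

x = torch.randn(4, 3, 224, 224)  # synthetic batch at ResNet's usual input size
logits = expr_net(x)
print(logits.shape)              # -> torch.Size([4, 8])
```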

Abstract

The invention discloses a facial expression recognition method based on joint learning of identity information and emotion information, using a face recognition image database and a facial expression image database. The face identity information network branch is first trained independently on the face recognition image database; after training, the last fully connected layer is removed, so that the neural network extracts an identity feature vector from the input image. The facial expression information network branch is trained on the facial expression image database; after its fully connected layer is removed, the neural network extracts an emotion feature vector from the input image. The identity feature vector and the emotion feature vector are concatenated to obtain a concatenated facial expression feature. This concatenated feature, fusing identity information and emotion information, is fed to a fully connected layer, and subsequent training uses only the facial expression image database for joint learning and optimization of the merged network. The present invention improves the robustness of the facial expression recognition method to differences between subjects.
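The fusion step can be illustrated as follows; this is a minimal sketch assuming PyTorch, with illustrative feature dimensions, class count, and stand-in backbones in place of the actual pre-trained branches.

```python
# Minimal sketch of the fusion described in the abstract: identity and
# emotion feature vectors are concatenated and fed to a fully connected
# layer. Dimensions (160 + 160), the 7-class output, and the tiny
# stand-in backbones are illustrative assumptions.
import torch
import torch.nn as nn

def small_backbone(out_dim: int) -> nn.Module:
    # Stand-in for a pre-trained branch with its last FC layer removed.
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, out_dim),
    )

class JointExpressionNet(nn.Module):
    def __init__(self, id_dim: int = 160, emo_dim: int = 160, num_classes: int = 7):
        super().__init__()
        self.identity_branch = small_backbone(id_dim)  # pre-trained on face ID data
        self.emotion_branch = small_backbone(emo_dim)  # pre-trained on expression data
        # Fully connected layer over the concatenated feature vector.
        self.fc = nn.Linear(id_dim + emo_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.identity_branch(x), self.emotion_branch(x)], dim=1)
        return self.fc(fused)

# Subsequent joint training would use only expression-labeled images,
# as the abstract describes.
net = JointExpressionNet()
print(net(torch.randn(2, 3, 64, 64)).shape)  # -> torch.Size([2, 7])
```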

Description

Technical field

[0001] The present invention relates to the fields of computer vision and affective computing, and more specifically, to a facial expression recognition method based on joint learning of identity information and emotion information.

Background technique

[0002] With the rapid development of computer technology, artificial intelligence, and related disciplines, the degree of automation across society is constantly improving, and people's demand for human-computer interaction that resembles the way people communicate is becoming ever stronger. Facial expression is the most direct and effective mode of emotion recognition, with many applications in human-computer interaction. If computers and robots could understand and express emotions as humans do, it would fundamentally change the relationship between humans and computers and enable computers to serve humans better. Facial expression recognition is the basis of emotion understanding, th...

Application Information

IPC(8): G06K 9/00; G06K 9/62
CPC: G06V 40/174; G06V 40/168; G06F 18/214
Inventors: Li Ming, Zou Xiaobing
Owner: DUKE KUNSHAN UNIVERSITY