
Emotion detection method based on multi-task and multi-label residual neural network

A neural-network-based emotion detection method in the field of service computing. It addresses the low accuracy of neural networks trained on the small amount of data available for this task, improving on previous multi-task methods and offering good versatility.

Pending Publication Date: 2019-06-21
SHANDONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, the amount of data available for this task is small, so data-intensive neural networks trained on it tend to be inaccurate.



Examples


Embodiment 1

[0050] A specific embodiment of the present invention is described below:

[0051] Collect the dataset documents, which comprise two datasets: the SFEW2.0 dataset and the EmotioNet dataset. The SFEW2.0 dataset contains 958 training samples, 436 validation samples, and 372 test samples, covering seven emotion categories. The EmotioNet dataset includes 25,000 manually annotated facial expression samples, of which 2,000 are annotated with the seven emotion categories.

[0052] Execute step 1, labeling the image data as follows: the emotion "angry" is labeled "1", "disgust" is labeled "2", "fear" is labeled "3", "happy" is labeled "4", "sad" is labeled "5", "surprise" is labeled "6", and "neutral" is labeled "7".
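The labeling scheme in step 1 can be sketched as a simple lookup table; the mapping values come from the text, while the function itself is illustrative.

```python
# Label mapping described in step 1 (emotion name -> numeric label).
EMOTION_LABELS = {
    "angry": 1,
    "disgust": 2,
    "fear": 3,
    "happy": 4,
    "sad": 5,
    "surprise": 6,
    "neutral": 7,
}

def label_emotion(emotion: str) -> int:
    """Return the numeric label for an emotion category name."""
    return EMOTION_LABELS[emotion.lower()]
```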

[0053] Execute step 2, using one-hot encoding to convert...
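The one-hot conversion in step 2 can be sketched as follows; this is a minimal illustrative implementation, assuming the 1-indexed labels from step 1 and seven emotion classes.

```python
def one_hot(label: int, num_classes: int = 7) -> list[int]:
    """Convert a 1-indexed emotion label into a one-hot vector."""
    # Labels from step 1 run from 1 to 7, so shift down by one for indexing.
    vec = [0] * num_classes
    vec[label - 1] = 1
    return vec
```

For example, the label "3" (fear) becomes a vector with a single 1 in the third position.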



Abstract

The invention discloses an emotion detection method based on a multi-task and multi-label residual neural network, belonging to the technical field of service computing, which exploits cross-domain datasets to perform emotion polarity detection on datasets from different fields. In the present invention, a common feature representation is shared with related tasks by applying a multi-task learning loss function. In particular, emotion recognition benefits from combining the learning model with a detector of facial action units. The proposed loss function solves the problem of learning multiple tasks from heterogeneously labeled data and improves on previous multi-task methods. Experiments on multiple tasks and multiple labels are carried out on the SFEW2.0 dataset and the EmotioNet dataset for emotion detection and classification. The analysis of the results shows that the method has good universality, reaches a precision of 92.1%, and is suitable for most detection tasks.
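The abstract describes a loss function that learns multiple tasks from heterogeneously labeled data (categorical emotion labels plus multi-label facial action units). The patent text does not give the exact form of this loss, so the sketch below is one plausible reading, not the patented formulation: a categorical cross-entropy term for the 7-class emotion task plus a binary cross-entropy term for the action units, where a term is simply skipped when its labels are missing for a given sample.

```python
import math

def multi_task_loss(emotion_probs, emotion_label, au_probs, au_labels):
    """Illustrative combined loss for heterogeneously labeled samples.

    emotion_probs: predicted probabilities over the 7 emotion classes.
    emotion_label: 1-indexed emotion label, or None if unlabeled.
    au_probs: predicted probabilities for each facial action unit.
    au_labels: list of 0/1 action-unit labels, or None if unlabeled.
    """
    loss = 0.0
    if emotion_label is not None:
        # Categorical cross-entropy on the 1-indexed emotion label.
        loss += -math.log(emotion_probs[emotion_label - 1])
    if au_labels is not None:
        # Binary cross-entropy summed over action units.
        for p, y in zip(au_probs, au_labels):
            loss += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return loss
```

Skipping the missing term is only one simple way to handle heterogeneous labels; alternatives include task-specific weighting or label imputation.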

Description

Technical field

[0001] The invention relates to the technical field of service computing, and in particular to an emotion detection method based on a multi-task and multi-label residual neural network.

Background technique

[0002] Facial images provide information about emotional perception. As humans, we can infer accurate first impressions of someone's emotions just by looking at their face. A variety of applications benefit from automatic facial emotion recognition, such as human-computer interaction, student engagement estimation, emotion-aware devices, and the improvement of expressions produced by people with autism. New methods based on deep learning can improve facial expression recognition. However, the amount of data available for this task is small, so data-intensive neural networks trained on it tend to be inaccurate. The introduction of multi-task learning is therefore particularly important, as it demonstrates that the performance of a single task can b...

Claims


Application Information

IPC(8): G06K9/00; G06N3/08
Inventors: 田刚, 刘鹏飞, 王琦博, 孙承爱
Owner SHANDONG UNIV OF SCI & TECH