Emotion recognition method based on context interaction relationship

An emotion recognition and context technology, applied in biometric recognition, character and pattern recognition, and acquisition/recognition of facial features. It addresses the problem that ignoring contextual emotional interaction increases the emotional uncertainty of the body or scene and reduces the model's predictive ability.

Active Publication Date: 2021-07-06
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] Current work on emotion recognition combines contextual information to extract emotional cues, but mainly extracts those cues from the head, body, and scene independently, ignoring the interaction between contextual emotions. This increases the emotional uncertainty of the body or scene and reduces the predictive ability of the model.



Examples


Embodiment 1

[0072] As shown in Figure 1, this embodiment provides an emotion recognition method based on contextual interaction, comprising the following steps:

[0073] S1: Perform face detection and human body detection on each picture in the collected data set to obtain the face bounding box and the human body bounding box;

[0074] In this embodiment, OpenPose is used for human body bounding box detection and key point detection, and OpenFace is used for face bounding box detection and key point detection;

[0075] If no human body bounding box or face bounding box is detected, the coordinates of the human body bounding box [upper-left abscissa, upper-left ordinate, lower-right abscissa, lower-right ordinate] are set to [0.25 times the image width, 0.25 times the image height, 0.75 times the image width, 0.75 times the image height], and the coordinates of the face bounding box [upper-left abscissa, upper-left ordinate, lower-right abscissa, lower-right ordi...
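The fallback in [0075] can be sketched as a small helper. Only the body default is shown, because the face default is truncated in the source; the function name is an assumption for illustration, not from the patent.

```python
def default_body_box(img_w: float, img_h: float):
    """Centered fallback body bounding box [x1, y1, x2, y2] used when
    detection fails: the middle 50% of the image in each dimension."""
    return [0.25 * img_w, 0.25 * img_h, 0.75 * img_w, 0.75 * img_h]

# Example: a 640x480 image with no detected body
print(default_body_box(640, 480))  # [160.0, 120.0, 480.0, 360.0]
```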

Embodiment 2

[0116] This embodiment provides an emotion recognition system based on context interaction relationships, comprising: a bounding box extraction module, a picture preprocessing module, a training image tuple building module, a benchmark neural network building module, a benchmark neural network initialization module, an interaction module building module, an interaction module initialization module, a feature splicing and fusion module, a training module, and a testing module;

[0117] In this embodiment, the bounding box extraction module is used to perform face detection and human body detection on the pictures in the data set to obtain a human face bounding box and a human body bounding box;

[0118] In this embodiment, the picture preprocessing module is used to preprocess the pictures using the face bounding box and the human body bounding box, dividing each real picture into a face picture, a body picture with a mask, and a scene picture with a mask;
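The three-way split into face, masked body, and masked scene pictures can be sketched as below. The zero-fill masking scheme and the function name are illustrative assumptions; the patent does not specify how the spatial masks are applied.

```python
import numpy as np

def split_contexts(image, face_box, body_box):
    """Split an image into a face crop, a body picture with the
    surroundings masked out, and a scene picture with the body masked out.
    Boxes are integer [x1, y1, x2, y2] tuples."""
    fx1, fy1, fx2, fy2 = face_box
    bx1, by1, bx2, by2 = body_box
    face = image[fy1:fy2, fx1:fx2].copy()
    body = np.zeros_like(image)
    body[by1:by2, bx1:bx2] = image[by1:by2, bx1:bx2]  # keep only the body region
    scene = image.copy()
    scene[by1:by2, bx1:bx2] = 0                       # mask the body out of the scene
    return face, body, scene

# Tiny demo on an 8x8 "image": body and scene partition the pixel mass
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
face, body, scene = split_contexts(img, face_box=(2, 2, 4, 4), body_box=(1, 1, 6, 6))
print(face.shape)  # (2, 2)
```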

[0119] In this e...

Embodiment 3

[0128] This embodiment provides a storage medium, which may be a ROM, a RAM, a magnetic disk, an optical disk, or the like. The storage medium stores one or more programs that implement the emotion recognition method based on context interaction relationships.



Abstract

The invention discloses an emotion recognition method based on a context interaction relationship. The method comprises the following steps: performing face detection and human body detection on an expression data set to obtain bounding boxes of the face and the body; preprocessing the pictures using the face and body bounding boxes, and generating spatial masks from the bounding boxes to obtain three types of pictures: face, body, and scene; inputting the preprocessed images into three pre-trained branch networks to extract features, where a context interaction module is inserted at the second and fourth layers of the network and the features of the other branches are weighted and fused into each branch; and performing expression classification by combining the face, body, and scene emotion features to form an emotion recognition model based on the context interaction relationship. The method improves the feature expression ability of the context, suppresses the noise present in the context, solves the problems of emotional uncertainty and noise that arise when context features are extracted independently, and achieves higher emotion recognition accuracy.
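The cross-branch weighted fusion described in the abstract can be sketched as follows. A fixed scalar weight stands in for the interaction module's learned fusion weights, and the function and branch names are illustrative assumptions, not the patent's actual network.

```python
import numpy as np

def context_interaction(feats, weight=0.5):
    """Augment each branch's feature (face, body, scene) with a weighted
    sum of the other branches' features, mimicking the idea of fusing
    contextual cues into every branch."""
    out = {}
    for name, f in feats.items():
        others = [g for k, g in feats.items() if k != name]
        out[name] = f + weight * sum(others)
    return out

feats = {
    "face":  np.array([1.0, 0.0]),
    "body":  np.array([0.0, 1.0]),
    "scene": np.array([1.0, 1.0]),
}
fused = context_interaction(feats)
print(fused["face"])  # face feature plus half the sum of body and scene
```

In the patent the fusion weights are presumably learned jointly with the branch networks; the fixed scalar here only shows the data flow.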

Description

Technical field

[0001] The invention relates to the technical field of image processing and recognition, in particular to an emotion recognition method based on contextual interaction relationships.

Background technique

[0002] Automatic emotion recognition technology enables machines to perceive human emotional states, and has many applications in environments that require monitoring of humans, including education, medical care, and entertainment. Current mainstream emotion labeling models include category labels and dimension labels. Category labels mainly refer to the basic emotions: anger, happiness, surprise, disgust, sadness, and fear. Dimension labels mainly refer to the emotional coordinate space composed of the arousal and valence (positive-negative) dimensions.

[0003] Human expression is one of the characteristics that best reflects emotions. Ekman believes that human expressions have commonalities, so researchers pay special attention to extracti...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V40/174; G06V40/172; G06V40/161; G06V40/168; G06V40/10; G06N3/045; G06F18/253
Inventor: 李新鹏, 丁长兴
Owner SOUTH CHINA UNIV OF TECH