
Video-based emotion recognition method and device

A video-based emotion recognition technology, applied in the field of emotion recognition, that addresses problems such as poor video quality, weak classification and recognition performance, and insufficient high-dimensional emotion recognition, and achieves the effect of improved accuracy.

Inactive Publication Date: 2020-08-07
SHANGHAI JILIAN NETWORK TECH CO LTD

AI Technical Summary

Problems solved by technology

User-generated videos are not professionally edited and offer more variety than cinematic videos, but their quality is often lower.
As a result, the visual information learned by a neural network alone is insufficient for high-dimensional emotion recognition, and classification performance is often poor.



Examples


Embodiment 1

[0029] Figure 1 is a flowchart of a video-based emotion recognition method provided in Embodiment 1 of the present invention. This embodiment is applicable to video emotion classification. The method can be executed by a video-based emotion recognition device, which can be implemented in software and/or hardware and configured in a terminal device. The method specifically includes the following steps:

[0030] S110. Determine the initial feature data of the video to be recognized;

[0031] Here, the initial feature data is used for emotion classification of the video to be recognized, thereby realizing emotion recognition of that video. In an embodiment, the video to be recognized is optionally preprocessed to obtain the initial feature data. Optionally, the initial feature data includes at least one of RGB image data, optical flow image data, audio data, and text data.
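The multimodal bundling described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the container type `InitialFeatureData` and the function `preprocess_video` are hypothetical names, and the concrete representations (frame lists, spectrograms, subtitle text) are assumptions for the sketch.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InitialFeatureData:
    # Each modality is optional; per [0031], at least one must be present.
    rgb_frames: Optional[List[list]] = None    # sampled RGB frames (assumed format)
    optical_flow: Optional[List[list]] = None  # frame-to-frame flow fields
    audio: Optional[list] = None               # e.g. a log-mel spectrogram
    text: Optional[str] = None                 # e.g. subtitles or ASR output

    def modalities(self) -> List[str]:
        """List which modalities were actually supplied."""
        return [name for name in ("rgb_frames", "optical_flow", "audio", "text")
                if getattr(self, name) is not None]

def preprocess_video(rgb_frames=None, optical_flow=None, audio=None, text=None):
    """Hypothetical preprocessing step: bundle whatever modalities are available."""
    data = InitialFeatureData(rgb_frames, optical_flow, audio, text)
    if not data.modalities():
        raise ValueError("at least one modality is required")
    return data
```

For example, a video with only frames and subtitles yields `["rgb_frames", "text"]` from `modalities()`, and downstream models can branch on which inputs exist.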

[0032] ...

Embodiment 2

[0051] Figure 3 is a flowchart of a video-based emotion recognition method provided by Embodiment 2 of the present invention. The technical solution of this embodiment further refines the above embodiment. Optionally, the object relationship recognition model is also used to: construct an attention map corresponding to each object relationship feature; calculate an activation degree corresponding to each attention map according to an energy function; and, based on the activation degrees, retain the object relationship features whose activation degrees exceed a preset threshold.

[0052] S210. Determine the initial feature data of the video to be recognized;

[0053] Here, the initial feature data includes at least one of RGB image data, optical flow image data, audio data, and text data;

[0054] S220. Input the RGB image data into the object relationship recognition mode...

Embodiment 3

[0077] Figure 4 is a flowchart of a video-based emotion recognition method provided by Embodiment 3 of the present invention. The technical solution of this embodiment further refines the above embodiments. Optionally, the training method of the emotion classification model includes: determining object relationship features to be trained and video features to be trained based on the object relationship recognition model and the feature extraction model; mapping the object relationship features to be trained and the video features to be trained to a relationship-feature emotion space and a video-feature emotion space, respectively; determining a feature loss function based on the mapped object relationship features and video features; and adjusting the parameters of the emotion classification model based on the feature loss function and a classification loss function, to obtain the trained emotion class...
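The training objective described above can be sketched as a weighted sum of two terms. The patent does not give the concrete losses, so these are assumptions for illustration: the emotion-space mappings are plain linear maps, the feature loss is the squared Euclidean distance between the two mapped features (encouraging the two emotion spaces to agree), and the classification loss is cross-entropy. The weight `alpha` and all function names are hypothetical.

```python
import math

def linear_map(x, W):
    # Map a feature vector into an emotion space via a (hypothetical) learned matrix W.
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

def feature_loss(rel_emb, vid_emb):
    # Illustrative feature loss: squared Euclidean distance between the mapped
    # object relationship feature and the mapped video feature.
    return sum((a - b) ** 2 for a, b in zip(rel_emb, vid_emb))

def cross_entropy(probs, label):
    # Illustrative classification loss on the predicted emotion distribution.
    return -math.log(probs[label] + 1e-12)

def total_loss(rel_feat, vid_feat, W_rel, W_vid, probs, label, alpha=0.5):
    """Combined objective: alpha * feature loss + classification loss."""
    rel_emb = linear_map(rel_feat, W_rel)
    vid_emb = linear_map(vid_feat, W_vid)
    return alpha * feature_loss(rel_emb, vid_emb) + cross_entropy(probs, label)
```

When the two mapped features coincide and the classifier is confident in the correct label, both terms vanish, which is the state the parameter adjustment in [0077] drives toward.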



Abstract

The embodiment of the invention discloses a video-based emotion recognition method and device. The method comprises: determining initial feature data of a to-be-recognized video; and inputting the initial feature data into a pre-trained emotion recognition model to obtain an emotion recognition result corresponding to the to-be-recognized video. The emotion recognition model comprises an object relationship recognition model, a feature extraction model, and an emotion classification model: the object relationship recognition model is used for recognizing an object relationship in the to-be-recognized video, the feature extraction model is used for extracting at least one video feature from the initial feature data, and the emotion classification model is used for determining the emotion recognition result of the to-be-recognized video based on the object relationship and the video feature. By adding the object relationship recognition model to the emotion recognition model, the embodiment of the invention alleviates the problem of poor video emotion recognition performance and provides a more comprehensive recognition framework for video emotion recognition.
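The three-stage pipeline in the abstract can be summarized in a few lines. This is a structural sketch only: the three models are passed in as opaque callables, and the stub implementations in the usage example below are invented placeholders, not the patent's models.

```python
def recognize_emotion(initial_features, object_relation_model,
                      feature_extractor, emotion_classifier):
    """Sketch of the pipeline from the abstract: recognize object relationships,
    extract video features, then classify emotion from both."""
    relations = object_relation_model(initial_features)
    video_features = feature_extractor(initial_features)
    return emotion_classifier(relations, video_features)
```

For example, with stub models, `recognize_emotion(features, lambda f: ["person-holds-dog"], lambda f: [0.2, 0.8], lambda r, v: "happy" if v[1] > v[0] else "sad")` routes the two intermediate outputs into the final classification.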

Description

Technical Field

[0001] Embodiments of the present invention relate to the technical field of emotion recognition, and in particular to a video-based emotion recognition method and device.

Background

[0002] With the rapid development of mobile devices and the Internet, video content understanding has become a growing need. Many researchers have studied tasks such as video action recognition and detection. However, the expression of emotion in a video is also an important part of video understanding: intuitively, it means dividing videos into different emotional categories according to their content, such as happy, surprised, or sad. Video emotion recognition has many practical applications. For example, an advertisement recommendation system can avoid recommending inappropriate advertisements by matching the emotions of advertisements and videos.

[0003] Early research on emotion recognition focused on text emotion recognition and image emotio...

Claims


Application Information

IPC(8): G06K 9/62; G06K 9/00
CPC: G06V 20/40; G06V 20/46; G06F 18/24; G06F 18/214
Inventor 徐宝函
Owner SHANGHAI JILIAN NETWORK TECH CO LTD