Video-based emotion recognition method and device
A video-based emotion recognition technology, applied in the field of emotion recognition, which addresses problems such as poor video quality, poor classification and recognition performance, and inadequate recognition of high-dimensional emotions, and achieves the effect of improved accuracy.
Examples
Embodiment 1
[0029] Figure 1 is a flow chart of a video-based emotion recognition method provided in Embodiment 1 of the present invention. This embodiment is applicable to video emotion classification, and the method can be executed by a video-based emotion recognition apparatus, which can be realized in software and/or hardware and configured in a terminal device. The method specifically includes the following steps:
[0030] S110. Determine the initial feature data of the video to be recognized;
[0031] Here, the initial feature data can be used for emotion classification of the video to be recognized, so as to realize emotion recognition of that video. In an embodiment, optionally, the video to be recognized is preprocessed to obtain the initial feature data. In an embodiment, optionally, the initial feature data includes at least one of RGB image data, optical flow image data, audio data, and text data.
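The preprocessing step above can be sketched as a function that splits a raw video into the four modalities named in the embodiment. This is a minimal illustration, not the patent's implementation: the frame-difference stand-in for optical flow and all function and key names are assumptions.

```python
import numpy as np

def extract_initial_features(frames, audio, transcript):
    """Hypothetical preprocessing sketch: produce the four kinds of
    initial feature data (RGB, optical flow, audio, text).
    A real system would use a dedicated optical-flow algorithm; here a
    frame-to-frame intensity difference stands in for it."""
    rgb = np.stack(frames).astype(np.float32)        # (T, H, W, 3)
    # Crude optical-flow stand-in: temporal difference of gray frames.
    flow = np.diff(rgb.mean(axis=-1), axis=0)        # (T-1, H, W)
    return {
        "rgb": rgb,
        "optical_flow": flow,
        "audio": np.asarray(audio, dtype=np.float32),
        "text": transcript,
    }
```

Downstream models would each consume one entry of this dictionary, which is why the patent allows "at least one" of the four modalities to be present.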
[0032] ...
Embodiment 2
[0051] Figure 3 is a flow chart of a video-based emotion recognition method provided by Embodiment 2 of the present invention. The technical solution of this embodiment further refines the above embodiments. Optionally, the object relationship recognition model is also used to: construct, for each object relationship feature, an attention map corresponding to that feature, and calculate an activation degree for the attention map according to an energy function; and, based on the activation degrees, retain the object relationship features whose activation degrees exceed a preset threshold.
[0052] S210. Determine the initial feature data of the video to be recognized;
[0053] Here, the initial feature data includes at least one of RGB image data, optical flow image data, audio data, and text data;
[0054] S220. Input the RGB image data into the object relationship recognition mode...
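The attention-and-threshold step described for the object relationship recognition model can be sketched as follows. The softmax attention map and the mean-magnitude energy function are assumptions chosen for illustration; the patent does not fix their exact form here.

```python
import numpy as np

def select_salient_relations(relation_features, threshold=0.5):
    """Hypothetical sketch of the filtering in Embodiment 2: build an
    attention map per object-relationship feature, score it with an
    energy function, and keep only features whose activation degree
    exceeds the preset threshold."""
    kept = []
    for feat in relation_features:            # each feat: (D,) vector
        # Attention map: softmax over the feature's own dimensions.
        e = np.exp(feat - feat.max())
        attn = e / e.sum()
        # Energy function: expected magnitude under the attention map.
        activation = float((attn * np.abs(feat)).sum())
        if activation > threshold:
            kept.append(feat)
    return kept
```

Thresholding on an energy score lets the model discard weak or noisy object relationships before they reach the emotion classifier, which matches the stated goal of improving accuracy on poor-quality video.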
Embodiment 3
[0077] Figure 4 is a flow chart of a video-based emotion recognition method provided by Embodiment 3 of the present invention. The technical solution of this embodiment further refines the above embodiments. Optionally, the training method of the emotion classification model includes: determining, based on the object relationship recognition model and the feature extraction model, the object relationship features to be trained and the video features to be trained; mapping the object relationship features to be trained and the video features to be trained into the relationship-feature emotion space and the video-feature emotion space, respectively; determining a feature loss function based on the mapped object relationship features and video features to be trained; and adjusting the parameters of the emotion classification model based on the feature loss function and a classification loss function, to obtain the trained emotion classification model.
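The joint objective described in Embodiment 3 can be sketched as two projections plus two loss terms. Everything concrete here is an assumption for illustration: the linear maps, the mean-squared feature loss that pulls the two emotion-space projections together, and the squared-error classification term.

```python
import numpy as np

def joint_loss(rel_feat, vid_feat, label, params):
    """Hypothetical sketch of the Embodiment 3 objective: project the
    object-relationship feature and the video feature into their
    respective emotion spaces, then combine a feature loss (alignment
    of the two projections) with a classification loss on the label."""
    W_rel, W_vid, W_cls = params
    z_rel = rel_feat @ W_rel      # relationship-feature emotion space
    z_vid = vid_feat @ W_vid      # video-feature emotion space
    feature_loss = np.mean((z_rel - z_vid) ** 2)
    logits = (z_rel + z_vid) @ W_cls
    classification_loss = np.mean((logits - label) ** 2)
    return feature_loss + classification_loss
```

Minimizing the summed loss adjusts the emotion classification model so that the two modalities agree in emotion space while still fitting the emotion labels, which is the two-term scheme the paragraph describes.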