A robot-oriented multi-modal fusion emotion computing method and system

An emotion computing and multi-modal technology applied in the field of information technology, which can solve problems such as the lack of fusion of multi-modal information, the absence of multi-modal emotion computing for robots, and the lack of a fusion method.

Active Publication Date: 2021-12-14
XIAMEN UNIV

AI Technical Summary

Problems solved by technology

[0002] At present, there are few related studies on multi-modal fusion. Existing methods do not achieve the fusion of multi-modal information, and most rely mainly on language information.
Most of the research has the following defects: 1. It is limited to the collection and acquisition of information from a single modality. 2. It recognizes only the language part and therefore cannot recognize the user's emotion well. 3. In the non-language part, only the facial expressions of the interaction object are used for emotion computing; signals such as physiological information, facial expressions, body language, and visual information are not accurately integrated. 4. There is no method for fusing language and non-language multi-modal information, nor a corresponding emotion computing method. 5. Robots basically do not use multi-modal affective computing.

Method used



Examples


Embodiment Construction

[0049] Referring to Figure 1 and Figure 2, the present invention provides a robot-oriented multi-modal fusion emotion computing method, comprising the following steps:

[0050] Step 1. Acquire multi-modal information by capturing, in real time, the language information and non-language information of the person interacting with the robot, including facial expressions, head and eye attention, gestures, and text;

[0051] Step 2. Construct separate processing channels for the different kinds of information to perform feature classification and recognition, including feature classification and recognition of language information and non-language information;
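
As an illustration of step 2, the sketch below shows one way to organize separate processing channels, one per modality, each turning a raw observation into a feature vector for classification. The channel names, data structures, and placeholder extractors are assumptions made for illustration; they are not specified by the patent.

```python
# Minimal sketch of per-modality processing channels (illustrative only;
# channel names and feature extractors are assumptions, not taken from
# the patent text).
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ModalityObservation:
    modality: str      # e.g. "facial_expression", "gesture", "text"
    timestamp: float   # capture time in seconds
    raw: object        # raw sensor payload for that modality


# Each channel turns a raw observation into a feature vector.
FeatureExtractor = Callable[[object], List[float]]


def build_channels() -> Dict[str, FeatureExtractor]:
    # Placeholder extractors; a real system would plug in trained models here.
    return {
        "facial_expression": lambda raw: [0.0, 0.0, 0.0],
        "head_eye_attention": lambda raw: [0.0],
        "gesture": lambda raw: [0.0, 0.0],
        "text": lambda raw: [0.0, 0.0, 0.0, 0.0],
    }


def classify(obs: ModalityObservation,
             channels: Dict[str, FeatureExtractor]) -> List[float]:
    # Route the observation to the channel for its modality.
    return channels[obs.modality](obs.raw)
```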

[0052] Step 3. Process the multi-modal information and map it into the PAD three-dimensional space through the PAD model (P: pleasure, A: arousal, D: dominance) and the OCC model;
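
Step 3 can be pictured as mapping recognized emotional cues into PAD coordinates. A minimal sketch follows, assuming a lookup table from OCC-style discrete emotion categories to PAD points; the category names and coordinate values are illustrative placeholders, not values given in the patent.

```python
# Hedged sketch: mapping OCC-style discrete emotion categories into the
# PAD (pleasure, arousal, dominance) space. The coordinates below are
# illustrative placeholders, not values specified by the patent.
from typing import Dict, Tuple

PAD = Tuple[float, float, float]  # (pleasure, arousal, dominance), each in [-1, 1]

OCC_TO_PAD: Dict[str, PAD] = {
    "joy":      (0.8, 0.5, 0.4),
    "distress": (-0.6, 0.3, -0.3),
    "anger":    (-0.5, 0.6, 0.3),
    "fear":     (-0.6, 0.6, -0.5),
    "relief":   (0.4, -0.2, 0.1),
}


def to_pad(occ_label: str, intensity: float = 1.0) -> PAD:
    # Scale the base PAD point of the recognized OCC category by its intensity.
    p, a, d = OCC_TO_PAD[occ_label]
    return (p * intensity, a * intensity, d * intensity)
```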

[0053] Step 4. Perform temporal alignment of the information from each modality when fusing at the decision-making layer, and compute the emotional dimension space based on the time sequence.
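
A hedged sketch of step 4 is given below: per-modality PAD estimates are bucketed onto a common time grid and then combined at the decision layer with a linear, regression-style weighting. The window length, weights, and data layout are assumptions made for illustration, not details taken from the patent; in practice the per-modality weights would be fitted (for example by linear regression against annotated data) rather than set by hand.

```python
# Hedged sketch of decision-level fusion with temporal alignment.
# Window size, stream names, and weights are illustrative assumptions.
from collections import defaultdict
from typing import Dict, List, Tuple

PAD = Tuple[float, float, float]
TimedPAD = Tuple[float, PAD]  # (timestamp in seconds, PAD estimate)


def align(streams: Dict[str, List[TimedPAD]],
          window: float = 0.5) -> Dict[int, Dict[str, PAD]]:
    # Bucket each modality's estimates into fixed-length time windows.
    buckets: Dict[int, Dict[str, PAD]] = defaultdict(dict)
    for modality, samples in streams.items():
        for t, pad in samples:
            buckets[int(t // window)][modality] = pad
    return buckets


def fuse(aligned: Dict[int, Dict[str, PAD]],
         weights: Dict[str, float]) -> List[Tuple[int, PAD]]:
    # Weighted linear combination of the modalities present in each window.
    fused = []
    for idx in sorted(aligned):
        mods = aligned[idx]
        total = sum(weights[m] for m in mods) or 1.0
        pad = tuple(sum(weights[m] * mods[m][k] for m in mods) / total
                    for k in range(3))
        fused.append((idx, pad))
    return fused


if __name__ == "__main__":
    streams = {"text": [(0.2, (0.5, 0.1, 0.0))],
               "face": [(0.3, (0.7, 0.4, 0.2))]}
    print(fuse(align(streams), {"text": 0.4, "face": 0.6}))
```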


Abstract

The present invention provides a robot-oriented multi-modal fusion emotion computing method, comprising: acquiring multi-modal information by capturing, in real time, the language information and non-language information of the person interacting with the robot; constructing processing channels for the different kinds of information to perform feature classification and recognition, including feature classification and recognition of language information and non-language information; processing the multi-modal information and mapping it into the PAD three-dimensional space through the PAD model and the OCC model; and performing temporal alignment of the information from each modality when fusing at the decision-making layer, so as to compute the emotional dimension space based on the time sequence. The present invention also provides a robot-oriented multi-modal fusion emotion computing system. By acquiring the user's multi-modal information, including non-language and language information, fusing the non-language information and the language information separately with the PAD model, the OCC model, and a linear regression model, and then using a linear regression model for the final fusion, better and more accurate emotion computing for robots is achieved.

Description

Technical field

[0001] The invention relates to the field of information technology, and in particular to a robot-oriented multi-modal fusion emotion computing method and system.

Background technique

[0002] At present, there are few related studies on multi-modal fusion. Existing methods do not achieve the fusion of multi-modal information, and most rely mainly on language information. Most of the research has the following defects: 1. It is limited to the collection and acquisition of information from a single modality. 2. It recognizes only the language part and therefore cannot recognize the user's emotion well. 3. In the non-language part, only the facial expressions of the interaction object are used for emotion computing; signals such as physiological information, facial expressions, body language, and visual information are not accurately integrated. 4. There is no method for fusing language and non-language multi-modal information, nor a corresponding emotion computing method. 5. Robots basically do not use multi-modal affective computing.

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00; G06K9/62
CPC: G06V40/174; G06V40/113; G06V40/103; G06F18/25
Inventor: 佘莹莹陈锦舒杨
Owner: XIAMEN UNIV