
Modified real-time emotion recognition method and system based on eye movement data

An emotion recognition and eye movement data technology, applied in the field of emotion recognition, which addresses problems such as the inability of existing methods to accurately recognize emotions

Active Publication Date: 2020-06-26
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

Human emotion is a psychological and physiological state produced by a variety of feelings, thoughts and behaviors, and is subject to an emotional arousal effect, so existing approaches cannot accurately identify emotions.



Examples


Embodiment

[0045] As shown in figure 1, a modified real-time emotion recognition method based on eye movement data includes the following steps:

[0046] S1: an eye movement data collection module is integrated into a head-mounted VR device. The user wears the VR device and freely explores the content of a 360° panoramic video; the user's eye movement data during this process are collected in real time, and the corresponding video frame sequence is obtained at the same time;
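A minimal sketch of the real-time collection in S1 is given below. The EyeTracker/VideoStream interfaces, the method names and the 90 Hz sample rate are illustrative assumptions, not part of the patent; only the idea of recording eye movement samples together with the aligned video frame sequence follows the text.

```python
import time


class EyeMovementCollector:
    """Records eye movement samples and the aligned panoramic video frames in real time."""

    def __init__(self, eye_tracker, video_stream, sample_rate_hz=90):
        self.eye_tracker = eye_tracker        # eye-tracking module integrated in the VR headset (assumed interface)
        self.video_stream = video_stream      # 360-degree panoramic video the user is exploring (assumed interface)
        self.period = 1.0 / sample_rate_hz
        self.eye_samples = []                 # eye movement data, in sampling order
        self.frames = []                      # video frame sequence aligned with the samples

    def run(self, duration_s):
        t_end = time.time() + duration_s
        while time.time() < t_end:
            self.eye_samples.append(self.eye_tracker.read())        # one eye movement sample
            self.frames.append(self.video_stream.current_frame())   # frame shown at this instant
            time.sleep(self.period)
        return self.eye_samples, self.frames
```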

[0047] The eye movement data collection module in this embodiment is specifically an eye-tracking film, and the eye movement data include eye images, pupil radius, pupil position in the image, the distance between the upper and lower eyelids, fixation points (smooth and non-smooth), etc.
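One possible record layout for a single eye movement sample, based only on the fields listed above; the names, types and units are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Tuple

import numpy as np


@dataclass
class EyeSample:
    eye_image: np.ndarray                 # eye pattern (image of the eye region)
    pupil_radius: float                   # pupil radius, e.g. in pixels
    pupil_position: Tuple[float, float]   # pupil position within the eye image
    eyelid_distance: float                # distance between the upper and lower eyelids
    fixation_point: Tuple[float, float]   # fixation point in the panoramic frame
    is_smooth: bool                       # smooth vs. non-smooth fixation
    timestamp: float                      # sampling time in seconds
```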

[0048] The beneficial effects of the above solution are as follows: the immersive VR experience makes users feel more present and less susceptible to interference from the external environment, and with the eye-tracking module integrated in the VR headset, the collected data are real-ti...



Abstract

The invention discloses a modified real-time emotion recognition method and system based on eye movement data. The method comprises the steps of: collecting eye movement data in a VR environment in real time and predicting the fixation point from the eye movement data; obtaining a preliminary emotional state from the panorama of the current frame and the predicted fixation-point region map of the next frame; and correcting the preliminary emotional state with historical eye movement data to obtain the final emotional state at the current moment. The system can predict the user's emotion in real time and improve the user's experience in the VR environment.
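The data flow described in the abstract could be sketched as follows. The three model functions passed in (predict_fixation, preliminary_emotion, correct_with_history) are hypothetical stand-ins for the trained networks, and the square-crop helper is an assumption; only the three-stage flow (fixation prediction, preliminary emotion from the current panorama and the predicted fixation region, correction with historical eye movement data) follows the abstract.

```python
def crop_region(frame, center, size=64):
    """Crop a square patch of the panorama around a (row, col) fixation point."""
    r, c = int(center[0]), int(center[1])
    h, w = frame.shape[:2]
    r0 = min(max(0, r - size // 2), max(0, h - size))
    c0 = min(max(0, c - size // 2), max(0, w - size))
    return frame[r0:r0 + size, c0:c0 + size]


def recognize_emotion(current_frame, eye_history,
                      predict_fixation, preliminary_emotion, correct_with_history):
    # 1. Predict the fixation point of the next frame from the eye movement data so far.
    next_fixation = predict_fixation(eye_history)

    # 2. Combine the panorama of the current frame with the region map around the
    #    predicted fixation point to obtain a preliminary emotional state.
    region_map = crop_region(current_frame, next_fixation)
    preliminary = preliminary_emotion(current_frame, region_map)

    # 3. Correct the preliminary state with historical eye movement data to obtain
    #    the final emotional state at the current moment.
    return correct_with_history(preliminary, eye_history)
```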

Description

Technical field

[0001] The invention relates to the field of emotion recognition, and in particular to a modified real-time emotion recognition method and system based on eye movement data.

Background technique

[0002] With the popularization of virtual reality applications, human-computer interaction has received more and more attention. Human-computer interaction in the VR environment, such as eye movement interaction, voice interaction, gesture interaction, and posture interaction, is gradually maturing. At present, in terms of emotional interaction, most domestic and foreign research is based on physiological parameters such as facial expressions, heart rate, and EEG; research on emotional interaction based on eye movements in VR is rare, and related research methods have obvious shortcomings. Yet according to psychological research, the eyes best reflect a person's mental state and emotions.

[0003] In a virtual reality environ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC: IPC(8): G06K9/00; G06F3/01; G06K9/62; G06N3/04; G06N3/08
CPC: G06F3/015; G06N3/08; G06N3/044; G06N3/045; G06F2218/08; G06F2218/12; G06F18/2411; Y02D10/00
Inventor: 青春美, 金珊, 徐向民, 邢晓芬
Owner: SOUTH CHINA UNIV OF TECH