Emotion evaluation method and device based on virtual reality and eye movement information

A virtual reality and eye movement information technology, applied in the field of medical devices, which addresses the problem of low emotion recognition accuracy and achieves the effect of improved accuracy

Pending Publication Date: 2021-12-24
SUZHOU ZHONGKE ADVANCED TECH RES INST CO LTD

AI Technical Summary

Problems solved by technology

[0003] The purpose of the present invention is to provide an emotion assessment method, device, and virtual reality equipment based on virtual reality and eye movement information, aiming to solve the technical problem of low accuracy of emotion recognition in the prior art.



Examples


Embodiment 1

[0038] Figure 1 is an implementation flow chart of the emotion assessment method based on virtual reality and eye movement information according to the first embodiment. The method of Embodiment 1 is applicable to a virtual reality device equipped with a processor, which collects the eye movement characteristic data of the subject so that emotion evaluation can be performed accurately. For ease of description, only the parts related to the embodiments of the present invention are shown; the details are as follows:

[0039] Step S110: in response to an emotion assessment operation on the virtual reality device, display a dynamic picture on the display interface of the virtual reality device.

[0040] Step S120: collect eye movement characteristic data of the subject as the subject follows the dynamic picture.

[0041] Step S130: using a deep learning algorithm, input the eye movement feature data and the dynamic picture into the pre-selected trained neural network model library for a matching operation, and generate a corresponding emotion evaluation report.
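
The three steps above can be pictured as a small pipeline. The sketch below is a minimal illustration in Python; the callables standing in for the VR display, the eye tracker, and the pre-trained model library are assumptions for illustration and are not named in the patent text.

```python
# Illustrative sketch of steps S110-S130; names and data layout are assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class EyeMovementSample:
    """One gaze sample recorded while the subject follows the dynamic picture."""
    timestamp_ms: float
    gaze_x: float            # normalized horizontal gaze position, 0..1
    gaze_y: float            # normalized vertical gaze position, 0..1
    pupil_diameter_mm: float


def assess_emotion(
    show_dynamic_picture: Callable[[], str],                              # S110: returns a stimulus id
    collect_eye_movements: Callable[[str], List[EyeMovementSample]],      # S120: records gaze data
    model_library: Dict[str, Callable[[List[EyeMovementSample]], Dict[str, float]]],  # S130
) -> Dict[str, float]:
    """Run the three-step flow and return per-emotion scores for the report."""
    stimulus_id = show_dynamic_picture()          # S110: display on the VR display interface
    samples = collect_eye_movements(stimulus_id)  # S120: collect the subject's eye movement data
    model = model_library[stimulus_id]            # S130: pick the model matched to this stimulus
    return model(samples)                         # emotion scores used to build the evaluation report
```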

Embodiment 2

[0059] As shown in Figure 4, Embodiment 2 of the present invention provides an emotion assessment device based on virtual reality and eye movement information, which can perform all or part of the steps of any of the above-mentioned emotion assessment methods based on virtual reality and eye movement information. The device includes:

[0060] The dynamic picture display device 1, used to display the dynamic picture on the display interface of the virtual reality equipment in response to the emotion assessment operation on the virtual reality equipment;

[0061] The eye movement characteristic data acquisition device 2, used to collect the eye movement characteristic data 3 of the subject following the dynamic picture;

[0062] The matching operation device 4, used to input, via a deep learning algorithm, the eye movement feature data and the dynamic picture into the pre-selected trained neural network model library for a matching operation and to generate a corresponding emotion evaluation report.
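
The unit decomposition of Embodiment 2 might be mirrored in code roughly as follows; the class names and the `show` / `record` / `predict` interfaces are assumptions for illustration, not an API defined by the patent.

```python
# Minimal sketch mirroring units 1, 2 and 4 of Embodiment 2; all interfaces are assumed.
from typing import Any, Dict, List


class DynamicPictureDisplayDevice:            # unit 1
    def display(self, vr_interface: Any, picture_id: str) -> None:
        """Render the dynamic picture on the VR display interface."""
        vr_interface.show(picture_id)         # assumed VR SDK call


class EyeMovementAcquisitionDevice:           # unit 2
    def collect(self, eye_tracker: Any, duration_s: float) -> List[dict]:
        """Collect the eye movement characteristic data (3) while the picture plays."""
        return eye_tracker.record(duration_s)  # assumed eye-tracker API


class MatchingOperationDevice:                # unit 4
    def __init__(self, model_library: Dict[str, Any]):
        self.model_library = model_library    # pre-selected trained neural network models

    def match(self, picture_id: str, eye_data: List[dict]) -> Dict[str, float]:
        """Match the eye data and picture against the model library and score emotions."""
        model = self.model_library[picture_id]
        return model.predict(eye_data)        # assumed model interface; feeds the report
```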

Embodiment 3

[0064] Embodiment 3 of the present invention provides a virtual reality device, such as virtual reality glasses, which can perform all or part of the steps of any one of the aforementioned emotion assessment methods based on virtual reality and eye movement information. The device includes:

[0065] processor; and

[0066] memory communicatively coupled to the processor; wherein,

[0067] The memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the method described in any of the above exemplary embodiments, which will not be described in detail here.

[0068] In this embodiment, a storage medium is also provided, which is a computer-readable storage medium, for example, a transitory or non-transitory computer-readable storage medium including instructions. The storage medium, for example, includes a memory of instruc...
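
One way the pre-selected trained neural network model library could be kept on such a computer-readable storage medium and reloaded by the device's processor is sketched below, assuming PyTorch; the toy classifier, file name, and library layout are illustrative only and are not specified by the patent.

```python
# Sketch of persisting and reloading a trained model library from storage (assumptions only).
import torch
import torch.nn as nn


class EyeMovementClassifier(nn.Module):
    """Toy stand-in for one model in the pre-trained library."""
    def __init__(self, n_features: int = 4, n_emotions: int = 3):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_features, 16), nn.ReLU(), nn.Linear(16, n_emotions))

    def forward(self, x):
        return self.net(x)


# Save the library (one state_dict per dynamic picture) to the storage medium.
library = {"picture_01": EyeMovementClassifier().state_dict()}
torch.save(library, "model_library.pt")

# On the VR device, reload the library and restore the model matched to the stimulus.
restored = torch.load("model_library.pt")
model = EyeMovementClassifier()
model.load_state_dict(restored["picture_01"])
model.eval()
```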



Abstract

The invention is applicable to the technical field of medical instruments and provides an emotion evaluation method and device based on virtual reality and eye movement information, as well as virtual reality equipment. The method comprises the steps of: in response to an emotion evaluation operation for the virtual reality equipment, displaying a dynamic picture on a display interface of the virtual reality equipment; collecting eye movement feature data of a subject following the dynamic picture; and, using a deep learning algorithm, inputting the eye movement feature data and the dynamic picture into a pre-selected trained neural network model library for a matching operation and generating a corresponding emotion evaluation report. By combining virtual reality technology with eye tracking, the individual's eye movement feature data are collected in advance and the eye-tracking trajectory features are analyzed to explore the differences in eye movement trajectories between depressed and non-depressed individuals. Because a patient's degree of depression is highly correlated with the patient's eye movement features, evaluating emotion by tracking the eye movement trajectory effectively improves the accuracy of emotion recognition.
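
The abstract does not enumerate the trajectory features that are analyzed, but a sketch of the kind of statistics one might derive from a gaze trajectory before the matching step could look like the following; the specific features and thresholds are assumptions, not taken from the patent text.

```python
# Illustrative eye-movement trajectory statistics (feature names and thresholds assumed).
import numpy as np


def trajectory_features(t_ms: np.ndarray, x: np.ndarray, y: np.ndarray,
                        velocity_threshold: float = 0.5) -> dict:
    """Compute simple trajectory statistics from normalized gaze coordinates."""
    dt = np.diff(t_ms) / 1000.0                        # seconds between samples
    dx, dy = np.diff(x), np.diff(y)
    step = np.hypot(dx, dy)                            # gaze displacement per sample
    speed = step / np.maximum(dt, 1e-6)                # gaze speed (screen units / s)
    saccade = speed > velocity_threshold               # crude velocity-based saccade flag
    return {
        "path_length": float(step.sum()),
        "mean_speed": float(speed.mean()),
        "saccade_ratio": float(saccade.mean()),        # fraction of samples flagged as saccades
        "fixation_time_s": float(dt[~saccade].sum()),  # time spent below the speed threshold
    }


# Example: 100 gaze samples at roughly 60 Hz
t = np.arange(100) * 16.7
x = np.clip(0.5 + 0.05 * np.random.randn(100), 0, 1)
y = np.clip(0.5 + 0.05 * np.random.randn(100), 0, 1)
print(trajectory_features(t, x, y))
```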

Description

Technical field

[0001] The invention belongs to the technical field of medical devices, and in particular relates to an emotion assessment method and device based on virtual reality and eye movement information, and to virtual reality equipment.

Background technique

[0002] At present, the identification and evaluation of emotions are mostly carried out by doctors using mental health scales, according to clinical diagnostic criteria and communication with patients. However, these evaluation methods are easily affected by subjective factors and have the limitations of strong subjectivity and large errors in the results. Therefore, a more objective and convenient method is urgently needed.

Contents of the invention

[0003] The purpose of the present invention is to provide an emotion assessment method, device and virtual reality equipment based on virtual reality and eye movement information, aiming to solve the technical problem of low accuracy of emotion recognition in the prior art.


Application Information

Patent Type & Authority: Application (China)
IPC (IPC-8): A61B5/16, A61B3/113, A61B5/00, G06F3/01, G06K9/00
CPC: A61B5/165, A61B3/113, A61B5/7264, G06F3/013
Inventors: 张雨青, 蒋曦
Owner: SUZHOU ZHONGKE ADVANCED TECH RES INST CO LTD