36 results about "Emotion assessment" patented technology

Wearable multi-mode emotional state monitoring device

Pending | CN112120716A | Realize multi-angle real-time monitoring | Emotional state monitoring | Input/output for user-computer interaction | Sensors | Emotion assessment | Medicine
The invention discloses a wearable multi-mode emotion monitoring device. The device comprises VR glasses, a wearable multi-mode signal acquisition module and an intelligent calculation module. The VR glasses establish an emotion-induction scene that simulates an intelligent, interactive real social setting; the wearable multi-mode acquisition module collects multi-mode physiological information, including electroencephalogram (EEG), electromyogram (EMG), electrocardiogram (ECG) and electrodermal signals together with eye and mouth images, from the wearer's head, face, chest and wrists; and the intelligent calculation module preprocesses the multi-dimensional signals, performs feature abstraction on the multi-mode heterogeneous data, carries out cooperative representation and fusion of the multi-source features, performs multi-task regression learning with a multi-layer perceptron model, and finally outputs a multi-dimensional emotion judgment. The invention addresses the lack of quantitative analysis and test equipment in traditional emotion evaluation, and provides a reliable experimental paradigm, mechanism theory and equipment environment for evaluating and monitoring multi-dimensional emotion.
Owner:NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI +1
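A minimal sketch (not the patented implementation) of the fusion idea in the abstract above: per-modality features are extracted separately, concatenated into one co-represented vector, and a multi-layer perceptron regresses several emotion dimensions at once. The feature statistics, layer widths and the valence/arousal/dominance outputs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(signal: np.ndarray) -> np.ndarray:
    """Toy per-modality feature abstraction: a few summary statistics."""
    return np.array([signal.mean(), signal.std(), signal.min(), signal.max()])

# Simulated windows of the multi-modal physiological signals named in the abstract.
modalities = {
    "eeg": rng.normal(size=256),
    "emg": rng.normal(size=256),
    "ecg": rng.normal(size=256),
    "eda": rng.normal(size=256),
}

# Cooperative representation / fusion step: concatenate the modality features.
fused = np.concatenate([extract_features(sig) for sig in modalities.values()])

# Untrained two-layer perceptron standing in for the multi-task regression head.
W1 = rng.normal(scale=0.1, size=(fused.size, 8))
W2 = rng.normal(scale=0.1, size=(8, 3))          # 3 emotion dimensions (assumed)
hidden = np.tanh(fused @ W1)
valence, arousal, dominance = hidden @ W2
print(f"valence={valence:.3f}, arousal={arousal:.3f}, dominance={dominance:.3f}")
```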

Emotion assessment method and device based on mass sample data

The invention provides an emotion assessment method and device based on mass sample data. The method comprises the steps of: determining feature information of products in any appointed field from sample information; generating training data according to the feature information and a preset sample-information training model; generating assessment information for the products in the appointed field within different time slots according to the training data and a preset increment-based text classification model; vectorizing the assessment information with word2vec to generate a vector matrix to be assessed; and inputting the vector matrix into a convolutional neural network and obtaining an emotion value of the assessment information from the network's output. The technical scheme improves the accuracy and effectiveness with which a user obtains emotion values for assessment information about products in a specific field, and the resulting assessment analyses of different products help the user choose a product or devise a more reasonable product marketing strategy.
Owner:NEW FOUNDER HLDG DEV LLC +1
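A minimal sketch of the "vector matrix → CNN → emotion value" pipeline described above, not the patented system: review text is vectorised with word2vec and scored by a small untrained 1-D convolutional network. The tiny corpus, embedding size and network shape are illustrative assumptions.

```python
import torch
import torch.nn as nn
from gensim.models import Word2Vec

reviews = [
    "battery life is excellent and the screen is bright".split(),
    "the camera is blurry and the battery drains fast".split(),
]
w2v = Word2Vec(sentences=reviews, vector_size=32, min_count=1, epochs=20)

def to_matrix(tokens):
    """Stack per-word vectors into the 'vector matrix to be assessed'."""
    return torch.from_numpy(w2v.wv[tokens])                 # (length, 32)

class SentimentCNN(nn.Module):
    def __init__(self, dim=32):
        super().__init__()
        self.conv = nn.Conv1d(dim, 16, kernel_size=3, padding=1)
        self.head = nn.Linear(16, 1)

    def forward(self, matrix):                               # (length, dim)
        x = matrix.T.unsqueeze(0)                            # (1, dim, length)
        x = torch.relu(self.conv(x)).max(dim=2).values       # global max pooling
        return torch.sigmoid(self.head(x)).squeeze()         # emotion value in [0, 1]

model = SentimentCNN()
for tokens in reviews:
    print(" ".join(tokens[:4]), "... ->", f"{model(to_matrix(tokens)).item():.3f}")
```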

Human-computer interaction method and device for emotion regulation

Active | CN113687744A | Rich and varied interactive content | Improve emotional experience | Character and pattern recognition | Mental therapies | Evaluation result | Emotion assessment
The invention discloses a human-computer interaction method and device for emotion regulation. The method comprises the following steps: calculating the operation time of a human-computer interaction scheme according to the user's emotion evaluation result, the result of a pre-interaction psychological scale and facial emotion data; acquiring the user's physiological data before interaction and then performing emotion regulation, and, after each day's interaction is completed, collecting objective feedback data on the user's subjective feelings and on changes before and after the interaction; when the number of interaction days reaches the operation-time node of the current scheme, calling up the post-interaction psychological scale for an effectiveness evaluation and comparing the result with norm data; if the evaluation passes, executing the next step or returning to the first step, otherwise updating the operation time and operation-time node of the current scheme and continuing the interaction until the evaluation passes; and, according to the user's subjective and objective feedback data, updating the operation time of each remaining interaction scheme until the current round of human-computer interaction ends.
Owner:BEIJING WISPIRIT TECH CO LTD
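A minimal control-flow sketch of the scheduling logic described in the abstract above; the scale scoring, norm threshold and day-extension rule are illustrative assumptions, not the patented parameters.

```python
from dataclasses import dataclass

@dataclass
class Scheme:
    name: str
    run_days: int             # operation time of this interaction scheme

def scale_score(day: int) -> float:
    """Stand-in for the post-interaction psychological scale result."""
    return 40.0 + 2.0 * day   # toy model: improves as interaction accumulates

NORM = 50.0                   # assumed norm-data threshold

def run_scheme(scheme: Scheme, extension_days: int = 2, max_days: int = 30) -> int:
    day, target = 0, scheme.run_days
    while day < max_days:
        day += 1              # one day of emotion-regulation interaction + feedback
        if day < target:
            continue
        if scale_score(day) >= NORM:      # effectiveness evaluation passed
            print(f"{scheme.name}: passed after {day} days")
            return day
        target += extension_days          # update the operation-time node, keep interacting
    print(f"{scheme.name}: stopped at the {max_days}-day cap")
    return day

for s in [Scheme("breathing training", 3), Scheme("guided imagery", 5)]:
    run_scheme(s)
```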

Tension evaluation system for counters

Pending | CN111956243A | Will not increase the psychological burden | Improve the accuracy of sentiment analysis | Diagnostic signal processing | Respiratory organ evaluation | Physical medicine and rehabilitation | Emotion assessment
The invention discloses a tension evaluation system for counters that uses thermal-imaging processing. The system comprises an acquisition module, an emotion evaluation unit, a display module and a storage module. The acquisition module is hidden in the counter surface and collects facial information of the person being examined; the emotion evaluation unit receives this facial information, analyzes the person's breathing frequency and, in combination with the temperature distribution of the nose, judges the person's degree of emotional tension; the display module receives the emotional-tension information from the emotion evaluation unit and displays it as a percentile score; and the storage module stores the facial-information data together with the percentile tension score. An examiner is thereby prompted to question or examine a visibly tense person more closely, providing a basis for identifying suspects with illegal intentions.
Owner:DALIAN UNIV OF TECH
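A minimal sketch of the signal-processing idea in the abstract above: the nose-region temperature trace from a thermal camera oscillates with breathing, so its dominant frequency gives the respiration rate, and rate plus mean nose temperature are mapped to a 0-100 tension score. The frame rate, weights and scoring formula are illustrative assumptions, not values from the patent.

```python
import numpy as np

FPS = 10.0                                      # assumed thermal-camera frame rate
t = np.arange(0, 60, 1 / FPS)                   # 60 s observation window
breaths_per_min = 22                            # simulated fast breathing
nose_temp = 34.0 + 0.15 * np.sin(2 * np.pi * (breaths_per_min / 60) * t)
nose_temp += np.random.default_rng(1).normal(scale=0.02, size=t.size)

# Dominant frequency of the detrended temperature trace = breathing frequency.
spectrum = np.abs(np.fft.rfft(nose_temp - nose_temp.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / FPS)
rate = freqs[spectrum.argmax()] * 60            # breaths per minute

# Toy percentile score: faster breathing and a cooler nose both raise tension.
rate_part = np.clip((rate - 12) / (30 - 12), 0, 1)          # 12-30 bpm range
temp_part = np.clip((34.5 - nose_temp.mean()) / 1.5, 0, 1)  # cooling of the nose
tension_score = 100 * (0.7 * rate_part + 0.3 * temp_part)
print(f"breathing rate ~ {rate:.1f} bpm, tension score ~ {tension_score:.0f}/100")
```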

Data processing method, device, equipment and storage medium for evaluating the test-driving experience

Active | CN110797050A | Guaranteed accuracy | Accurately reflect the experience effect | Speech analysis | Emotion assessment | Engineering
The invention provides a data processing method, device, equipment and storage medium for evaluating the test-driving experience. The method comprises the following steps: acquiring voice information inside the vehicle during the test drive, together with seat information indicating the seat of the person who produced the voice; recognizing the emotion in the voice information to obtain initial emotion evaluation data; and determining final emotion evaluation data for the test-driving experience according to the initial emotion evaluation data and the seat information. In this technical scheme, the accuracy of the evaluation result is not affected by a person's memory or comprehension and is only slightly affected by expressive ability, so evaluation accuracy is guaranteed to a certain extent. Meanwhile, the specific circumstances of people in different seats are fully considered, so the overall experience when multiple people participate in a test drive is reflected more accurately.
Owner:上海能塔智能科技有限公司
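A minimal sketch of the aggregation step described above, not the patented method: each recognised utterance carries an initial emotion score plus the seat it came from, and the final test-drive evaluation weights seats differently (for example, the front passenger may count more than a rear passenger). The seat weights and scores are illustrative assumptions.

```python
from collections import defaultdict

# (seat, initial emotion score in [-1, 1]) for each recognised utterance
utterances = [
    ("driver", 0.6), ("driver", 0.4),
    ("front_passenger", 0.8),
    ("rear_left", -0.2), ("rear_right", 0.1),
]

SEAT_WEIGHTS = {"driver": 0.4, "front_passenger": 0.3,
                "rear_left": 0.15, "rear_right": 0.15}

def final_emotion(scores):
    """Average the initial scores per seat, then combine with seat weights."""
    per_seat = defaultdict(list)
    for seat, score in scores:
        per_seat[seat].append(score)
    total, weight_sum = 0.0, 0.0
    for seat, vals in per_seat.items():
        w = SEAT_WEIGHTS.get(seat, 0.1)
        total += w * sum(vals) / len(vals)
        weight_sum += w
    return total / weight_sum

print(f"final emotion evaluation of the test drive: {final_emotion(utterances):+.2f}")
```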

Language emotion recognition method for emotion evaluation

Pending | CN114299995A | Ease of evaluation | Assessing good language emotion recognition | Speech recognition | Emotion assessment | Intense emotion
The invention discloses a language emotion recognition method for emotion evaluation, belonging to the technical field of intelligent speech-signal processing. The method comprises the following steps: pre-recording dialogue content to generate source audio, preprocessing the source audio and storing it to obtain an emotion database; dividing the emotion database into a training set and a test set; building a speech emotion recognition model on the emotion database, the model predicting the database along the pleasure and passion dimensions; obtaining and preprocessing a speaker's speech to generate a corresponding audio file; segmenting the audio file into several target audio files, using the voice duration of the training set as the segmentation parameter; and feeding the target audio files to the speech emotion recognition model to evaluate and analyze the speaker's emotion, with the test set applied against the training set to optimize the model. The method accurately captures changes in the speaker's spoken emotion and avoids the problem of emotional change being neglected when only the audio is recognized.
Owner:UNIV OF SHANGHAI FOR SCI & TECH
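A minimal sketch of the segmentation-and-scoring flow described above: a long recording is cut into chunks whose length matches the training-set utterances, and each chunk receives a pleasure/passion (valence/arousal) estimate. The hand-crafted energy and zero-crossing scoring below is only a stand-in for the patent's trained speech emotion recognition model; sample rate and clip length are assumptions.

```python
import numpy as np

SR = 16_000                       # assumed sample rate
TRAIN_CLIP_SECONDS = 3.0          # assumed training-set voice duration

def segment(audio: np.ndarray, sr: int = SR, clip_s: float = TRAIN_CLIP_SECONDS):
    """Split the recording into target files of the training-set duration."""
    step = int(sr * clip_s)
    return [audio[i:i + step] for i in range(0, len(audio) - step + 1, step)]

def score_chunk(chunk: np.ndarray) -> tuple[float, float]:
    """Toy stand-in for the model: loudness -> passion, zero-crossing rate -> pleasure."""
    passion = float(np.sqrt(np.mean(chunk ** 2)))
    zero_crossings = np.mean(np.abs(np.diff(np.sign(chunk)))) / 2
    pleasure = float(1.0 - zero_crossings)     # fewer crossings ~ softer voice (toy rule)
    return pleasure, passion

rng = np.random.default_rng(2)
speech = rng.normal(scale=0.1, size=SR * 10)   # 10 s of simulated audio
for i, chunk in enumerate(segment(speech)):
    p, a = score_chunk(chunk)
    print(f"segment {i}: pleasure={p:.2f}, passion={a:.2f}")
```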

Emotion evaluation method and device based on virtual reality and eye movement information

The invention, which belongs to the technical field of medical instruments, provides an emotion evaluation method and device based on virtual reality and eye movement information, together with virtual reality equipment. The method comprises the steps of: in response to an emotion evaluation operation on the virtual reality equipment, displaying a dynamic picture on its display interface; collecting eye movement feature data as the tested person follows the dynamic picture; and inputting the eye movement feature data and the dynamic picture into a library of pre-trained neural network models for matching by a deep learning algorithm, and generating a corresponding emotion evaluation report. By combining virtual reality and eye-tracking technology, the method collects an individual's eye movement feature data, analyzes eye-tracking trajectory features, and exploits the differences in eye movement trajectories between depressed and non-depressed people; because a patient's degree of depression is highly correlated with these eye movement features, evaluating emotion by tracking the eye movement trajectory effectively improves the accuracy of emotion recognition.
Owner:SUZHOU ZHONGKE ADVANCED TECH RES INST CO LTD
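A minimal sketch of the evaluation idea described above: while a moving target is shown in the headset, gaze samples are compared with the target trajectory, and simple tracking features (lag, error, dispersion) feed a scoring step that stands in for the pre-trained deep model. The trajectories, features and logistic weighting are illustrative assumptions, not the patented algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 600)                        # 10 s at 60 Hz eye tracking
target_x = np.sin(0.5 * np.pi * t)                 # dynamic picture's horizontal path

# Simulated gaze: follows the target with some lag and jitter.
gaze_x = (np.interp(t - 0.15, t, target_x, left=target_x[0])
          + rng.normal(scale=0.05, size=t.size))

# Eye-movement features of the kind the abstract alludes to.
tracking_error = float(np.mean(np.abs(gaze_x - target_x)))
gaze_dispersion = float(np.std(np.diff(gaze_x)))

# Toy logistic score standing in for the trained neural network model library.
z = 8.0 * tracking_error + 20.0 * gaze_dispersion - 2.0
risk = 1 / (1 + np.exp(-z))
print(f"tracking error={tracking_error:.3f}, dispersion={gaze_dispersion:.3f}, "
      f"assessed low-mood risk={risk:.2f}")
```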

A human-computer interaction device for emotion regulation

Active | CN113687744B | Rich and varied interactive content | Improve emotional experience | Character and pattern recognition | Mental therapies | Evaluation result | Emotion assessment
The invention discloses a human-computer interaction method and equipment for emotion regulation. The method includes the following steps: calculating the operation time of a human-computer interaction scheme according to the user's emotion evaluation result, the result of a pre-interaction psychological scale and facial emotion data; after each day's interaction, obtaining objective feedback data on the user's subjective feelings and on changes before and after the interaction; when the number of interaction days reaches the operation-time node of the current scheme, calling up the post-interaction psychological scale for an effectiveness evaluation and comparing the result with norm data; if the evaluation passes, executing the next step or returning to the first step, otherwise updating the operation time and operation-time node of the current scheme and continuing the human-computer interaction until the evaluation passes; and, according to the user's subjective and objective feedback data, updating the operation time of the remaining interaction schemes until this round of human-computer interaction ends.
Owner:BEIJING WISPIRIT TECH CO LTD