
728 results about "Emotion identification" patented technology

Extraction and modeling method for Chinese speech emotion information

The invention provides a method for extracting and modeling the emotional information of Chinese speech. The extraction method comprises: formulating the specification of an emotional speech database, including the speaker specification, the recording-script design specification and the naming specification for audio files; collecting emotional speech data; and evaluating the validity of the emotional speech, namely having at least ten evaluators other than the speaker carry out a subjective listening evaluation of the emotional speech data. The modeling method comprises: extracting emotional features from the speech and defining the distinguishing feature combination of each emotion type; training the SVM models of a multilevel speech emotion recognition system with the different feature combinations; and verifying the recognition performance of the classification models, namely verifying the speaker-independent classification performance of the multilevel speech emotion classifiers with leave-one-out cross-validation. The method addresses the problems that domestic emotional speech databases cover few emotion types and are very limited in number, and at the same time realizes an efficient speech emotion identification system.
Owner:BEIHANG UNIV
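
The speaker-independent evaluation the abstract describes can be sketched as leave-one-speaker-out cross-validation: every split holds out all utterances of one speaker. This is a minimal illustration of the splitting scheme only; the corpus, field names and classifier are assumptions, not the patented models.

```python
# Minimal sketch of leave-one-speaker-out splitting, so a classifier is
# always evaluated on a speaker it has never seen during training.
# The toy corpus and its field names are illustrative assumptions.

def leave_one_speaker_out(samples):
    """Yield (held_out_speaker, train, test) splits where the test set
    holds all utterances of exactly one speaker."""
    speakers = sorted({s["speaker"] for s in samples})
    for held_out in speakers:
        train = [s for s in samples if s["speaker"] != held_out]
        test = [s for s in samples if s["speaker"] == held_out]
        yield held_out, train, test

# toy corpus: utterances tagged with speaker and emotion label
corpus = [
    {"speaker": "A", "label": "happy"},
    {"speaker": "A", "label": "angry"},
    {"speaker": "B", "label": "sad"},
    {"speaker": "C", "label": "happy"},
]

splits = list(leave_one_speaker_out(corpus))
for held_out, train, test in splits:
    # no utterance of the held-out speaker leaks into the training set
    assert all(s["speaker"] != held_out for s in train)
    assert all(s["speaker"] == held_out for s in test)
```

In a real pipeline each split would train the SVM on `train` and score it on `test`, averaging accuracy over all speakers.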

Man-machine interaction method and system for online education based on artificial intelligence

Pending · CN107958433A · Solve the problem of poor learning effect · Data processing applications · Speech recognition · Personalization · Online learning
The invention discloses a man-machine interaction method and system for online education based on artificial intelligence, relating to digital audiovisual technology in the field of electronic information. The system comprises an audience emotion recognition subsystem and an intelligent session subsystem; the two subsystems are combined with an online education system so that personalized teaching content can be better presented to the audience. The system starts from improving the vividness of man-machine interaction in online education. The emotion recognition subsystem judges the learning state of a user from the user's facial expression while watching a video, and the intelligent session subsystem then carries out machine Q&A interaction. The emotion recognition subsystem classifies the audience's emotions into seven types: anger, aversion, fear, sadness, surprise, neutrality and happiness. The intelligent session subsystem adjusts the corresponding course content according to the different emotions and carries out machine Q&A interaction, so that the teacher-student interaction and feedback of a conventional class are presented online and the online class becomes more personalized.
Owner:JILIN UNIV

Man-machine interaction method and device based on emotion system, and man-machine interaction system

The invention discloses a man-machine interaction method and device based on an emotion system, and a man-machine interaction system. The method comprises the following steps: collecting voice emotion parameters, expression emotion parameters and body emotion parameters; calculating a candidate voice emotion from the voice emotion parameters, and selecting the preset voice emotion closest to it as the voice emotion component; calculating a candidate expression emotion from the expression emotion parameters, and selecting the preset expression emotion closest to it as the expression emotion component; calculating a candidate body emotion from the body emotion parameters, and selecting the preset body emotion closest to it as the body emotion component; fusing the voice emotion component, the expression emotion component and the body emotion component to determine an emotion identification result; and outputting multi-modal feedback information specific to the emotion identification result. With the method, device and system, the man-machine interaction process becomes smoother and more natural.
Owner:BEIJING GUANGNIAN WUXIAN SCI & TECH
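
The per-modality step above (snap a computed emotion to the closest preset emotion, then fuse the three components) can be sketched as nearest-prototype matching followed by a vote. The 2-D valence/arousal prototypes and the majority-vote fusion rule are illustrative assumptions, not the patented calculation.

```python
import math

# Illustrative sketch: each modality yields a raw emotion vector, which
# is snapped to the nearest preset emotion prototype; the per-modality
# labels are then fused by majority vote.  Prototype coordinates are
# made-up (valence, arousal) values.

PRESETS = {
    "happy": (0.8, 0.6),
    "angry": (-0.7, 0.8),
    "sad": (-0.6, -0.5),
    "neutral": (0.0, 0.0),
}

def nearest_preset(vec):
    """Return the preset emotion whose prototype is closest to vec."""
    return min(PRESETS, key=lambda e: math.dist(vec, PRESETS[e]))

def fuse(components):
    """Majority vote over the per-modality emotion components."""
    return max(set(components), key=components.count)

voice = nearest_preset((0.7, 0.5))    # closest preset: "happy"
face = nearest_preset((0.75, 0.7))    # closest preset: "happy"
body = nearest_preset((-0.1, 0.1))    # closest preset: "neutral"
result = fuse([voice, face, body])    # fused result: "happy"
```

A production system would weight the modalities rather than vote them equally, but the snap-then-fuse structure is the same.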

Serial-parallel combined multi-mode emotion information fusion and identification method

The present invention discloses a serial-parallel combined multi-mode emotion information fusion and identification method belonging to the field of emotion identification technology. The method mainly comprises: obtaining an emotion signal; preprocessing the emotion signal; extracting emotion feature parameters; and fusing and identifying the feature parameters. According to the present invention, the extracted speech-signal and facial-expression feature parameters are first fused into a serial feature vector set; M parallel training sample sets are then obtained by sampling with replacement, and sub-classifiers are trained with the AdaBoost algorithm; the difference between every two classifiers is measured by a dual-error difference selection strategy; and finally voting is carried out according to the majority-vote principle to obtain the final identification result, identifying the five basic human emotions of pleasure, anger, surprise, sadness and fear. The method fully exploits the advantages of decision-level fusion and feature-level fusion, making the fusion of the emotion information closer to human emotion identification and thereby improving the emotion identification accuracy.
Owner:BOHAI UNIV
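
The serial-then-parallel pipeline above can be sketched in a few lines: concatenate the two modalities' feature vectors (serial fusion), draw M bootstrap samples with replacement, train one sub-classifier per sample, and fuse their predictions by majority vote (parallel fusion). A 1-nearest-neighbour rule stands in for the AdaBoost sub-classifiers of the patent, and all data are toy values.

```python
import random

# Sketch of the serial-parallel fusion scheme.  The 1-NN sub-classifier
# is a stand-in assumption for the patent's AdaBoost learners.

def serial_fuse(speech_vec, face_vec):
    return speech_vec + face_vec              # feature-level concatenation

def bootstrap(data, rng):
    return [rng.choice(data) for _ in data]   # sampling with replacement

def train_1nn(train):
    def predict(x):
        nearest = min(train,
                      key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))
        return nearest[1]
    return predict

def majority_vote(labels):
    return max(set(labels), key=labels.count)

rng = random.Random(0)
data = [(serial_fuse([0.9], [0.8]), "pleasure"),
        (serial_fuse([0.1], [0.2]), "sadness"),
        (serial_fuse([0.8], [0.9]), "pleasure"),
        (serial_fuse([0.2], [0.1]), "sadness")]

# M = 5 parallel sub-classifiers, each trained on its own bootstrap sample
classifiers = [train_1nn(bootstrap(data, rng)) for _ in range(5)]
query = serial_fuse([0.85], [0.85])
prediction = majority_vote([clf(query) for clf in classifiers])
```

The dual-error diversity-selection step of the patent is omitted here; it would prune sub-classifiers whose errors are too correlated before the vote.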

Nervous emotion intensity identification system and information processing method based on multiple physiological parameters

The invention relates to the field of emotion recognition, in particular to a nervous-emotion intensity identification system and information processing method based on multiple physiological parameters. The information processing method includes offline training and online monitoring. The offline training includes inducing the user's nervous emotions, collecting multiple physiological signals from the user, and carrying out signal processing; the signal processing comprises preprocessing, feature extraction and pattern recognition. The preprocessing suppresses power-frequency interference in the EEG signals with an adaptive filter, removes power-frequency interference from the amplified ECG, respiration and skin-conductance signals with a band-pass filter, and intercepts valid data with an information processing tool. The system and method collect the central-nervous and autonomic-nervous signals that reflect the state of the human nervous system; establish a classification model for a population or an individual through offline training so as to recognize and detect the intensity of the user's nervous emotion in real time; warn when the nervous emotion is excessively intense; and store the user's emotional physiological signals throughout the whole process.
Owner:TIANJIN UNIV
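
The band-pass preprocessing step can be illustrated with a crude filter built from a first-order low-pass followed by a first-order high-pass IIR stage. Real ECG/EDA pipelines would use properly designed filters (e.g. Butterworth notch and band-pass); the coefficients here are illustrative assumptions only.

```python
# Minimal sketch of band-pass preprocessing: a first-order low-pass
# (smooths high-frequency noise) cascaded with a first-order high-pass
# (removes DC offset and slow baseline drift).  Coefficients are toy
# values, not clinically meaningful cutoffs.

def low_pass(signal, alpha):
    out, y = [], signal[0]
    for x in signal:
        y = y + alpha * (x - y)          # y[n] = y[n-1] + a * (x[n] - y[n-1])
        out.append(y)
    return out

def high_pass(signal, alpha):
    out, y, prev = [], 0.0, signal[0]
    for x in signal:
        y = alpha * (y + x - prev)       # passes changes, blocks the baseline
        prev = x
        out.append(y)
    return out

def band_pass(signal, lo_alpha=0.3, hi_alpha=0.95):
    return high_pass(low_pass(signal, lo_alpha), hi_alpha)

# a constant (DC) signal is driven to zero by the high-pass stage
filtered = band_pass([1.0] * 50)
```

Mains (power-frequency) interference removal would additionally need a notch filter at 50/60 Hz, which this sketch does not attempt.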

Human natural state emotion identification method based on double-mode combination of expression and behavior

The invention relates to a human natural-state emotion identification method based on the dual-mode combination of expression and behavior. The method comprises the following steps: S1, establishing an emotion cognition architecture with two classification modes; S2, performing human-body area detection on a natural-posture human body image from the video input; S3, extracting feature points from the image of the human-trunk subarea, obtaining feature-point motion tracks from the feature points in image frames at different times, acquiring from these tracks, by a clustering method, the main motion track that reflects the human body behavior, and extracting a trunk motion feature from the main motion track; S4, acquiring a coarse emotion-cognition classification result from the trunk motion feature; S5, extracting facial expression features from the image of the face subarea; and S6, outputting the fine emotion-cognition classification result corresponding to the matched facial expression feature. Compared with the prior art, the method has the advantages of high identification precision, wide application range and easy realization.
Owner:SHANGHAI UNIVERSITY OF ELECTRIC POWER

Emotion recognition system and method based on wearable bracelet

The invention provides an emotion recognition system and method based on a wearable bracelet. The system comprises a physiological signal acquisition module, a physiological signal preprocessing module, a physiological signal feature extraction module, an emotion classification module and a comprehensive evaluation module. The acquisition module acquires three kinds of physiological data from the wearer: electrocardiogram (ECG), heart rate and skin conductance. The preprocessing module segments and denoises the three kinds of physiological data and then transmits them to the feature extraction module, which carries out feature extraction on them. The emotion classification module performs emotion recognition on the three kinds of physiological data and outputs three emotion states. The comprehensive evaluation module adopts a weight-based voting decision rule: voting is carried out on the three emotion states to comprehensively determine the current emotional-state label of the bracelet wearer and obtain the recognition result. The system improves the wearer's ability to recognize and manage his or her own emotions, helping the wearer maintain a healthier psychological state.
Owner:SOUTH CHINA UNIV OF TECH
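
The weight-based voting rule above can be sketched directly: each physiological channel votes for an emotion label, and each vote counts with that channel's reliability weight. The channel names and weight values below are made-up assumptions, not the patent's parameters.

```python
from collections import defaultdict

# Sketch of weight-based voting over per-channel emotion decisions.
# Channel names and weights are illustrative assumptions.

def weighted_vote(votes, weights):
    """votes: {channel: label}; weights: {channel: weight}.
    Returns the label with the largest total weight."""
    score = defaultdict(float)
    for channel, label in votes.items():
        score[label] += weights[channel]
    return max(score, key=score.get)

votes = {"ecg": "calm", "heart_rate": "stressed", "eda": "stressed"}
weights = {"ecg": 0.5, "heart_rate": 0.3, "eda": 0.3}

label = weighted_vote(votes, weights)   # "stressed": 0.6 beats "calm": 0.5
```

Note that the most reliable single channel (ECG) is outvoted here by two agreeing weaker channels, which is exactly the behaviour a weighted vote is meant to allow.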

Dual-mode emotion identification method and system based on facial expression and eyeball movement

The present invention relates to a dual-mode emotion identification method and system based on facial expression and eye movement. The method comprises the steps of acquisition; extracting a facial expression feature vector; extracting an eye-movement feature vector; qualitatively analyzing the emotional state; matching by time and storing; fusing and classifying; and comparing emotional information. With the method and system, the facial expression information of a subject can be dynamically and accurately extracted and analyzed, and a correlation between facial expression and emotion is established; rich eye-movement information can be accurately and efficiently acquired by tracking with an eye tracker, and the subject's emotional state is analyzed from the perspective of eye movement; and the facial expression feature vector and the eye-movement feature vector are processed with an SVR, so that the subject's emotional state is obtained more accurately, improving the accuracy and reliability of emotion identification.
Owner:CHINA UNIV OF GEOSCIENCES (WUHAN)
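
The fusion-and-regression step can be sketched as feature-level concatenation of the two modality vectors followed by a regressor onto a continuous emotion score. A closed-form one-dimensional least-squares fit stands in for the SVR named in the abstract, and all feature values and valence targets are toy assumptions.

```python
# Sketch of dual-mode fusion: concatenate facial-expression and
# eye-movement feature vectors, then regress onto a valence score.
# The 1-D linear fit is a stand-in for the patent's SVR.

def fuse(face_vec, eye_vec):
    return face_vec + eye_vec                # feature-level concatenation

def fit_linear(xs, ys):
    """Closed-form least-squares fit y = w*x + b."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - w * mx
    return lambda x: w * x + b

# toy training pairs: fused feature vector -> valence in [-1, 1]
samples = [(fuse([0.9, 0.8], [0.7]), 0.9),
           (fuse([0.1, 0.2], [0.3]), -0.8),
           (fuse([0.5, 0.5], [0.5]), 0.0)]

# collapse each fused vector to its mean so the toy regressor stays 1-D
xs = [sum(v) / len(v) for v, _ in samples]
ys = [y for _, y in samples]
predict = fit_linear(xs, ys)
```

An actual SVR would operate on the full fused vector with a kernel; the point of the sketch is only the fuse-then-regress structure.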

Telemarketing customer service system based on artificial intelligence and business operation mode

Inactive · CN107330706A · Realize precision marketing · Realize intelligent dispatch · Buying/selling/leasing transactions · Speech recognition · Precision marketing · Intelligence analysis
The invention provides a telemarketing customer service system based on artificial intelligence, together with a business operation mode. From the records of the customer service agents and historical clients on the platform, a character model of each agent and of each historical client is generated by an emotion identification system. Artificial intelligence analysis is applied to carry out multidimensional analysis of each agent, each task and each client, generating an intelligent portrait of each. The portraits of agents, tasks and clients are matched intelligently according to preset rules to generate orders, realizing precision marketing among the agents, tasks and clients and thereby providing personalized service for the clients. After an order is finished, the finished order is vouched intelligently by a speech recognition system under a preset vouching strategy, with real-time quality inspection and real-time billing; as a result, the workload of staff is effectively reduced, the vouching time is shortened, and the accuracy of the quality inspection is improved.
Owner:SHANGHAI HANGDONG TECH CO LTD