
Human-computer interaction method and system based on multi-modal emotion and face attribute identification

A technology for attribute recognition and human-computer interaction, applied in the field of human-computer interaction, which improves recognition accuracy and provides a better, more natural interaction experience.

Publication Date: 2018-07-06 (Inactive)
Applicant: EMOTIBOT TECH LTD

AI Technical Summary

Problems solved by technology

[0005] Second, in a traditional human-computer interaction system the machine must be woken by the user through an external operation, such as voice wake-up with a customized vocabulary, rather than interacting actively; there is no customized solution that uses real-time, active detection results from machine vision technology to let the machine actively initiate interaction with the user.
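As a rough sketch of the capability this paragraph says traditional systems lack, the following Python loop shows how machine-vision detection could let the machine open the interaction on its own; `detect_faces`, `proactive_interaction_loop`, and `greet` are hypothetical names, not part of this patent.

```python
def detect_faces(frame):
    """Hypothetical face detector; a real system would call a trained vision model.
    Returns a list of bounding boxes for faces found in the frame."""
    return []  # placeholder: no faces found

def proactive_interaction_loop(frames, greet):
    """Watch a stream of camera frames and let the machine open the dialogue
    as soon as a user appears, instead of waiting for a wake word or button."""
    for frame in frames:
        faces = detect_faces(frame)
        if faces:
            greet(faces)   # the machine actively initiates the interaction
            return True
    return False           # no user appeared in the observed frames
```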




Embodiment Construction

[0037] Embodiments of the technical solution of the present invention will be described in detail below with reference to the accompanying drawings. The following examples serve only to illustrate the technical solution more clearly; they are examples only and should not be used to limit the protection scope of the present invention.

[0038] It should be noted that, unless otherwise specified, the technical and scientific terms used in this application have the meanings commonly understood by those skilled in the art to which the present invention belongs.

[0039] The human-computer interaction method and system based on multi-modal emotion and face attribute recognition provided by the embodiments of the present invention integrate natural language understanding and speech recognition systems into a designed interactive platform, such as a web page or an APP, and perform, on the human face, multi-modal emotion and face attribute rec...
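As an illustration of the integration described in this paragraph (and only an illustration: the class and method names below are assumptions, not the patent's API), an interactive platform could wire speech recognition and natural language understanding behind one entry point that a web page or APP calls:

```python
class SpeechRecognizer:
    """Hypothetical ASR component: turns raw audio bytes into text."""
    def transcribe(self, audio: bytes) -> str:
        raise NotImplementedError("plug in a real speech recognition engine")

class LanguageUnderstanding:
    """Hypothetical NLU component: extracts structured meaning from text."""
    def parse(self, text: str) -> dict:
        raise NotImplementedError("plug in a real NLU engine")

class InteractionPlatform:
    """Single entry point that a web page or APP could call with user audio."""
    def __init__(self, asr: SpeechRecognizer, nlu: LanguageUnderstanding):
        self.asr = asr
        self.nlu = nlu

    def handle_utterance(self, audio: bytes) -> dict:
        text = self.asr.transcribe(audio)   # speech -> text
        meaning = self.nlu.parse(text)      # text -> structured meaning
        return {"text": text, "meaning": meaning}
```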



Abstract

The invention belongs to the technical field of human-computer interaction and provides a human-computer interaction method and system based on multi-modal emotion and face attribute identification. The method includes the steps of: acquiring a face image and sound information of a user; transforming the sound information into text information; recognizing emotions according to the face image, the sound information and the text information; determining emotion vectors; determining face attribute characteristics according to the face image; and analyzing the emotion vectors, the face attribute characteristics and the text information according to replying strategies and outputting feedback texts. With this method and system, face attributes can be identified completely, the accuracy of multi-modal face emotion identification is improved, and the multi-modal emotions, face attributes and natural-language interaction information are combined to provide a more natural and intelligent human-computer interaction experience.
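Read as a pipeline, the abstract's steps could look like the sketch below. Every function is a placeholder standing in for the patent's actual components (ASR, per-modality emotion classifiers, attribute extraction, reply strategy), and the label set, fusion weights, and weighted-average fusion rule are assumptions made here for illustration.

```python
from typing import Dict, List

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]  # assumed label set

def speech_to_text(audio: bytes) -> str:
    """Placeholder ASR step: sound information -> text information."""
    return ""

def emotion_scores(face_image, audio: bytes, text: str) -> Dict[str, List[float]]:
    """Placeholder per-modality emotion recognition over the assumed label set."""
    neutral = [0.0] * (len(EMOTIONS) - 1) + [1.0]
    return {"face": neutral, "voice": neutral, "text": neutral}

def fuse_emotions(scores: Dict[str, List[float]],
                  weights: Dict[str, float]) -> List[float]:
    """Combine per-modality scores into one emotion vector; a weighted average
    is used here because the abstract does not specify the fusion rule."""
    fused = [0.0] * len(EMOTIONS)
    for modality, vec in scores.items():
        w = weights.get(modality, 0.0)
        for i, value in enumerate(vec):
            fused[i] += w * value
    return fused

def face_attributes(face_image) -> Dict[str, object]:
    """Placeholder attribute extraction (e.g. gender, glasses, beard)."""
    return {}

def reply(emotion_vec: List[float], attributes: Dict[str, object], text: str) -> str:
    """Placeholder reply strategy: map emotion, attributes and utterance to text."""
    dominant = EMOTIONS[max(range(len(EMOTIONS)), key=lambda i: emotion_vec[i])]
    return f"(feedback text for a {dominant} user)"

def interact(face_image, audio: bytes) -> str:
    """End-to-end flow following the abstract's step order."""
    text = speech_to_text(audio)
    scores = emotion_scores(face_image, audio, text)
    emotion_vec = fuse_emotions(scores, {"face": 0.4, "voice": 0.3, "text": 0.3})
    attributes = face_attributes(face_image)
    return reply(emotion_vec, attributes, text)
```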

Description

Technical field

[0001] The invention relates to the technical field of human-computer interaction, and in particular to a method and system for human-computer interaction based on multi-modal emotion and facial attribute recognition.

Background technique

[0002] In this technology, the facial emotion and facial attributes of the user interacting with the machine are detected in real time, and a guided conversation strategy that matches the user's attribute characteristics and emotional state is triggered according to the interaction principle, so that the machine can proactively provide, in real time, dialogue feedback and service content that match the user's current state. Here, face attributes include objective physical characteristics and subjective physical characteristics. The objective physical characteristics can be gender, user identity, whether glasses are worn, whether there is a beard, whether there i...
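To make the background's idea of attribute-matched guidance concrete, here is a small, purely illustrative data structure for the objective attributes named above, plus a toy strategy lookup; the field names and rules are assumptions, not claims of the patent.

```python
from dataclasses import dataclass

@dataclass
class FaceAttributes:
    """Objective physical characteristics mentioned in the background section."""
    gender: str           # e.g. "male" or "female"
    identity: str         # recognised user identity, empty if unknown
    wears_glasses: bool
    has_beard: bool

def guided_strategy(attrs: FaceAttributes, dominant_emotion: str) -> str:
    """Toy example of choosing a conversation opener from attributes + emotion."""
    if dominant_emotion == "sad":
        return "You look a bit down today. Is there anything I can help with?"
    if attrs.identity:
        return f"Welcome back, {attrs.identity}!"
    return "Hello! How can I help you?"

# example usage
opener = guided_strategy(FaceAttributes("female", "Alice", False, False), "happy")
```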


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/011; G06F2203/011
Inventors: 简仁贤, 许世焕, 卞雅雯, 杨闵淳
Owner: EMOTIBOT TECH LTD