
Voice interaction method and device, equipment and storage medium

A voice interaction technology applied in the field of human-computer interaction, which solves the problems that existing methods cannot recognize the user's current emotional state and cannot provide personalized response services based on that state, and achieves the effect of improving the effectiveness and accuracy of voice interaction.

Inactive Publication Date: 2018-12-18
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

[0003] Existing intelligent interaction methods can only roughly analyze the semantic information of user messages and cannot identify the user's current emotional state. Regardless of whether that state is positive or negative, the machine always feeds back in the same positive tone, and therefore cannot provide personalized response services based on the user's emotional state.



Examples


Embodiment 1

[0030] Figure 1a is a flow chart of a voice interaction method provided by Embodiment 1 of the present invention. This embodiment is applicable to recognizing emotions in interactive voice. The method can be executed by a voice interaction device, which can be implemented in hardware and/or software and can generally be integrated into computers, servers, and any terminal with a voice interaction function. As shown in Figure 1a, the method specifically includes the following steps:

[0031] S110. When an interaction voice input by the user is detected, perform emotion recognition on the user, and determine an interaction emotion corresponding to the user according to an emotion recognition result.

[0032] Here, the emotion can be one of the six classic psychological emotions, namely happiness, sadness, surprise, anger, fear, and disgust; or one of the 27 psychological emotions, namely admiration, adoration, appreciation, entert...
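As a rough illustration of step S110, the following Python sketch picks the highest-confidence label among the six classic emotions. It is only a reading of the step as described, not the patented implementation: the `recognizer` object and its `predict` method are hypothetical stand-ins for whatever acoustic emotion model is actually used.

```python
from dataclasses import dataclass

# The six classic emotion labels mentioned in paragraph [0032].
CLASSIC_EMOTIONS = ("happiness", "sadness", "surprise", "anger", "fear", "disgust")

@dataclass
class EmotionResult:
    label: str         # one of CLASSIC_EMOTIONS
    confidence: float  # score in [0, 1] reported by the recognizer

def recognize_interaction_emotion(voice_samples, recognizer) -> EmotionResult:
    """Hypothetical S110 step: run emotion recognition on the interactive
    voice and keep the emotion with the highest confidence."""
    # `recognizer.predict` is a placeholder for any acoustic emotion model
    # that returns a {label: confidence} mapping over CLASSIC_EMOTIONS.
    scores = recognizer.predict(voice_samples)
    best = max(scores, key=scores.get)
    return EmotionResult(label=best, confidence=scores[best])
```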

Embodiment 2

[0058] Figure 2 is a flowchart of a voice interaction method provided by Embodiment 2 of the present invention. As a further elaboration of the above embodiment, and as shown in Figure 2, the method includes the following steps:

[0059] S210. When an interaction voice input by the user is detected, perform emotion recognition on the user, and determine an interaction emotion corresponding to the user according to an emotion recognition result.

[0060] S220. Acquire auxiliary decision-making parameters associated with the user and/or the interactive voice.

[0061] S230. Generate an interaction feedback result matching the interactive voice according to the interaction emotion and the auxiliary decision-making parameters, and provide the interaction feedback result to the user.
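A minimal sketch of what step S230 could look like, assuming a simple tone-selection policy. The policy, the field names, and the idea of carrying the auxiliary parameters along unchanged are illustrative assumptions, not details from the disclosure.

```python
def generate_interaction_feedback(interaction_emotion, auxiliary_params, reply_text):
    """Hypothetical S230 step: combine the recognized interaction emotion with
    the auxiliary decision-making parameters to choose a feedback emotion and
    package the interaction feedback result."""
    # Illustrative policy only: answer negative emotions in a soothing tone,
    # everything else in a cheerful tone.
    negative = {"sadness", "anger", "fear", "disgust"}
    feedback_emotion = "soothing" if interaction_emotion in negative else "cheerful"
    # Auxiliary parameters (e.g. time of day or a user profile) could further
    # adjust the wording; here they are simply attached to the result.
    return {
        "feedback_emotion": feedback_emotion,
        "reply": reply_text,
        "auxiliary": auxiliary_params,
    }

# Example: a sad user asking about the weather.
result = generate_interaction_feedback(
    "sadness", {"time_of_day": "evening"}, "It will be sunny tomorrow."
)
```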

[0062] S240. If a new interactive voice input by the user in response to the currently provided interaction feedback result is received, acquire historical machine feedback emotions and historical machine feedback patte...
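The paragraph above is truncated in the source, so the following sketch only illustrates the history lookup it begins to describe: a per-user store of earlier machine feedback that can be consulted when a follow-up voice arrives. The class, its fields, and its methods are assumptions for illustration.

```python
class FeedbackHistory:
    """Hypothetical per-user store of earlier machine feedback, illustrating
    the history lookup sketched in S240 (the stored fields are assumptions,
    since the source paragraph is truncated)."""

    def __init__(self):
        self._records = {}  # user_id -> list of earlier feedback results

    def record(self, user_id, feedback_result):
        self._records.setdefault(user_id, []).append(feedback_result)

    def history_for(self, user_id):
        # Earlier machine feedback (including its feedback emotion) for this
        # user, so that a new interactive voice referring to the current
        # feedback can be answered consistently with what was said before.
        return list(self._records.get(user_id, []))
```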

Embodiment 3

[0072] Figure 3a is a flowchart of a voice interaction method provided by Embodiment 3 of the present invention. As shown in Figure 3a, and as a further elaboration of the above embodiments, the method comprises the following steps:

[0073] S310. When the interactive voice input by the user is detected, acquire confidence levels of at least two preset emotions as emotion recognition results, based respectively on the user's face image, the emotional feature information in the interactive voice, and the semantic recognition result of the interactive voice.

[0074] S320. Among the at least two emotion recognition results obtained by the at least two emotion recognition methods, calculate a comprehensive confidence for each preset emotion according to the confidences corresponding to that same preset emotion and a preset weighting algorithm; according to the comprehensive confidence calculation result, determine the interaction emotion corresp...
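One way to read the fusion in S320 is as a weighted combination of the per-method confidences. The sketch below uses a normalized weighted sum, which is only one possible choice for the "preset weighting algorithm" mentioned in the text; the method names and weights in the example are made up.

```python
def fuse_emotion_confidences(results_by_method, method_weights):
    """Illustrative reading of S320: given per-method confidence maps over the
    same preset emotions, compute a weighted comprehensive confidence for each
    emotion and return the emotion with the highest score."""
    total = sum(method_weights[m] for m in results_by_method)
    comprehensive = {}
    for method, confidences in results_by_method.items():
        weight = method_weights[method] / total
        for emotion, confidence in confidences.items():
            comprehensive[emotion] = comprehensive.get(emotion, 0.0) + weight * confidence
    interaction_emotion = max(comprehensive, key=comprehensive.get)
    return interaction_emotion, comprehensive

# Example with the three recognition sources named in S310 and made-up weights.
emotion, scores = fuse_emotion_confidences(
    {
        "face_image": {"happiness": 0.6, "sadness": 0.2},
        "acoustic":   {"happiness": 0.4, "sadness": 0.5},
        "semantic":   {"happiness": 0.7, "sadness": 0.1},
    },
    {"face_image": 0.3, "acoustic": 0.4, "semantic": 0.3},
)
```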



Abstract

The invention discloses a voice interaction method and device, equipment and a storage medium. The method comprises the following steps: when an interactive voice input by a user is detected, carrying out emotion recognition on the user and determining the interaction emotion corresponding to the user according to the emotion recognition result; acquiring auxiliary decision-making parameters associated with the user and/or the interactive voice; and, according to the interaction emotion and the auxiliary decision-making parameters, generating an interaction feedback result matched with the interactive voice and providing the interaction feedback result to the user. Because the voice interaction method provided by the invention determines the interaction feedback result according to both the interaction emotion in the interactive voice and the auxiliary decision-making parameters associated with the user and/or the interactive voice, the accuracy of voice emotion recognition can be improved, and the effectiveness of human-computer interaction is thereby improved.

Description

technical field
[0001] Embodiments of the present invention relate to the technical field of human-computer interaction, and in particular to a voice interaction method, device, equipment, and storage medium.
Background technique
[0002] With the rapid development of artificial intelligence technology and the continuous rise of the corresponding interactive experience requirements, intelligent interaction has gradually begun to replace some traditional human-computer interaction methods.
[0003] Existing intelligent interaction methods can only roughly analyze the semantic information of user messages and cannot identify the user's current emotional state; regardless of whether that state is positive or negative, the machine always feeds back in the same positive tone, and therefore cannot provide personalized response services based on the user's emotional state.
Contents of the invention
[0004] Embodiments of the present invention provide a voice interaction...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G10L15/22; G10L25/63
CPC: G10L15/22; G10L25/63
Inventor: 李士岩, 孙妍彦, 李扬, 张晓东, 赵敏, 齐健平, 葛翔, 王婷, 邹黎明, 李丹
Owner: BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD