
Intelligent voice interaction method and device

A technology of intelligent voice interaction methods, applied in speech analysis, speech recognition, instruments, etc., addressing problems such as interaction effects being strongly influenced by the environment, understanding errors, and difficulty in guaranteeing high accuracy.

Active Publication Date: 2020-11-17
IFLYTEK CO LTD

AI Technical Summary

Problems solved by technology

Voice interaction is one of the mainstream interaction methods for smart devices, and its convenience and speed are widely recognized. However, its interaction effect is strongly affected by the environment. Although many mature technical solutions exist to improve voice interaction, its accuracy is still difficult to guarantee in more complex interaction environments.
For example, in the vehicle environment it is affected by the following factors: the user is generally a certain distance away from the vehicle microphone; there are various noises in the vehicle (such as tire noise, air-conditioning noise, and external noise); and users express themselves in diverse ways. As a result, the interactive speech may be misunderstood, causing the head unit to respond incorrectly and degrading the user experience.




Embodiment Construction

[0091] To enable those skilled in the art to better understand the solutions of the embodiments of the present invention, these embodiments are further described in detail below in conjunction with the accompanying drawings and specific implementations.

[0092] Existing intelligent interaction methods in the vehicle environment generally perform semantic understanding only on the current round of human-machine interaction. However, in such environments there is also human-human interaction speech, which usually contains information related to the content of the human-machine interaction: for example, when a user in the car talks with other passengers or makes a phone call, much of what is said implies information related to head-unit services. This information is of great help in improving intention understanding in human-machine interaction. To this end, the embodiments of the present invention correct the semantic understanding of human-machine interaction using related information from human-human interaction data.
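
As a rough illustration of this idea only, the sketch below shows one way such human-human cues could be mined and cached for later use. It is not the patent's implementation: every name in it (ContextStore, mine_human_human_utterance, VEHICLE_KEYWORDS) is a hypothetical placeholder, and simple keyword spotting stands in for a real semantic extractor.

    # Hypothetical sketch: cache head-unit-related cues heard in human-human speech,
    # e.g. a destination mentioned while the user chats with a passenger.
    import time

    class ContextStore:
        """Keeps recently mined human-human cues with timestamps so stale ones expire."""

        def __init__(self, ttl_seconds=300):
            self.ttl = ttl_seconds
            self.items = []  # list of (timestamp, keyword, utterance)

        def add(self, keyword, utterance):
            self.items.append((time.time(), keyword, utterance))

        def recent(self):
            """Return only cues younger than the TTL, as a keyword -> utterance dict."""
            now = time.time()
            return {kw: utt for ts, kw, utt in self.items if now - ts <= self.ttl}

    # Crude keyword spotting standing in for a real semantic extractor.
    VEHICLE_KEYWORDS = {"navigate", "destination", "air conditioner", "music", "call"}

    def mine_human_human_utterance(text, store):
        lowered = text.lower()
        for kw in VEHICLE_KEYWORDS:
            if kw in lowered:
                store.add(kw, text)

    store = ContextStore()
    mine_human_human_utterance("Let's navigate to the airport after lunch", store)
    print(store.recent())  # {'navigate': "Let's navigate to the airport after lunch"}

A time-to-live on the cached cues is only one possible design choice here; the patent text above does not specify how long such information would be retained.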



Abstract

The invention discloses an intelligent voice interaction method and device. The method comprises the following steps: acquiring human-machine interaction voice data and performing semantic comprehension on it to obtain the current semantic comprehension result; judging whether the current semantic comprehension result is trusted; responding to the result if it is trusted; and, when it is not trusted, correcting the current semantic comprehension result based on stored related information of human-human interaction data to obtain a corrected semantic comprehension result, and responding to the corrected result. Conventional intelligent interaction methods in a vehicle-mounted environment use only the human-machine interaction information, yet when users in the vehicle chat with other passengers or call other persons, information related to head-unit services is often implied in that speech. By using the related information of the human-human interaction data to correct semantic comprehension in human-machine interaction, the accuracy of human-machine semantic comprehension is improved, and so is the user experience.
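
To make the claimed flow concrete, here is a minimal sketch, under assumed interfaces, of the steps listed in the abstract: understand the human-machine voice data, judge whether the result is trusted, and either respond directly or correct the result with stored human-human information first. The callables understand, is_trusted, correct_with_context, and respond are hypothetical stand-ins, not interfaces from the patent.

    # Hypothetical sketch of the dispatch logic described in the abstract.
    def handle_human_machine_voice(voice_data, human_human_context,
                                   understand, is_trusted, correct_with_context, respond):
        result = understand(voice_data)          # current semantic comprehension result
        if is_trusted(result):                   # e.g. a confidence threshold
            return respond(result)
        corrected = correct_with_context(result, human_human_context)
        return respond(corrected)                # respond to the corrected result

    # Toy wiring just to show the control flow.
    out = handle_human_machine_voice(
        voice_data="navigate there",
        human_human_context={"navigate": "Let's navigate to the airport"},
        understand=lambda v: {"intent": "navigate", "slot": None, "confidence": 0.4},
        is_trusted=lambda r: r["confidence"] >= 0.7,
        correct_with_context=lambda r, ctx: {**r, "slot": ctx.get("navigate"), "confidence": 0.9},
        respond=lambda r: f"responding with {r}",
    )
    print(out)

Here a confidence threshold is used as the trust check purely for illustration; the abstract does not say how the trust judgment is made.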

Description

Technical field

[0001] The invention relates to the field of voice signal processing, and in particular to an intelligent voice interaction method and device.

Background technique

[0002] With the increasing maturity of artificial-intelligence technologies, people's lives have begun to become intelligent, and various smart devices, such as smart cars, have gradually entered daily life. Voice interaction is one of the mainstream interaction methods for smart devices, and its convenience and speed are widely recognized. However, its interaction effect is strongly affected by the environment; although many mature technical solutions exist to improve voice interaction, its accuracy is still difficult to guarantee in more complex interaction environments. For example, in the vehicle environment it is affected by the following factors: the user is generally a certain distance away from the vehicle microphone, ...


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G10L15/18, G10L15/26, G10L17/22
CPC: G10L15/1815, G10L15/26, G10L17/22
Inventor: 李深安, 马军涛, 王兴宝, 庄纪军, 王雪初, 孔祥星, 韩后岳
Owner: IFLYTEK CO LTD