Hierarchical Methods and Apparatus for Extracting User Intent from Spoken Utterances

A hierarchical user-intent technology, applied in the field of extracting user intent from spoken utterances, addresses several problems: conventional approaches require a human operator, people do not talk or think in terms of specific machine-based grammar, and users forget the precise predetermined commands that must be uttered.

Inactive Publication Date: 2008-09-11
NUANCE COMM INC


Benefits of technology

[0009]Principles of the present invention provide improved techniques for permitting a user to employ more human-based grammar (i.e., free form or conversational input) while addressing a target system via a voice system.

Problems solved by technology

Unfortunately, people do not talk or think in terms of specific machine-based grammar, and may also forget the precise predetermined commands that must be uttered to effectuate their wishes.
However, a major problem with this approach is that a human operator is required.
One major difficulty in this second approach is that statistical parsers are huge in terms of storage requirements.
Further, they require hand-tuning in every step.
That is, every time data is added, the statistical parser requires a tremendous amount of hand-tuning and balancing of the new data with the old data.




Embodiment Construction

[0021]While the present invention may be illustratively described below in the context of a vehicle-based voice system, it is to be understood that principles of the invention are not limited to any particular computing system environment or any particular speech recognition application. Rather, principles of the invention are more generally applicable to any computing system environment and any speech recognition application in which it would be desirable to permit the user to provide free form or conversational speech input.

[0022]Principles of the invention address the problem of extracting user intent from free form-type spoken utterances. For example, returning to the vehicle-based climate control example described above, principles of the invention permit a driver to interact with a voice system in the vehicle by giving free form voice instructions that are different than the precise (machine-based grammar) voice commands understood by the climate control system. Thus, in this ...



Abstract

Improved techniques are disclosed for permitting a user to employ more human-based grammar (i.e., free form or conversational input) while addressing a target system via a voice system. For example, a technique for determining intent associated with a spoken utterance of a user comprises the following steps/operations. Decoded speech uttered by the user is obtained. An intent is then extracted from the decoded speech uttered by the user. The intent is extracted in an iterative manner such that a first class is determined after a first iteration and a sub-class of the first class is determined after a second iteration. The first class and the sub-class of the first class are hierarchically indicative of the intent of the user, e.g., a target and data that may be associated with the target. The multi-stage intent extraction approach may have more than two iterations. By way of example only, the user intent extracting step may further determine a sub-class of the sub-class of the first class after a third iteration, such that the first class, the sub-class of the first class, and the sub-class of the sub-class of the first class are hierarchically indicative of the intent of the user.
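The iterative narrowing described in the abstract can be sketched in code. The following is an illustrative sketch only, not the patented implementation: the class names, keyword tables, and scoring heuristic below are all hypothetical stand-ins for a first iteration that picks a top-level class and a second iteration that picks a sub-class of it.

```python
from typing import Optional, List

# Stage 1: hypothetical top-level target classes, keyed by trigger words.
TARGETS = {
    "climate": {"cold", "hot", "warm", "temperature", "fan", "heat"},
    "audio":   {"radio", "song", "volume", "music"},
}

# Stage 2: hypothetical sub-classes within each top-level class.
SUBCLASSES = {
    "climate": {
        "raise_temperature": {"cold", "freezing", "warmer", "heat"},
        "lower_temperature": {"hot", "cooler"},
    },
    "audio": {
        "change_volume":  {"volume", "loud", "quiet"},
        "change_station": {"radio", "station", "song"},
    },
}

def classify(tokens: set, table: dict) -> Optional[str]:
    """Pick the class whose keyword set overlaps the utterance most."""
    scored = {name: len(tokens & kws) for name, kws in table.items()}
    best = max(scored, key=scored.get)
    return best if scored[best] > 0 else None

def extract_intent(utterance: str) -> List[str]:
    """Iteratively narrow the intent: first a class, then its sub-class."""
    tokens = set(utterance.lower().split())
    path: List[str] = []
    target = classify(tokens, TARGETS)          # first iteration
    if target:
        path.append(target)
        sub = classify(tokens, SUBCLASSES[target])  # second iteration
        if sub:
            path.append(sub)
    return path

print(extract_intent("it is too cold in here"))
# → ['climate', 'raise_temperature']
```

A third iteration would simply repeat the same step against a table keyed by the sub-class, yielding a deeper path such as target, sub-class, and data.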

Description

CROSS-REFERENCE TO RELATED APPLICATION(S)

[0001]This application is a continuation of U.S. application Ser. No. 11/216,483 filed on Aug. 31, 2005, the disclosure of which is incorporated herein by reference.

FIELD OF INVENTION

[0002]The present invention relates generally to speech processing systems and, more particularly, to systems for hierarchically extracting user intent from spoken utterances, such as spoken instructions or commands.

BACKGROUND OF THE INVENTION

[0003]The use of a speech recognition system (or a voice system) to translate a user's spoken command to a precise text command that the target system can input and process is well known. For example, in a conventional voice system based in a vehicle, a user (e.g., driver) interacts with the voice system by uttering very specific commands that must be consistent with machine-based grammar that is understood by the target system.

[0004]By way of example, assume that the climate control system in the vehicle is the target syste...
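The conventional approach criticized in the background can be sketched as a rigid grammar matcher. This is a hypothetical illustration of the problem, not any system described in the patent: the grammar entries and function below are invented for the sketch, and show why free-form input fails against a fixed machine-based grammar.

```python
# Hypothetical fixed machine-based grammar: only these exact
# predetermined commands are understood by the target system.
MACHINE_GRAMMAR = {
    "set temperature to 70",
    "fan speed high",
    "defrost on",
}

def conventional_voice_system(decoded: str) -> str:
    """Accept only utterances that match the fixed grammar verbatim."""
    if decoded.lower() in MACHINE_GRAMMAR:
        return f"COMMAND: {decoded.lower()}"
    return "REJECTED: not in grammar"

print(conventional_voice_system("set temperature to 70"))   # accepted
print(conventional_voice_system("it is too cold in here"))  # free form fails
```

A user who forgets the exact wording, or speaks conversationally, is rejected; the hierarchical extraction technique above is aimed at exactly this gap.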


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): G10L21/00
CPC: G10L15/1815; G10L15/1822; G10L2015/226
Inventor: KANEVSKY, DIMITRI; REISINGER, JOSEPH SIMON; SICCONI, ROBERT; VISWANATHAN, MAHESH
Owner: NUANCE COMM INC