
Conversation control device and conversation control method

A dialogue control technology, applied in speech analysis, instrumentation, and electrical digital data processing, addressing problems such as the manual checking required to update dictionaries and the resulting inability to correctly estimate the user's intention.

Publication status: Inactive | Publication date: 2017-08-18
MITSUBISHI ELECTRIC CORP

AI Technical Summary

Problems solved by technology

[0010] However, in the technique of the above-mentioned Patent Document 1, manual checking is required to update the thesaurus dictionary, and it is not easy to cover all words, so the user's intention cannot be correctly estimated.

Examples

Embodiment 1

[0038] Figure 1 is a block diagram showing the configuration of the dialogue control device 100 according to Embodiment 1.

[0039] The dialogue control device 100 according to Embodiment 1 includes a speech input unit 101, a speech recognition dictionary storage unit 102, a speech recognition unit 103, a morpheme analysis dictionary storage unit 104, a morpheme analysis unit (text analysis unit) 105, an intention estimation model storage unit 106, an intention estimation processing unit 107, an unknown word extraction unit 108, a dialogue script data storage unit 109, a response sentence generation unit 110, a speech synthesis unit 111, and a speech output unit 112.

[0040] Hereinafter, a case where the dialogue control device 100 is applied to a car navigation system will be described as an example; however, the application target is not limited to navigation systems and may be changed as appropriate. In addition, a case where the user communicates with the dialogue control dev...
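To make the data flow through these units concrete, here is a minimal, hypothetical Python sketch of the Figure 1 pipeline: each method stands in for one numbered unit, and the tiny intent model, the canned recognition result, and the whitespace tokenizer are illustrative assumptions rather than the patented implementation.

```python
# Hypothetical sketch of the Figure 1 block diagram: each method stands in for one
# numbered unit; the tiny intent model and canned recognition result are illustrative
# assumptions, not the patented implementation.
from dataclasses import dataclass, field


@dataclass
class DialogueControlDevice:
    # assumed contents of the intention estimation model storage unit 106
    intent_model: dict = field(default_factory=lambda: {"destination": "SET_DESTINATION"})

    def speech_input(self) -> bytes:                        # speech input unit 101
        return b"<pcm audio>"                               # placeholder audio

    def speech_recognition(self, audio: bytes) -> str:      # speech recognition unit 103
        return "set destination to the station"             # placeholder recognition result

    def morpheme_analysis(self, text: str) -> list:         # morpheme analysis unit 105
        return text.split()                                 # stand-in for dictionary-based analysis (104)

    def intention_estimation(self, morphemes: list):        # intention estimation processing unit 107
        hits = {self.intent_model[m] for m in morphemes if m in self.intent_model}
        return hits.pop() if len(hits) == 1 else None       # a unique intent, or nothing

    def unknown_word_extraction(self, morphemes: list) -> list:   # unknown word extraction unit 108
        return [m for m in morphemes if m not in self.intent_model]

    def response_generation(self, intent, unknown) -> str:  # response sentence generation unit 110
        if intent is not None:
            return f"Executing {intent}."
        if unknown:
            return f"What do you mean by '{unknown[0]}'?"
        return "Could you rephrase that?"

    def run_turn(self) -> str:
        text = self.speech_recognition(self.speech_input())
        morphemes = self.morpheme_analysis(text)
        intent = self.intention_estimation(morphemes)
        unknown = [] if intent else self.unknown_word_extraction(morphemes)
        return self.response_generation(intent, unknown)    # synthesis (111) and output (112) would follow


print(DialogueControlDevice().run_turn())
```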

Embodiment 2

[0085] In this second embodiment, a configuration is shown in which a syntactic analysis is further performed on the morphological analysis result, and unknown word extraction is performed using the result of the syntactic analysis.

[0086] Figure 9 is a block diagram showing the configuration of the dialogue control device 100a according to Embodiment 2.

[0087] In Embodiment 2, the unknown word extraction unit 108a further includes a syntax analysis unit 113, and the intention estimation model storage unit 106a stores a frequent word list in addition to the intention estimation model. In the following, the same reference numerals as those used in Embodiment 1 are assigned to the same or equivalent components as those of the dialogue control device 100 according to Embodiment 1, and descriptions thereof are omitted or simplified.

[0088] The syntax analysis unit 113 also performs syntax analysis on the morphological analysis result analyzed by the morphological anal...
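A minimal sketch of the idea in this embodiment, under stated assumptions: the dependency pairs are supplied by hand rather than produced by the syntax analysis unit 113, and the intent-model vocabulary and the frequent word list are tiny hard-coded sets; only words that stand in a syntactic relation to a word the model already knows, and that are not on the frequent word list, survive as unknown-word candidates.

```python
# Hypothetical sketch of Embodiment 2: restrict unknown-word candidates to words that
# stand in a syntactic (dependency) relation to a word the intent model already knows,
# and drop anything on the frequent word list. All data here are illustrative.
INTENT_MODEL_VOCAB = {"set", "destination", "route"}   # words stored in the intention estimation model
FREQUENT_WORDS = {"the", "to", "a", "please"}          # frequent word list kept in storage unit 106a


def extract_unknown_words(dependencies):
    """dependencies: (head, dependent) pairs, assumed to come from syntax analysis unit 113."""
    unknown = []
    for head, dependent in dependencies:
        related_to_known = head in INTENT_MODEL_VOCAB or dependent in INTENT_MODEL_VOCAB
        for word in (head, dependent):
            if (related_to_known
                    and word not in INTENT_MODEL_VOCAB
                    and word not in FREQUENT_WORDS
                    and word not in unknown):
                unknown.append(word)
    return unknown


# "set destination to Aqualand": the unknown proper noun depends on a known word, so it
# survives; function words on the frequent word list are filtered out.
deps = [("set", "destination"), ("destination", "Aqualand"), ("set", "to")]
print(extract_unknown_words(deps))   # ['Aqualand']
```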

Embodiment 3

[0116] In this third embodiment, a configuration is shown in which a known word extraction process, the opposite of the unknown word extraction process of Embodiments 1 and 2 above, is performed using the morphological analysis result.

[0117] Figure 15 is a block diagram showing the configuration of the dialogue control device 100b according to Embodiment 3.

[0118] In Embodiment 3, a known word extraction unit 114 is provided in place of the unknown word extraction unit 108 of the dialogue control device 100 according to Embodiment 1 shown in Figure 1. In the following, the same reference numerals as those used in Embodiment 1 are assigned to the same or equivalent components as those of the dialogue control device 100 according to Embodiment 1, and descriptions thereof are omitted or simplified.

[0119] The known word extraction unit 114 extracts, among the parts of speech obtained by the morphological analysis unit 105, those that are not stored in the intentio...
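Assuming, per paragraph [0116], that "known words" are the words already stored in the intention estimation model, the mirror image of the earlier unknown-word extraction might look like the following sketch; the vocabulary, the omission of any part-of-speech filtering, and the wording of the response are all illustrative assumptions.

```python
# Hypothetical sketch of Embodiment 3: instead of collecting unknown words, keep the
# words the intention estimation model already stores and echo them back to the user.
# Vocabulary and response wording are illustrative assumptions.
INTENT_MODEL_VOCAB = {"set", "destination", "route"}


def extract_known_words(morphemes):
    """Filter the morpheme analysis result down to words stored in the intent model."""
    return [m for m in morphemes if m in INTENT_MODEL_VOCAB]


def generate_response(known_words):
    if known_words:
        return f"I understood: {', '.join(known_words)}. Could you say the rest differently?"
    return "Sorry, could you rephrase the whole request?"


morphemes = ["set", "destination", "to", "Aqualand"]
print(generate_response(extract_known_words(morphemes)))   # echoes 'set, destination'
```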

Abstract

The present invention is provided with: a morpheme analysis unit 105 for analysis of text inputted by a user in natural language; an intent-inference processing unit 107 that, making reference to an intent inference model in which words and user intent inferred from the words are stored in associated form, infers the intent of the user from the result of the text analysis by the morpheme analysis unit 105; an unknown term extraction unit 108 that, in the event that the intent of the user cannot be uniquely identified by the intent-inference processing unit 107, extracts from the text analysis results an unknown term that is a word not stored in the intent inference model; and a response sentence generation unit 110 for generating a response sentence that includes the unknown term extracted by the unknown term extraction unit 108.
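The following is a hedged Python sketch of the control flow the abstract describes, not the patented implementation: the word-to-intent model and the count-based scoring are toy assumptions, and "cannot be uniquely identified" is modeled here as a tie or an empty score.

```python
# Hypothetical sketch of the abstract's control flow: score intents by counting model
# words found in the analyzed text; a tie or an empty score stands in for "intent not
# uniquely identified", after which unknown terms feed the response sentence.
from collections import Counter

INTENT_MODEL = {                      # word -> user intent inferred from that word (toy data)
    "destination": "NAV.SET_DESTINATION",
    "route": "NAV.SET_DESTINATION",
    "volume": "AUDIO.SET_VOLUME",
}


def estimate_intent(morphemes):
    scores = Counter(INTENT_MODEL[m] for m in morphemes if m in INTENT_MODEL)
    ranked = scores.most_common(2)
    if len(ranked) == 1 or (len(ranked) == 2 and ranked[0][1] > ranked[1][1]):
        return ranked[0][0]           # uniquely identified
    return None                       # ambiguous or no evidence


def respond(morphemes):
    intent = estimate_intent(morphemes)
    if intent is not None:
        return f"OK, performing {intent}."
    unknown = [m for m in morphemes if m not in INTENT_MODEL]
    if unknown:
        return f"I don't know the word '{unknown[-1]}'. What does it mean?"
    return "Could you phrase that differently?"


print(respond(["set", "destination", "to", "Aqualand"]))   # intent uniquely identified
print(respond(["go", "to", "Aqualand"]))                   # unknown term appears in the response
```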

Description

Technical Field

[0001] The present invention relates to a dialogue control device and a dialogue control method that, for example, recognize a user's speech input or text input through a keyboard, estimate the user's intention from the recognition result, and conduct a dialogue for performing the operation the user desires.

Background Art

[0002] In recent years, voice recognition devices have been used to operate equipment. Such a device takes, for example, speech uttered by a person as input and executes an operation based on the recognition result of that input. Conventionally, a presumed speech recognition result is associated in advance with a system operation, and the operation is performed when the actual recognition result matches the presumed one. The user therefore needs to keep in mind what the system expects while it is waiting, in order to perform an operation.

[0003] As a techn...
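For contrast with the invention, here is a minimal sketch of the conventional scheme described in paragraph [0002]: a fixed table maps each presumed recognition result to an operation, so any utterance outside the table is simply rejected. The table entries and phrasing are illustrative assumptions.

```python
# Hypothetical sketch of the conventional approach from paragraph [0002]: a fixed table
# maps each presumed recognition result to an operation, and anything else is rejected.
# Entries are illustrative.
PRESUMED_RESULTS = {
    "set destination": "open the destination-entry screen",
    "turn up the volume": "increase audio volume",
}


def conventional_dispatch(recognition_result: str) -> str:
    operation = PRESUMED_RESULTS.get(recognition_result)
    return operation if operation else "no operation (utterance was not anticipated)"


print(conventional_dispatch("set destination"))              # matches a presumed result
print(conventional_dispatch("I want to go somewhere fun"))   # free speech is rejected
```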

Application Information

Patent Type & Authority: Application (China)
IPC (8): G10L15/22; G10L13/00
CPC: G10L15/26; G06F40/247; G06F40/35; G06F40/211; G06F40/268; G06F40/284
Inventors: 小路悠介, 藤井洋一, 石井纯
Owner: MITSUBISHI ELECTRIC CORP