
A task type dialogue system based on deep network learning

A dialogue system using deep network technology, applied in the field of recommendation systems, which can solve problems such as the inability to capture word position information and the inability to recognize unregistered words.

Inactive Publication Date: 2019-05-07
SUN YAT SEN UNIV

Problems solved by technology

[0005] Aiming at the problem that the bag-of-words and averaged word vectors in the existing hybrid code network recommendation system can neither capture the position information of words in a sentence nor recognize words not registered in the system, the present invention proposes a task-based dialogue system based on deep network learning. The technical scheme adopted by the present invention is:



Examples


Embodiment 1

[0041] As shown in Figures 1 and 2, a task-based dialogue system based on deep network learning includes a language processing module, a recurrent neural network, a classification layer module, an answer generation module, a domain knowledge module and a neural network selection layer module; the language processing module includes a word-character-level recurrent network, and the domain knowledge module includes answer templates.
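
The patent does not disclose implementation code, so the following is only a minimal structural sketch, in Python/PyTorch, of how the modules listed in [0041] could fit together. All class, attribute and dimension names (TaskDialogueSystem, emb_dim, hid_dim, and so on) are illustrative assumptions, not the patent's identifiers.

import torch.nn as nn

class TaskDialogueSystem(nn.Module):
    """Sketch of the module layout in [0041] (assumed names and sizes)."""
    def __init__(self, word_vocab, char_vocab, n_templates, n_subnetworks,
                 emb_dim=128, hid_dim=256):
        super().__init__()
        # Language processing module: word embeddings plus a character-level
        # recurrent network for words missing from the word vocabulary.
        self.word_emb = nn.Embedding(word_vocab, emb_dim)
        self.char_rnn = nn.GRU(char_vocab, emb_dim, batch_first=True)
        # Dialogue-level recurrent neural network over the concatenated
        # sentence vectors of the user input and the previous system answer.
        self.dialogue_rnn = nn.GRUCell(2 * emb_dim, hid_dim)
        # Classification layer module: scores the answer templates.
        self.classifier = nn.Linear(hid_dim, n_templates)
        # Neural network selection layer module: chooses among sub-networks.
        self.selector = nn.Linear(hid_dim, n_subnetworks)
        # The answer generation module fills the chosen template with entities
        # from the domain knowledge module (not modelled in this sketch).

# Toy instantiation with assumed vocabulary and template counts.
system = TaskDialogueSystem(word_vocab=10000, char_vocab=30,
                            n_templates=16, n_subnetworks=3)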

[0042] The workflow of the system is as follows:

[0043] S10. In the language understanding stage, the word-character-level recurrent network is used to encode the user's latest input Q_t and the dialogue system's most recent answer A_t, obtaining the corresponding sentence vectors O_q(t) and O_a(t). After encoding, the two sentence vectors are concatenated (vector splicing) and used as the input of the recurrent neural network: ...
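
As a concrete illustration of step S10, the sketch below encodes Q_t and A_t into sentence vectors O_q(t) and O_a(t), concatenates them, and advances the dialogue-level recurrent network by one step. The encoder here simply averages known word vectors as a stand-in; the patent's actual encoder is the word-character-level recurrent network of Embodiment 2, and all dimensions and names are assumptions.

import torch
import torch.nn as nn

emb_dim, hid_dim = 128, 256                        # assumed dimensions
dialogue_rnn = nn.GRUCell(2 * emb_dim, hid_dim)

def encode_sentence(tokens, word_vectors):
    # Placeholder sentence encoder: average the word vectors that are known.
    vecs = [word_vectors[w] for w in tokens if w in word_vectors]
    return torch.stack(vecs).mean(dim=0) if vecs else torch.zeros(emb_dim)

def dialogue_step(q_t, a_t, h_prev, word_vectors):
    o_q = encode_sentence(q_t, word_vectors)       # O_q(t), user's latest input
    o_a = encode_sentence(a_t, word_vectors)       # O_a(t), system's last answer
    x_t = torch.cat([o_q, o_a], dim=-1)            # vector splicing (concatenation)
    return dialogue_rnn(x_t.unsqueeze(0), h_prev)  # updated dialogue state

# Toy usage with random word vectors and a zero initial state.
word_vectors = {"for": torch.randn(emb_dim), "food": torch.randn(emb_dim)}
h0 = torch.zeros(1, hid_dim)
h1 = dialogue_step(["for", "vietnamese", "food", "?"], ["hello"], h0, word_vectors)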

Embodiment 2

[0067] This embodiment is a specific embodiment of the word-character-level recurrent network, as shown in Figure 2. Some words, such as proper nouns or typos entered by the user, are not in word2vec, so word2vec cannot be used to obtain a corresponding word vector for them. To address this, a character-level recurrent neural network is introduced to encode unseen words.

[0068] A detailed illustration of this technique is shown in the figure. Take the encoding of the sentence "for vietnamese food?" as an example: "for" and "food" can be found directly in word2vec and their word vectors retrieved, but "vietnamese" is a word that has never been seen before, so it is converted character by character into the corresponding one-hot codes, which are fed into the character-level recurrent neural network, and the last hidden layer of that network is taken as the word's vector representation.
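
A minimal sketch of this fallback, assuming a hypothetical character alphabet and the same assumed embedding size as above; the patent's actual character vocabulary and dimensions are not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

alphabet = "abcdefghijklmnopqrstuvwxyz'?-"          # assumed character set
char_to_id = {c: i for i, c in enumerate(alphabet)}
emb_dim = 128
char_rnn = nn.GRU(len(alphabet), emb_dim, batch_first=True)

def word_vector(word, word2vec):
    # Known words come straight from word2vec.
    if word in word2vec:
        return word2vec[word]
    # Unseen words (e.g. "vietnamese"): one-hot encode each character and run
    # the character-level GRU; its last hidden state represents the word.
    ids = torch.tensor([char_to_id[c] for c in word.lower() if c in char_to_id])
    one_hot = F.one_hot(ids, num_classes=len(alphabet)).float().unsqueeze(0)
    _, h_last = char_rnn(one_hot)
    return h_last.squeeze(0).squeeze(0)             # shape: (emb_dim,)

print(word_vector("vietnamese", {}).shape)          # torch.Size([128])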

Embodiment 3

[0070] This embodiment describes how the hybrid code network and the neural network selection layer are combined, how a neural network is selected, and how domain knowledge is introduced into the hierarchical hybrid code network. Domain knowledge is applied mainly in the following aspects: key entity recognition, behavior template summarization and database object recommendation.

[0071] Taking the data set in Table 1 as an example, there are four entity types in this task: dishes, unregistered dishes, locations, and prices. We use simple string matching to find the corresponding entities in the user input, and then use a binary vector to represent the presence or absence of each entity. Next, we summarize from the training set the templates the dialogue system uses to answer. For example, "pipasha restaurant is a nice place in the east of town and the prices are expensive" can be abstracted as "<restaurant> is a nice place in the <location> of town and the prices are <price>". In this way, we have summarized a total of 77 t...
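
The following sketch illustrates the two domain-knowledge features just described: string-match entity detection packed into a binary presence vector, and abstraction of a system answer into a template. The entity lexicons and placeholder tokens are illustrative assumptions; the actual values come from the data set in Table 1.

ENTITY_LEXICON = {                                  # assumed toy lexicons
    "cuisine":  {"vietnamese", "italian", "british"},
    "location": {"east", "west", "north", "south", "centre"},
    "price":    {"cheap", "moderate", "expensive"},
}

def entity_presence_vector(user_input):
    # Simple string matching: 1 if any value of the entity type appears.
    tokens = user_input.lower().split()
    return [int(any(v in tokens for v in values))
            for values in ENTITY_LEXICON.values()]

def abstract_template(answer, restaurant, location, price):
    # Replace concrete entity values with placeholders to obtain a template.
    return (answer.replace(restaurant, "<restaurant>")
                  .replace(location, "<location>")
                  .replace(price, "<price>"))

print(entity_presence_vector("for vietnamese food in the east"))   # [1, 1, 0]
print(abstract_template("pipasha restaurant is a nice place in the east of "
                        "town and the prices are expensive",
                        "pipasha restaurant", "east", "expensive"))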


Abstract

According to the task-based dialogue system based on deep network learning provided by the invention, two new processing flows are put forward: a word-character-level recurrent network and neural network selection. The hierarchical hybrid code network exceeds other models and achieves state-of-the-art performance in both per-turn accuracy and per-dialogue accuracy. Its advantage over the hybrid code network stems mainly from how it encodes sentence vectors and from its ability to recognize unregistered words: the bag-of-words and averaged word vectors of the hybrid code network lack word order information, which is exactly what the word-character-level recurrent network captures.

Description

Technical field

[0001] The present invention relates to the field of recommendation systems, and more specifically, to a task-based dialogue system based on deep network learning.

Background technique

[0002] The task-oriented dialogue system is a brand-new way of human-computer interaction that has attracted wide attention from industry and academia since its birth. Unlike open-conversation bots, such systems are task-centric, guiding users and providing necessary information rather than making ordinary small talk. The traditional approach decomposes the dialogue task into a pipeline: natural language understanding, dialogue state tracking, dialogue policy learning and natural language generation, with each module working independently. This type of model is not only poor in portability, but also needs annotated data for each module, which is time-consuming and laborious. The more popular research now treats the dialogue system as a whole, training the model direc...


Application Information

IPC(8): G06F16/332, G06N3/08, G06N3/04
Inventor 杨猛 (Yang Meng), 梁伟日 (Liang Weiri)
Owner SUN YAT SEN UNIV