
Novel multi-round dialogue natural language understanding model based on BERTCONTEXT

A natural language understanding model technology, applicable to biological neural network models, electrical digital data processing, and special-purpose data processing applications. It addresses the problems of poor NLU task performance, the lack of NLU models designed for multi-round dialogue, and incomplete semantic information, achieving improved accuracy, a reduced risk of overfitting, and a better user experience.

Inactive Publication Date: 2021-04-16
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

The NLU task studied by this model targets only the current sentence; no NLU has been designed for multi-round dialogue. Yet most human-machine dialogue scenarios today involve multiple rounds, and multi-round dialogue must take the dialogue history into account.
[0003] When humans hold multi-round conversations, they omit information, or replace it with demonstrative pronouns, based on what was said earlier in the chat. If NLU uses only the information in the current sentence, the learned semantic information will be incomplete or even wrong, and every NLU task will perform poorly.



Examples


Embodiment 1

[0028] Figure 2 is a structural schematic diagram of parameter-sharing-based multi-task learning in an embodiment of the present invention. When multiple tasks are learned together, the other, unrelated tasks act as noise for any given task during learning, and adding noise can improve the model's generalization ability. If several classification tasks are treated as independent problems, a separate model must be trained for each task and the results combined afterwards, which is not only computationally expensive but also severs the relationships between the tasks. Multi-task learning instead shares the underlying hidden-layer parameters across all tasks while reserving a few task-specific output layers. It learns multiple related tasks in parallel, and sharing the underlying parameters helps improve the generalization...
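The hard parameter sharing described above can be sketched as follows. This is a minimal NumPy illustration under assumed dimensions, not the patent's actual network: a single shared bottom layer feeds several task-specific output heads, so most parameters are reused across tasks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared bottom layer: one weight matrix reused by every task.
D_IN, D_HID = 16, 8
W_shared = rng.normal(size=(D_IN, D_HID))

# Task-specific output heads, e.g. domain (5 classes) and intent (10 classes).
# The task names and class counts here are illustrative assumptions.
heads = {
    "domain": rng.normal(size=(D_HID, 5)),
    "intent": rng.normal(size=(D_HID, 10)),
}

def forward(x, task):
    """Run the shared layer, then the output head for one task."""
    h = np.tanh(x @ W_shared)   # shared hidden representation
    logits = h @ heads[task]    # task-specific output layer
    return logits

x = rng.normal(size=(D_IN,))
print(forward(x, "domain").shape)  # (5,)
print(forward(x, "intent").shape)  # (10,)
```

Because the shared layer receives gradients from every task, each task acts as a regularizer for the others, which is the noise-as-generalization effect the paragraph above describes.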

Embodiment

[0069] The present invention carried out accuracy comparison and analysis experiments between the above-mentioned improved model and the original MT-DNN model on a public Chinese data set, specifically as follows:

[0070] The CrossWOZ data set is a large-scale Chinese multi-domain task-oriented dialogue data set comprising three parts: a training set, a validation set, and a test set. It contains 6k dialogues and 102k sentences across 5 domains. The training set consists of 5,012 dialogues, the validation set of 500 dialogues, and the test set of 500 dialogues. Each dialogue includes the sentence text, domain information, and intent and slot-filling information, so it can be used for multi-task natural language understanding.
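As a rough illustration, one annotated turn in a CrossWOZ-style data set might be represented as below. The field names are hypothetical, not the dataset's actual schema; the split sizes are those quoted above.

```python
# Hypothetical sketch of a CrossWOZ-style annotated dialogue turn;
# the field names are illustrative, not the dataset's real schema.
turn = {
    "text": "帮我订一家酒店",       # utterance text ("book me a hotel")
    "domain": "hotel",             # one of the 5 domains
    "intent": "inform",            # intent label
    "slots": {"type": "hotel"},    # slot-value pairs to fill
}

# Split sizes quoted in the text: 5012 train + 500 validation + 500 test.
splits = {"train": 5012, "validation": 500, "test": 500}
print(sum(splits.values()))  # 6012, i.e. roughly the 6k dialogues quoted
```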

[0071] The experiments use the TensorFlow deep learning framework. The new BERTCONTEXT model uses the word vectors trained by the BERT language model. In the improved BERTCONTEXT model, the BE...
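The core idea of BERTCONTEXT, folding dialogue history into the current utterance's input, might look like the tokenization sketch below. The `[CLS]`/`[SEP]` layout follows standard BERT conventions and the helper function is an assumption for illustration, not the patent's exact method.

```python
def build_input_with_history(history, current, max_history=3):
    """Concatenate recent dialogue history with the current utterance
    in a BERT-style input: [CLS] history [SEP] current [SEP].
    This layout and the history window are assumptions, not the
    patent's confirmed design."""
    recent = history[-max_history:]
    tokens = ["[CLS]"]
    for utterance in recent:
        tokens += list(utterance)   # character-level tokens, typical for Chinese BERT
    tokens.append("[SEP]")
    tokens += list(current)
    tokens.append("[SEP]")
    return tokens

history = ["我想找餐馆", "好的您想吃什么"]   # "I'd like a restaurant" / "OK, what cuisine?"
current = "川菜"                            # "Sichuan food"
print(build_input_with_history(history, current))
```

With only the current utterance ("Sichuan food"), the domain and intent are ambiguous; prepending the history makes the restaurant-booking context recoverable, which is the gap the model targets.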


Abstract

How a chat robot or intelligent customer-service system can achieve accurate natural language understanding (NLU) is a very important part of human-machine conversation, and has been a research hot spot in recent years. NLU for multi-round dialogue must attend not only to the semantic information of the current utterance but also to the historical dialogue information. The representative multi-task NLU model, MT-DNN, mainly considers general NLU tasks and does not address natural language understanding in multi-round conversations; if an NLU task is performed on a multi-round conversation using only the current utterance, semantic information is lost and incomplete. The invention designs a novel BERTCONTEXT-based natural language understanding model for multi-round dialogue. Historical dialogue information is integrated into the current dialogue information, and the accuracy of the novel BERTCONTEXT model is 1.4% higher than that of the MT-DNN model in the multi-round dialogue setting.

Description

Technical field

[0001] The present invention relates to the field of natural language processing, specifically a natural language understanding model for multi-round dialogue based on a BERT multi-task algorithm.

Background technique

[0002] In recent years, with the rise of artificial intelligence, chat robots and intelligent customer service have been widely deployed. How to obtain accurate semantic information, that is, how to achieve accurate natural language understanding (NLU), is a very important part of human-machine dialogue and has been a research hot spot in recent years. NLU covers multiple tasks such as sentiment analysis, intent recognition, domain recognition, and named entity recognition. A popular approach is to use BERT for the various NLU tasks; the representative model is the multi-task deep neural network (MT-DNN). MT-DNN uses a large number of mult...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/332, G06F16/33, G06N3/04, G06N3/08, G06F16/35
Inventor: 戴宪华 (Dai Xianhua); the other inventors requested that their names not be disclosed
Owner SUN YAT SEN UNIV