Chinese intelligent dialogue method based on Transformer

A Chinese intelligent-dialogue technology applied to neural learning methods, text database querying, unstructured text data retrieval, etc. It addresses the problem that existing systems do not reach an intelligent understanding of semantics and context, and achieves high question-answering accuracy.

Active Publication Date: 2021-04-06
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0008] Large-scale, high-quality Chinese dialogue data plays an important role in training such a model. Currently known question-answering systems can only answer questions mechanically; their answers are often irrelevant, and they have not reached the level of intelligent understanding of semantics and context.

Examples

Embodiment

[0057] For convenience of description, the relevant technical terms appearing in this embodiment are explained first:

[0058] Figure 1 is the flow chart of the Transformer-based Chinese intelligent dialogue method of the present invention;

[0059] In this embodiment, as shown in Figure 1, the Transformer-based Chinese intelligent dialogue method of the present invention comprises the following steps:

[0060] S1. Build a training data set using LCCC (Large-scale Cleaned Chinese Conversation), a large-scale cleaned Chinese chat corpus, hereinafter referred to as the corpus;

[0061] S1.1. Construct an input sequence whose length is N=20;

[0062] Use the [CLS] tag as the start of the input sequence, then extract consecutive dialogue sentences from the corpus, fill their words into the input sequence in sentence order, and insert the [SEP] tag between sentences; each time a sentence is filled in, determine whether to add the t...
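
The filling rule in [0062] is cut off above, but the disclosed parts (a [CLS] start tag, words filled in sentence order, a [SEP] tag between sentences, fixed length N=20 from S1.1) are enough for a minimal Python sketch. The character-level tokenization, the [PAD] token, and the handling of a sentence that overruns the fixed length are assumptions, not the patent's method:

    # Sketch of S1.1-S1.2: build one fixed-length input sequence from
    # consecutive corpus sentences. [PAD] and character-level tokens are
    # assumptions; the patent's rule for an overlong sentence is truncated
    # above, so this sketch simply stops filling when the sequence is full.
    N = 20  # fixed input-sequence length from S1.1

    def build_input_sequence(sentences, n=N, pad="[PAD]"):
        seq = ["[CLS]"]                      # start-of-sequence tag
        for sentence in sentences:
            for token in sentence:           # Chinese text: one character per token
                if len(seq) >= n:
                    return seq
                seq.append(token)
            if len(seq) >= n:
                break
            seq.append("[SEP]")              # tag inserted between sentences
        return seq + [pad] * (n - len(seq))  # pad up to the fixed length

    # Example with two consecutive dialogue turns from the corpus:
    print(build_input_sequence(["你好", "你好，有什么可以帮你？"]))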


Abstract

The invention discloses a Transformer-based Chinese intelligent dialogue method comprising the following steps: first, construct a training data set from the large-scale cleaned Chinese chat corpus LCCC (Large-scale Cleaned Chinese Conversation) to serve as input to a Transformer model; then train the Transformer model by deep learning until it converges; finally, feed the to-be-answered input sequence to the trained Transformer model, which outputs the expected output sequence in real time, thereby realizing intelligent real-time Chinese dialogue.
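
Read as a pipeline, the abstract describes standard sequence-to-sequence training followed by real-time decoding. The page discloses no model size, optimizer, or decoding strategy, so the following minimal PyTorch sketch fills those in with generic assumptions (nn.Transformer, Adam, an assumed vocabulary size, teacher forcing); it is an illustration, not the patented implementation:

    # Minimal seq2seq training step: the source turn goes to the encoder,
    # the reply (shifted by one position) to the decoder; train until the
    # loss converges, then decode replies token by token at inference time.
    import torch
    import torch.nn as nn

    VOCAB, N = 8000, 20                    # vocabulary size assumed; N from S1.1
    embed = nn.Embedding(VOCAB, 512)
    model = nn.Transformer(d_model=512, batch_first=True)
    head = nn.Linear(512, VOCAB)
    params = list(embed.parameters()) + list(model.parameters()) + list(head.parameters())
    opt = torch.optim.Adam(params, lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    def train_step(src_ids, tgt_ids):
        # Teacher forcing: predict tgt[t] from the source turn and tgt[:t].
        mask = nn.Transformer.generate_square_subsequent_mask(tgt_ids.size(1) - 1)
        out = model(embed(src_ids), embed(tgt_ids[:, :-1]), tgt_mask=mask)
        loss = loss_fn(head(out).reshape(-1, VOCAB), tgt_ids[:, 1:].reshape(-1))
        opt.zero_grad(); loss.backward(); opt.step()
        return loss.item()

    # Shapes only; real training uses token ids from the LCCC-built data set.
    src = torch.randint(0, VOCAB, (4, N))
    tgt = torch.randint(0, VOCAB, (4, N))
    print(train_step(src, tgt))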

Description

Technical field

[0001] The invention belongs to the technical field of natural language processing, and more specifically relates to a Transformer-based Chinese intelligent dialogue method.

Background technique

[0002] With the rapid development of deep learning, more and more deep learning techniques have been introduced into the field of NLP (Natural Language Processing), and their results represent great progress over traditional rule-based or statistical methods. Pre-trained language representation models built on the Transformer, such as BERT (Bidirectional Encoder Representations from Transformers), have achieved better results than traditional methods on various NLP tasks, because the Transformer remedies the most criticized characteristic of RNN training: its strictly sequential computation. The self-attention mechanism enables fast parallelism, and the Transformer can be stacked to a very great depth to fully explore the characte...
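
For reference, the self-attention computation this background credits with fast parallelism is, in its standard scaled dot-product form (taken from the published Transformer literature, not from the patent text):

    # Standard scaled dot-product self-attention: every position attends to
    # every other position in a single matrix product, with no recurrence,
    # which is the parallelism advantage over RNNs mentioned above.
    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project all positions at once
        scores = Q @ K.T / np.sqrt(K.shape[-1])     # pairwise similarity, scaled
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
        return weights @ V                          # weighted sum of value vectors

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 64))                   # 20 positions, 64-dim features
    Wq, Wk, Wv = (rng.normal(size=(64, 64)) for _ in range(3))
    print(self_attention(X, Wq, Wk, Wv).shape)      # -> (20, 64)

Because the whole attention map is computed in one matrix product, stacking such layers adds depth without serializing computation across positions.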

Claims


Application Information

IPC (8): G06F16/332, G06F16/33, G06F40/126, G06F40/30, G06N3/04, G06N3/08
CPC: G06F16/3329, G06F16/3344, G06F40/126, G06F40/30, G06N3/084, G06N3/045
Inventor: 杨波, 巩固, 郑文锋, 刘珊
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA