
Chinese Intelligent Dialogue Method Based on Transformer

A Chinese natural-language intelligence technology, applied to neural learning methods, text database querying, unstructured text data retrieval, etc. It addresses the problem that existing systems do not reach the level of intelligently understanding semantics and context, and achieves high question-answering accuracy and efficiency in real-time question-and-answer scenarios, with broad application prospects.

Active Publication Date: 2022-03-25
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0008] Large-scale, high-quality Chinese dialogue data plays an important role in such models. Currently known question-answering systems can only answer questions mechanically; their answers are often irrelevant, and they have not reached the level of intelligently understanding semantics and context.

Method used


Examples


Embodiment

[0057] For convenience of description, the relevant technical terms appearing in the specific implementation are explained first:

[0058] Figure 1 is the flow chart of the Transformer-based Chinese intelligent dialogue method of the present invention;

[0059] In this embodiment, as shown in Figure 1, the Transformer-based Chinese intelligent dialogue method of the present invention comprises the following steps:

[0060] S1. Use LCCC (Large-scale Cleaned Chinese Conversation), a large-scale cleaned Chinese chat corpus (hereinafter "the corpus"), to build a training data set;

[0061] S1.1. Construct an input sequence whose length is N=20;

Use the [CLS] tag as the start of the input sequence, then extract consecutive dialogue sentences from the corpus, filling their words into the input sequence in sentence order and inserting a [SEP] tag between sentences; when each sentence has been filled in, determine whether to add the t...
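Step S1.1 above can be sketched as follows. This is a minimal illustration, not the patent's implementation: character-level tokenization and the truncation/padding policy are assumptions, since the original text is cut off before the rule is fully stated, and `[PAD]` is a hypothetical padding token.

```python
# Sketch of step S1.1: build a fixed-length (N = 20) input sequence from
# consecutive dialogue sentences, with [CLS] at the start and a [SEP] tag
# inserted between sentences. Character-level tokens and the pad/truncate
# policy are assumptions for illustration only.

N = 20  # fixed input-sequence length

def build_input_sequence(sentences, n=N, pad_token="[PAD]"):
    tokens = ["[CLS]"]
    for sent in sentences:
        tokens.extend(sent)       # one token per character (assumption)
        tokens.append("[SEP]")    # separator between sentences
    tokens = tokens[:n]                        # truncate to length n
    tokens += [pad_token] * (n - len(tokens))  # pad up to length n
    return tokens

dialogue = ["你好", "你好，请问有什么可以帮你？"]
seq = build_input_sequence(dialogue)
print(len(seq), seq[:4])  # → 20 ['[CLS]', '你', '好', '[SEP]']
```

A real system would map these tokens to vocabulary ids before feeding them to the Transformer; that mapping is omitted here.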



Abstract

The invention discloses a Transformer-based Chinese intelligent dialogue method. First, the large-scale cleaned Chinese conversation corpus LCCC (Large-scale Cleaned Chinese Conversation) is used to construct a training data set as the input of the Transformer model; the Transformer model is then trained with deep learning until it converges. Finally, the input sequence to be answered is fed into the trained Transformer model, which outputs the expected response sequence in real time, thereby realizing intelligent real-time dialogue in Chinese.

Description

Technical field

[0001] The invention belongs to the technical field of natural language processing, and more specifically relates to a Transformer-based Chinese intelligent dialogue method.

Background technique

[0002] With the rapid development of deep learning, more and more deep learning techniques have been introduced into the field of NLP (Natural Language Processing), and their results represent great progress over traditional rule-based or statistical methods. Pre-trained language representation models built on the Transformer, such as BERT (Bidirectional Encoder Representations from Transformers), have achieved better results on various NLP tasks than traditional methods, because the Transformer remedies the most criticized property of RNN training: its self-attention mechanism enables fast parallel computation, and the Transformer can be stacked to great depth to fully explore the characte...
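The self-attention mechanism mentioned above computes every position's output from all positions at once, which is what allows the parallelism that RNNs lack. A minimal NumPy sketch of scaled dot-product self-attention (a generic illustration, not the patent's specific model; all weight matrices here are random placeholders) is:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape
    (seq_len, d_model). All positions are processed in parallel via
    matrix multiplications, avoiding RNN-style sequential steps."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # (seq_len, d_k)

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))                      # 4 tokens, d_model = 8
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # → (4, 8)
```

A full Transformer block adds multiple attention heads, residual connections, layer normalization, and a feed-forward sublayer on top of this core operation.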

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Patent (China)
IPC(8): G06F16/332; G06F16/33; G06F40/126; G06F40/30; G06N3/04; G06N3/08
CPC: G06F16/3329; G06F16/3344; G06F40/126; G06F40/30; G06N3/084; G06N3/045
Inventor 杨波巩固郑文锋刘珊
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA