Reading comprehension method based on ELMo embedding and gated self-attention mechanism

A reading comprehension and attention technology in the computer field. It addresses problems such as ignoring the polysemy of words, failing to capture long-range contextual information well, and not modeling the context dependencies of long texts, thereby improving the accuracy and overall performance of the model.

Pending Publication Date: 2021-03-30
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, the accuracy of some classic baseline models still has room for improvement: they do not take into account the context dependencies of long texts, that is, they cannot capture the associated information of long contexts well, and they ignore the problem of word polysemy across different contexts.

Embodiment Construction

[0053] As shown in Figure 1, a reading comprehension method based on ELMo embedding and a gated self-attention mechanism includes the following steps:

[0054] S1: Tokenize and preprocess the passage and the question, and build a GloVe word vocabulary and an in-word character vocabulary from the words that appear in the tokenized passage and question;
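
For illustration only, a minimal sketch of this vocabulary-building step is given below; the helper name build_vocabularies and the whitespace tokenization are assumptions, since the publication text does not name a tokenizer.

```python
from collections import Counter

def build_vocabularies(passages, questions):
    """Build a word vocabulary (for GloVe lookup) and a character vocabulary
    from all tokens that appear in the tokenized passages and questions."""
    word_counter, char_counter = Counter(), Counter()
    for text in passages + questions:
        for token in text.split():            # whitespace tokenization (assumption)
            word_counter[token] += 1
            char_counter.update(token)
    # indices 0 and 1 are reserved for padding and out-of-vocabulary symbols
    word2id = {w: i + 2 for i, w in enumerate(word_counter)}
    char2id = {c: i + 2 for i, c in enumerate(char_counter)}
    return word2id, char2id
```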

[0055] S2: Feed each word into a pre-trained ELMo encoder to obtain its ELMo embedding, which contains contextual information;
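
One way to obtain such contextual representations is the ElmoEmbedder interface from the older (pre-1.0) allennlp package, sketched below; the specific pre-trained weights and the layer-averaging choice are assumptions, not details taken from the patent.

```python
from allennlp.commands.elmo import ElmoEmbedder

# Loads default pre-trained ELMo weights; which weights the inventors
# actually used is not stated in the publication text (assumption).
elmo = ElmoEmbedder()

tokens = ["The", "model", "reads", "the", "passage", "."]
layers = elmo.embed_sentence(tokens)    # numpy array, shape (3, len(tokens), 1024)

# Averaging the three ELMo layers is one common choice (assumption);
# a learned weighted sum of the layers is another.
elmo_embedding = layers.mean(axis=0)    # shape (len(tokens), 1024)
```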

[0056] S3: Map each word to its corresponding vector in the GloVe word vocabulary to obtain its word-level representation;
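
A minimal sketch of this lookup, assuming 300-dimensional GloVe vectors stored in a plain-text file and a PyTorch embedding layer (both assumptions; the publication text does not specify the GloVe variant):

```python
import numpy as np
import torch
import torch.nn as nn

def load_glove(word2id, glove_path="glove.840B.300d.txt", dim=300):
    """Build an embedding matrix aligned with word2id from a GloVe text file."""
    matrix = np.random.normal(scale=0.1, size=(len(word2id) + 2, dim)).astype("float32")
    matrix[0] = 0.0                                   # padding index stays zero
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, values = parts[0], parts[1:]
            if word in word2id and len(values) == dim:
                matrix[word2id[word]] = np.asarray(values, dtype="float32")
    return nn.Embedding.from_pretrained(torch.from_numpy(matrix), freeze=True)
```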

[0057] S4: Look up each letter of a word in the character vocabulary, feed the resulting character vectors into a convolutional neural network, and max-pool the output of the convolutional layer to obtain a fixed-length character embedding for each word ...
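
A character-level CNN with max pooling over character positions, as this step describes, might be sketched as follows; the filter width, embedding size, and output size are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CharCNN(nn.Module):
    """Map a word's character ids to a fixed-length character embedding."""
    def __init__(self, num_chars, char_dim=64, out_dim=100, kernel_size=5):
        super().__init__()
        self.char_emb = nn.Embedding(num_chars, char_dim, padding_idx=0)
        self.conv = nn.Conv1d(char_dim, out_dim, kernel_size, padding=kernel_size // 2)

    def forward(self, char_ids):
        # char_ids: (num_words, max_word_len) integer character indices
        x = self.char_emb(char_ids).transpose(1, 2)   # (num_words, char_dim, max_word_len)
        x = torch.relu(self.conv(x))                  # (num_words, out_dim, max_word_len)
        return x.max(dim=2).values                    # max pool over positions -> (num_words, out_dim)
```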


Abstract

The invention discloses a reading comprehension method based on ELMo embedding and a gated self-attention mechanism. The method is built on a model that combines ELMo embedding with a gated self-attention function: it simultaneously adopts the ELMo representation, the character representation of words, and the GloVe word representation in a multi-dimensional fused representation, and introduces a self-attention layer with a gating function behind a bidirectional attention network to extract long-context information and further filter it. In addition, in the answer layer, the method reuses the feature representations obtained in each layer and uses a bilinear function to predict the position of the final answer, further improving the overall performance of the system. Experiments on the SQuAD dataset show that the model clearly outperforms many baseline models, improves performance by about 5% over its original baseline, approaches the average level of human performance on the test, and fully demonstrates the effectiveness of the method.
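
As an illustration of the gated self-attention idea described in the abstract, the sketch below applies self-attention over the passage representation and filters the result with a sigmoid gate before projecting it back; the dimensions, the exact gating formula, and the class name GatedSelfAttention are assumptions, since the publication text shown here does not give them.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedSelfAttention(nn.Module):
    """Self-attention over the passage followed by a sigmoid gate that
    filters how much of the attended information is passed on."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.proj = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, h):
        # h: (batch, seq_len, hidden_dim) passage representation
        scores = torch.bmm(h, h.transpose(1, 2))          # (batch, seq_len, seq_len)
        attn = F.softmax(scores, dim=-1)
        context = torch.bmm(attn, h)                      # attended representation
        fused = torch.cat([h, context], dim=-1)
        gated = torch.sigmoid(self.gate(fused)) * fused   # gate filters the information
        return self.proj(gated)                           # (batch, seq_len, hidden_dim)
```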

Description

Technical field

[0001] The invention relates to the field of computer technology, and in particular to a reading comprehension method based on ELMo embedding and a gated self-attention mechanism.

Background technique

[0002] Machine reading comprehension has long been an important part of artificial intelligence and a research hotspot in natural language processing. A large amount of human knowledge is transmitted in the form of unstructured natural language text, so enabling machines to read and understand text is of great significance, with direct application value for search engines and intelligent customer service. In recent years, machine reading comprehension has received considerable attention in the field of natural language processing. One reason for this is the development and application of attention mechanisms, which enable models to focus on the parts of the context most relevant to a given question. The Stanford SQuAD dat...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F16/33G06F16/31G06F40/216G06F40/253G06F40/289G06F40/30G06N3/04G06N3/08
CPCG06F16/3344G06F16/316G06F16/3346G06F40/216G06F40/253G06F40/289G06F40/30G06N3/049G06N3/08G06N3/047G06N3/045
Inventor 任福继张伟伟鲍艳伟
Owner HEFEI UNIV OF TECH