
Deep dynamic context word representation model and method and computer

A deep dynamic context word representation technology, applied in computing, special data processing applications, instruments, etc., which can solve problems such as word representations having no concept of context or dynamics, and the difficulty of generating word representation models with small resource requirements.

Active Publication Date: 2019-09-10

AI Technical Summary

Problems solved by technology

[0006] To sum up, the problem with the existing technology is that commonly used word embedding techniques have no concept of context or dynamics: they treat words as fixed atomic units, which limits their effectiveness on many tasks. Existing word embedding techniques cannot be patched with incremental improvements; word representation must instead be re-modeled around the concepts of context and dynamics. At the same time, it is difficult to build a model that generates word representations suitable for a variety of tasks while keeping generation efficient and the model's resource requirements small.
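
As a toy illustration of the "fixed atomic unit" limitation described above (not from the patent itself; all names are hypothetical), a static embedding table returns one and the same vector for a polysemous word regardless of its sentence context:

```python
# Hypothetical illustration: a static embedding lookup returns the same
# vector for "bank" in both sentences, so the two senses are conflated.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["river", "bank", "deposit", "money"]
embedding_table = {w: rng.normal(size=4) for w in vocab}

sent1 = ["river", "bank"]             # "bank" = riverside
sent2 = ["deposit", "money", "bank"]  # "bank" = financial institution

v1 = embedding_table["bank"]   # lookup ignores the context in sent1
v2 = embedding_table["bank"]   # lookup ignores the context in sent2
assert np.array_equal(v1, v2)  # one fixed vector: no context, no dynamics
```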



Examples


Embodiment Construction

[0040] In order to make the object, technical solution and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the examples. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0041] At present, mainstream word representation technologies have no concept of context or dynamics, and using fixed vectors as word representations cannot resolve polysemy, which directly limits a computer's further understanding of natural language. The deep dynamic context word representation model of the present invention is a multi-layer deep neural network. Each layer of the model captures different information (grammatical information, semantic information, etc.) about each word in the input sentence. A layer attention mechanism assigns a different weight to each layer of the neural network and integrates the semantic information from the different levels to form a context-dependent representation of each word.
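
The patent publishes no code, but the layer attention described in this paragraph can be sketched as an ELMo-style scalar mix: one learnable weight per layer, softmax-normalized, combining per-layer hidden states into a single representation. Module and variable names below are hypothetical:

```python
import torch
import torch.nn as nn

class LayerAttention(nn.Module):
    """Softmax-normalized scalar weights over the layers of a deep network,
    mixing per-layer hidden states into one context-dependent representation."""

    def __init__(self, num_layers: int):
        super().__init__()
        self.layer_logits = nn.Parameter(torch.zeros(num_layers))  # one weight per layer
        self.gamma = nn.Parameter(torch.ones(1))                   # overall scale

    def forward(self, layer_outputs: list) -> torch.Tensor:
        # layer_outputs: num_layers tensors, each of shape (batch, seq_len, hidden)
        weights = torch.softmax(self.layer_logits, dim=0)
        stacked = torch.stack(layer_outputs, dim=0)   # (layers, batch, seq, hidden)
        mixed = (weights.view(-1, 1, 1, 1) * stacked).sum(dim=0)
        return self.gamma * mixed
```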



Abstract

The invention belongs to the technical field of computer word representation, and discloses a deep dynamic context word representation model and method. The deep dynamic context word representation model is a masked language model formed by stacking multi-layer bidirectional Transformer encoders with a layer attention mechanism. The network is a multi-layer neural network, and each layer captures context information about each word in the input sentence from a different angle. A layer attention mechanism then assigns a different weight to each layer of the network, and the per-layer word representations are combined according to these weights to form the context representation of each word. Word representations generated by the model were evaluated on three public-dataset tasks: natural language inference (MultiNLI), named entity recognition (CoNLL2003) and reading comprehension (SQuAD), improving over existing models by 2.0%, 0.47% and 2.96% respectively.
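
A minimal, hypothetical sketch of the architecture the abstract describes: stacked bidirectional Transformer encoder layers whose per-layer outputs are fused by the LayerAttention module from the sketch above. Hyperparameters are placeholders, not the patent's actual configuration, and the masked-language-model training objective is omitted:

```python
import torch
import torch.nn as nn

class DeepDynamicContextModel(nn.Module):
    """Stacked bidirectional Transformer encoder layers; per-layer outputs are
    fused by the LayerAttention sketch above. Hyperparameters are placeholders."""

    def __init__(self, vocab_size=30000, hidden=256, heads=4, num_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model=hidden, nhead=heads, batch_first=True)
            for _ in range(num_layers)
        )
        self.layer_attention = LayerAttention(num_layers)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(token_ids)          # (batch, seq_len, hidden)
        layer_outputs = []
        for layer in self.layers:
            x = layer(x)                   # self-attention sees left and right context
            layer_outputs.append(x)
        # Fuse all layers into one context-dependent representation per token.
        return self.layer_attention(layer_outputs)

# Usage: one contextual vector per token, per sentence.
model = DeepDynamicContextModel()
tokens = torch.randint(0, 30000, (2, 8))   # (batch, seq_len)
reps = model(tokens)                        # shape (2, 8, 256)
```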

Description

Technical field

[0001] The invention belongs to the technical field of computer word representation, and in particular relates to a deep dynamic context word representation model and method, and a computer.

Background technique

[0002] The prior art closest to the invention is neural network language models. Representing words as continuous vectors has a long history. A very popular neural network language model, NNLM (Neural Network Language Model), uses a linear projection layer and a nonlinear hidden layer in a feed-forward neural network to jointly learn word vector representations and a statistical language model. Although the principle is simple, the model has too many parameters, so it is difficult to train and apply in practice. Later models include CBOW, Skip-Gram, FastText and GloVe, among which CBOW and Skip-Gram belong to the well-known word2vector framework; these are trained as shallow neural network language models...
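
As background, a Bengio-style NNLM of the kind described above can be sketched as follows (hypothetical dimensions, not from the patent; the large output softmax over the vocabulary is what drives the parameter count, and hence the training cost, mentioned above):

```python
import torch
import torch.nn as nn

class NNLM(nn.Module):
    """Bengio-style NNLM sketch: embed a fixed window of preceding words, pass
    the concatenation through one nonlinear hidden layer, predict the next word."""

    def __init__(self, vocab_size=10000, embed_dim=64, context=4, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)      # linear projection layer
        self.hidden = nn.Linear(context * embed_dim, hidden)  # nonlinear hidden layer
        self.out = nn.Linear(hidden, vocab_size)              # the expensive output layer

    def forward(self, context_ids: torch.Tensor) -> torch.Tensor:
        # context_ids: (batch, context) indices of the preceding words
        x = self.embed(context_ids).flatten(1)   # concatenate the window embeddings
        h = torch.tanh(self.hidden(x))
        return self.out(h)                       # logits over the whole vocabulary
```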


Application Information

IPC(8): G06F17/28
CPC: G06F40/42; G06F40/58
Inventors: 熊熙, 袁宵, 琚生根, 李元媛, 孙界平
Owner: 成都集致生活科技有限公司