Method and computer for deep dynamic contextual word representation

A dynamic, contextual word-representation technology, applied in computing, natural language translation, instruments, etc. It can solve problems such as the lack of contextual and dynamic concepts in word representations, the difficulty of generating word representations for a variety of tasks, and the large resources required by word-representation models.

Active Publication Date: 2020-05-19
成都集致生活科技有限公司
View PDF · 6 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0006] To sum up, the problem with the existing technology is that commonly used word embedding techniques have no concept of context or dynamics: they treat words as fixed atomic units, which limits their effect on many tasks. Existing word embedding techniques cannot be patched with incremental improvements; word representations with context and dynamics must be modeled anew. At the same time, it is difficult to design a model that generates word representations for a variety of tasks, generates them efficiently, and requires few resources.




Embodiment Construction

[0039] In order to make the object, technical solution and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the examples. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0040] At present, mainstream word representation technology has no concept of context or dynamics, and using fixed vectors as word representations cannot solve the problem of polysemy, which directly limits the computer's further understanding of natural language. The deep dynamic context word representation model of the present invention is a multi-layer deep neural network. Each layer of the model captures different information (grammatical information, semantic information, etc.) about each word in context; a layer attention mechanism gives a different weight to each layer of the neural network and integrates semantic information at different levels to form a...
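The layer attention mechanism described above can be sketched as an ELMo-style scalar mix: one learned scalar weight per layer, softmax-normalized, used to average the per-layer hidden states of a token. This is a minimal illustration under stated assumptions; the function name `layer_attention`, the softmax normalization, and the `gamma` scale are not spelled out in the patent text.

```python
import numpy as np

def layer_attention(layer_outputs, weights, gamma=1.0):
    """Combine per-layer representations of one token into a single vector.

    layer_outputs: list of L arrays of shape (d,), one hidden state per layer.
    weights: array of L unnormalized scalar layer weights (learned in training).
    gamma: global scaling factor (also learned in practice).
    """
    # Softmax turns the raw layer weights into a distribution over layers.
    w = np.exp(weights - np.max(weights))
    w = w / w.sum()
    # Weighted sum over layers yields the context-dependent word representation.
    return gamma * sum(wi * h for wi, h in zip(w, layer_outputs))

# Toy usage: 3 layers, 4-dimensional hidden states.
rng = np.random.default_rng(0)
layers = [rng.normal(size=4) for _ in range(3)]
vec = layer_attention(layers, np.array([0.1, 0.5, 0.2]))
print(vec.shape)  # (4,)
```

With all weights equal, the mix reduces to a plain average of the layers; training moves weight toward layers whose information (syntactic vs. semantic) suits the downstream task.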



Abstract

The invention belongs to the technical field of computer word representation, and discloses a model and method of deep dynamic context word representation. The model is a masked language model stacked from multi-layer bidirectional Transformer encoders with a layer attention mechanism. It is a multi-layer neural network in which each layer captures the context information of each word in the input sentence from a different angle; the layer attention mechanism weights each layer's output and combines them to form the contextual representation of the word. Word representations generated by this model were evaluated on three public-dataset tasks, logical reasoning (MultiNLI), named entity recognition (CoNLL2003) and reading comprehension (SQuAD), with improvements of 2.0%, 0.47% and 2.96%, respectively.
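The masked-language-model training setup named in the abstract can be sketched as follows: random positions in the input are replaced by a mask token, and the model must predict the original words at those positions. This is a hedged illustration: the helper name `mask_tokens`, the `[MASK]` string, and the 15% masking rate follow the common BERT-style convention and are assumptions, not details stated in the patent.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Randomly mask tokens for masked-language-model training.

    Returns (masked_tokens, targets), where targets maps each masked
    position back to the original token the model must predict.
    """
    rng = random.Random(seed)
    masked, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok          # remember the label for this position
            masked[i] = mask_token    # hide the token from the model
    return masked, targets

sent = "the bank raised interest rates after the announcement".split()
masked, targets = mask_tokens(sent)
```

Because prediction at a masked position may draw on both left and right context, this objective is what lets the bidirectional Transformer encoders produce context-dependent (rather than fixed) word vectors.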

Description

Technical field
[0001] The invention belongs to the technical field of computer word representation, and in particular relates to a model and method of deep dynamic context word representation, and a computer.
Background technique
[0002] Currently, the closest prior art is neural network language models. The representation of words as continuous vectors has a long history. A very popular neural network language model, NNLM (Neural Network Language Model), uses a feed-forward neural network with a linear projection layer and a nonlinear hidden layer to jointly learn word vector representations and a statistical language model. Although the principle is simple, the model has too many parameters and is difficult to train and apply in practice. Later came CBOW, Skip-Gram, FastText, GloVe and other models, among which CBOW and Skip-Gram belong to the well-known word2vec framework and are trained using shallow neural network language...
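As background, one CBOW training step of the word2vec family mentioned above can be sketched like this: predict the target word from the average of its context word vectors. This is a toy NumPy sketch with a full softmax; real word2vec implementations use hierarchical softmax or negative sampling for efficiency, and the name `cbow_step` is hypothetical.

```python
import numpy as np

def cbow_step(context_ids, target_id, W_in, W_out, lr=0.05):
    """One CBOW gradient step (softmax + cross-entropy). Returns the loss."""
    h = W_in[context_ids].mean(axis=0)        # hidden = mean of context vectors
    scores = W_out @ h                        # score every vocabulary word
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                      # softmax over the vocabulary
    err = probs.copy()
    err[target_id] -= 1.0                     # dL/dscores for cross-entropy
    grad_h = W_out.T @ err                    # gradient flowing into h
    W_out -= lr * np.outer(err, h)            # update output embeddings in place
    W_in[context_ids] -= lr * grad_h / len(context_ids)  # update context rows
    return float(-np.log(probs[target_id]))

# Toy usage: vocabulary of 10 words, 8-dimensional embeddings.
rng = np.random.default_rng(1)
W_in = rng.normal(scale=0.1, size=(10, 8))
W_out = rng.normal(scale=0.1, size=(10, 8))
for _ in range(5):
    loss = cbow_step([1, 2], 3, W_in, W_out)
```

Note the limitation the patent targets: after training, each word's row in `W_in` is one fixed vector regardless of the sentence it appears in, so polysemous words collapse to a single representation.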

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F40/42, G06F40/58
CPC: G06F40/42, G06F40/58
Inventors: 熊熙, 袁宵, 琚生根, 李元媛, 孙界平
Owner: 成都集致生活科技有限公司