
Neural machine translation incorporating dependencies

A technology combining dependency relations with machine translation, applied in natural language translation, instruments, computing, etc. It addresses the problems that existing models do not consider linguistic information and do not consider the correlation between source-side hidden states, achieving the effect of improved translation quality.

Active Publication Date: 2018-12-21
SUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0030] Transformer model: the source end uses a self-attention mechanism, but it does not consider the correlation between the source-side hidden states and does not consider linguistic information.



Examples


Embodiment

[0054] In the neural machine translation method incorporating dependencies of this embodiment, Figure 1 shows a dependency tree parsed by the Stanford Parser, where each arrow points from a parent node to its child node. In Figure 1, "Eating" is more strongly related to "Like" and "Apple" than to the other words. The present invention uses this dependency information to guide the source end: a dependency correlation loss is added at the source end to guide the correlation between source hidden states, and this guidance loss on the source end of the network is used to guide neural machine translation (NMT).
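As an illustration of how such a dependency tree can be obtained, the following minimal Python sketch uses the Stanza toolkit to print each parent-to-child arrow of a parse. The embodiment names "stanfordparser", so Stanza serves only as a convenient stand-in here, and the example sentence is hypothetical.

    # Minimal sketch: extract parent -> child dependency arrows with Stanza
    # (a stand-in for the Stanford parser named in the embodiment).
    import stanza

    stanza.download("en")  # fetch English models once
    nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma,depparse")

    doc = nlp("I like eating apples")
    for sent in doc.sentences:
        for word in sent.words:
            # word.head is the 1-based index of the parent (0 means root),
            # so each (head, word) pair is one arrow of the dependency tree.
            head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
            print(f"{head} -> {word.text} ({word.deprel})")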

[0055] For a sentence pair (X, Y), the overall loss of the proposed network is defined as follows:

[0056] loss = −log P(Y|X) + Δ_dep

[0057] where −log P(Y|X) is the cross-entropy loss and Δ_dep is the dependency correlation loss. Through this guidance loss, the NMT model is guided toward the correct relationships between the source-side hidden states.
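The excerpt is cut off before Δ_dep is defined, so the following PyTorch sketch only illustrates the structure of the overall loss; the cosine-similarity form of the dependency correlation term below is a hypothetical placeholder, not the patented definition.

    # Sketch of loss = -log P(Y|X) + delta_dep. The delta_dep body is an
    # assumed placeholder: it pushes hidden states of dependency-linked
    # source words toward higher cosine similarity.
    import torch
    import torch.nn.functional as F

    def overall_loss(logits, targets, src_hidden, dep_edges):
        """logits: (tgt_len, vocab); targets: (tgt_len,);
        src_hidden: (src_len, d_model); dep_edges: (parent, child) index pairs."""
        ce = F.cross_entropy(logits, targets)  # -log P(Y|X), token-averaged

        parents = src_hidden[[p for p, _ in dep_edges]]
        children = src_hidden[[c for _, c in dep_edges]]
        delta_dep = (1.0 - F.cosine_similarity(parents, children, dim=-1)).mean()

        return ce + delta_dep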

[0058] As the commonly used neural machine translat...



Abstract

The invention relates to a neural machine translation method incorporating dependency relations, designed to obtain a more accurate neural translation model. The method parses the dependency tree of the source sentence and determines the relevance information between the words of the source sentence. Based on this dependency information, the dependency correlation loss Δ_dep is determined, and the overall loss of the network for each sentence pair is obtained. The invention adds a self-attention mechanism at the source end and integrates dependency guidance into this self-attention mechanism.
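The abstract does not state how the self-attention mechanism is combined with dependency guidance. One simple way to sketch such an integration, offered here purely as an assumption, is an additive bias on attention scores between dependency-linked source positions:

    # Hypothetical sketch: bias source-side self-attention toward positions
    # connected in the dependency tree. The bias form is an assumption; the
    # abstract does not specify the actual integration.
    import torch
    import torch.nn.functional as F

    def dep_guided_self_attention(q, k, v, dep_edges, bias=1.0):
        """q, k, v: (src_len, d_k); dep_edges: (parent, child) index pairs."""
        scores = q @ k.transpose(0, 1) / q.size(-1) ** 0.5  # (src_len, src_len)
        for p, c in dep_edges:
            scores[p, c] += bias  # strengthen parent -> child attention
            scores[c, p] += bias  # and the reverse direction
        return F.softmax(scores, dim=-1) @ v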

Description

Technical field

[0001] The invention belongs to the technical field of machine learning, and specifically relates to a neural machine translation method incorporating dependency relationships.

Background technique

[0002] Machine translation refers to the technology of automatically converting one language (the source language) into another language (the target language) with the help of a computer. [Bahdanau et al., 2015] proposed introducing the attention mechanism into neural machine translation, so that the quality of Neural Machine Translation (NMT) gradually improved and NMT gradually replaced Statistical Machine Translation (SMT). In 2017, [Vaswani et al., 2017] proposed the Transformer model, which relies entirely on the attention mechanism; its combination of multi-layer stacking and residual networks greatly improved the performance of neural machine translation. Researchers have improved the performance of translation systems on the basis of these two models, and large Internet companies are grad...


Application Information

IPC(8): G06F17/28
CPC: G06F40/58
Inventors: Duan Xiangyu (段湘煜), Wang Kun (王坤), Zhang Min (张民)
Owner: SUZHOU UNIV