Text-level neural machine translation method based on context memory network

A neural machine translation technology that addresses the problems of low computational efficiency and insufficient context information in text-level translation, achieving an efficient model structure.

Pending Publication Date: 2020-05-15
沈阳雅译网络技术有限公司

AI Technical Summary

Problems solved by technology

[0008] Aiming at the shortcomings of the multi-encoder approach in text-level neural machine translation, namely low computational efficiency and insufficient use of context information, the present invention proposes a text-level machine translation method based on a context memory network.

Method used




Embodiment Construction

[0039] The present invention is further described below in conjunction with the accompanying drawings.

[0040] The present invention is a text-level neural machine translation method based on a contextual memory network, specifically comprising the following steps:

[0041] 1) Adopt a Transformer model based on the self-attention mechanism and add a context memory module on the encoder side to dynamically maintain contextual memory, forming a Transformer model based on a context memory network;
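The patent does not publish code for the context memory module; the following is a minimal sketch of one plausible reading of step 1), in which the module stores the encodings of recently translated sentences and lets the current sentence attend over them. All names, the fixed capacity, and the eviction policy are illustrative assumptions, not details from the patent.

```python
import numpy as np

class ContextMemory:
    """Illustrative encoder-side context memory: stores encodings of
    previously seen sentences and serves them to the current sentence
    via scaled dot-product attention. Sizes and policy are assumptions."""

    def __init__(self, d_model: int, capacity: int = 8):
        self.d_model = d_model
        self.capacity = capacity      # how many past sentence states to keep
        self.slots = []               # list of (len_i, d_model) arrays

    def read(self, query: np.ndarray) -> np.ndarray:
        """Attend the current encoding (T, d_model) over stored context."""
        if not self.slots:
            return np.zeros_like(query)   # no context yet: contribute nothing
        mem = np.concatenate(self.slots, axis=0)           # (M, d_model)
        scores = query @ mem.T / np.sqrt(self.d_model)     # (T, M)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
        return weights @ mem                               # (T, d_model)

    def write(self, encoding: np.ndarray) -> None:
        """Dynamically maintain the memory: append, evict the oldest slot."""
        self.slots.append(encoding)
        if len(self.slots) > self.capacity:
            self.slots.pop(0)
```

A fixed capacity with oldest-first eviction keeps the memory bounded, which is one way to obtain the "efficient structure" the summary claims; the patent itself does not specify the maintenance policy.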

[0042] 2) Construct a parallel corpus, tokenize the source- and target-language sentences, and convert the resulting word sequences into corresponding word-vector representations;
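Step 2) can be sketched as follows: build a vocabulary from the corpus, map each sentence to token ids, and look the ids up in an embedding table. The toy corpus, the `<unk>` convention, and the dimensions are hypothetical illustrations, not values from the patent.

```python
import numpy as np

# Toy stand-in for the source side of a parallel corpus (assumed data).
src_sentences = ["the cat sat", "the dog ran"]

# Build a word-level vocabulary with an unknown-word fallback.
vocab = {"<unk>": 0}
for sent in src_sentences:
    for tok in sent.split():
        vocab.setdefault(tok, len(vocab))

d_model = 8
rng = np.random.default_rng(0)
embed_table = rng.normal(size=(len(vocab), d_model))   # (V, d_model)

def embed(sentence: str) -> np.ndarray:
    """Tokenize a sentence and return its word-vector sequence (T, d_model)."""
    ids = [vocab.get(tok, vocab["<unk>"]) for tok in sentence.split()]
    return embed_table[ids]

x = embed("the cat ran")   # word-embedding input for the encoder
```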

[0043] 3) On the encoder side, perform layer-by-layer feature extraction on the word embeddings of the source-language input; introduce the corresponding context information through the context memory module and fuse it into the current encoded representation, which is then written back to update the context memory module;
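The patent text for this step is truncated, so the exact fusion operation is unknown; a common choice, shown here purely as an assumed sketch, is a learned gate that decides per position how much retrieved context to mix into the current encoding.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d_model = 4
rng = np.random.default_rng(1)
W_g = rng.normal(size=(2 * d_model, 1))   # gate parameters (assumed, untrained)

def fuse(hidden: np.ndarray, context: np.ndarray) -> np.ndarray:
    """Gated fusion of current encoding and retrieved context.

    hidden, context: (T, d_model). Returns the same shape: a per-position
    convex mixture g * context + (1 - g) * hidden, with g from a sigmoid gate.
    """
    g = sigmoid(np.concatenate([hidden, context], axis=-1) @ W_g)   # (T, 1)
    return g * context + (1.0 - g) * hidden

h = rng.normal(size=(3, d_model))   # current sentence encoding
c = rng.normal(size=(3, d_model))   # context read from the memory module
fused = fuse(h, c)
```

Because the output is a convex mixture, the fused representation stays on the same scale as the encoder states it replaces, which makes it easy to slot between Transformer layers.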



Abstract

The invention discloses a text-level neural machine translation method based on a context memory network. The method comprises the following steps: constructing a Transformer model with a context memory network; constructing a parallel corpus to obtain the word-embedding input of the model; at the encoder end, encoding the source-language input in combination with a context memory module and updating the current encoding representation into the context memory module; at the decoder end, processing the target language in combination with the source-language encoding representation to obtain vector representations of consistent length; performing a softmax normalization on the decoder output to obtain the predicted distribution, thereby completing the training process of the model; and carrying out chapter-level machine translation with the trained model, feeding the text into the model sentence by sentence and obtaining the translation result autoregressively. The added context memory module dynamically maintains context memory information and introduces relevant context information, solving the problem of inconsistent context in translation results.
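The inference procedure in the abstract can be sketched as the loop below: the document is fed to the model sentence by sentence, a memory persists across sentences, and each translation is produced one token at a time. `toy_translate` is a deliberately trivial stand-in for the trained context-memory Transformer, which is not reproduced here.

```python
def toy_translate(sentence: str, memory: list) -> str:
    """Placeholder for the trained model: records the sentence in the shared
    memory (the real model would store its encoding) and 'decodes' the output
    one token at a time, mimicking autoregressive generation."""
    memory.append(sentence)
    out = []
    for tok in sentence.split():    # autoregressive: one token per step,
        out.append(tok.upper())     # each step conditioned on prior output
    return " ".join(out)

def translate_document(sentences):
    """Sentence-by-sentence document translation with persistent context."""
    memory = []                     # carried across sentences -> context
    return [toy_translate(s, memory) for s in sentences]

doc = ["hello world", "goodbye world"]
result = translate_document(doc)    # -> ["HELLO WORLD", "GOODBYE WORLD"]
```

The key point the sketch preserves is that the memory object outlives any single sentence, which is what lets later sentences be translated consistently with earlier ones.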

Description

Technical Field

[0001] The invention relates to neural machine translation technology, and in particular to a text-level neural machine translation method based on a context memory network.

Background

[0002] Machine Translation (MT) is an experimental discipline that uses computers to translate between natural languages: with machine translation technology, a source language can be automatically converted into a target language. As a key technology for eliminating barriers in cross-language communication, machine translation has always been an important part of natural language processing research. Compared with human translation, machine translation is more efficient and lower in cost, which is of great significance for promoting national unity and cultural exchange. Machine translation technology can be summarized as two classes of methods: those based on rationalism and those based on empiricism. Since it was proposed in the 1940s, machine translation has expe...

Claims


Application Information

IPC(8): G06F40/58; G06F40/56
Inventor: 杜权, 朱靖波, 肖桐, 张春良
Owner: 沈阳雅译网络技术有限公司