
Method for fusing bilingual predefined translation pair into neural machine translation model

A machine-translation technology using predefined translation pairs, applied in the field of neural machine translation. It addresses the problems that predefined phrases cannot be guaranteed to appear in the output, that translation speed is reduced, and that model modifications are complex, and achieves the effect of increasing the likelihood that predefined phrases are successfully translated.

Pending Publication Date: 2019-09-10
SUZHOU UNIV
Cites: 3 · Cited by: 6

AI Technical Summary

Problems solved by technology

Modifying the model architecture can address the problem, but it cannot guarantee that the predefined phrases will actually appear on the target side, and the modification is complicated and difficult to reproduce.
Methods that modify beam search (Beam Search) do not require changing the network structure, but when generating each word the decoder must decide, based on the current partial translation, whether to use information from the bilingual predefined translation pair, which greatly reduces translation speed.
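To illustrate the per-word overhead described above, here is a minimal sketch (not the patent's method) of the kind of per-step check a lexically constrained beam search must perform: at every decoding step, each hypothesis consults its unfinished constraints before expanding. All function and variable names are hypothetical.

```python
# Illustrative sketch of a per-step constraint check in lexically
# constrained beam search. Each decoding step must scan every
# predefined target phrase, which is the per-word cost the text
# says greatly reduces translation speed.

def expand_hypothesis(hyp_tokens, constraints, topk_next):
    """Return candidate next tokens for one beam hypothesis.

    hyp_tokens  -- target tokens generated so far
    constraints -- list of predefined target phrases (token lists)
                   that must appear in the output
    topk_next   -- the model's top-k next-token proposals
    """
    candidates = list(topk_next)
    for phrase in constraints:
        # Is a prefix of this phrase already at the hypothesis tail?
        for done in range(len(phrase), 0, -1):
            if hyp_tokens[-done:] == phrase[:done] and done < len(phrase):
                candidates.append(phrase[done])  # continue the phrase
                break
        else:
            candidates.append(phrase[0])  # option: start the phrase now
    return candidates
```

Because this scan runs once per hypothesis per step, decoding cost grows with the number and length of the predefined pairs, unlike the tagging approach the patent proposes.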

Method used



Examples


Embodiment Construction

[0026] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments, so that those skilled in the art can better understand the present invention and implement it, but the examples given are not intended to limit the present invention.

[0027] Background: NMT model based on attention mechanism (attention)

[0028] Neural machine translation systems generally use an end-to-end model consisting of an encoder and a decoder. During translation, the input sentence is converted by the encoder into a sentence representation, which is fed to the decoder. The decoder takes the encoder output, combines it with other mechanisms (such as the attention mechanism), and outputs one word at a time; each output word is fed back into the decoder as the input for generating the next word, and this continues until the translation of the sentence ends.
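The decoding loop described above can be sketched as follows. This is a minimal greedy-decoding illustration, not the patent's implementation; `encoder` and `decoder_step` are hypothetical stand-ins for an attention-based NMT model.

```python
# Minimal sketch of the encoder-decoder decoding loop: the encoder
# produces a sentence representation, and the decoder emits one word
# per step, feeding each output word back in as the next input.

def greedy_translate(src_tokens, encoder, decoder_step,
                     bos="<s>", eos="</s>", max_len=50):
    enc_states = encoder(src_tokens)      # sentence representation
    output, prev = [], bos
    for _ in range(max_len):
        # the decoder attends over enc_states and consumes the
        # previously generated word
        word = decoder_step(enc_states, output, prev)
        if word == eos:                   # translation of the sentence ends
            break
        output.append(word)
        prev = word                       # fed back as the next input
    return output
```

In a real system `decoder_step` would return a probability distribution and beam search would track several hypotheses; greedy selection is used here only to keep the loop readable.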

[0029] T...



Abstract

The invention discloses a method for integrating a bilingual predefined translation pair into a neural machine translation model. The method is applied to an attention-based NMT model using an encoder-decoder framework. Its goal is to fuse a bilingual predefined translation pair (p, q) into the model, where p appears in a source-side sentence and must be correctly translated into q, which then appears in the target-side sentence, while the other words on the source side are also translated correctly. The method introduces samples into the neural machine translation model to guide its translation, with the advantage that the model learns this pattern through a tagging method, establishing a correspondence between the predefined source-side and target-side phrases and increasing the possibility that they are successfully translated.
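The abstract's "tagging method" can be illustrated with a small sketch. This is an assumption about how such tagging might look in data preprocessing, not the patent's actual scheme; the tag tokens (`<p>`, `<q>`, `</q>`) and the function name are hypothetical.

```python
# Hedged sketch of tagging a predefined pair (p, q) into the source
# sentence: the source occurrence of p is wrapped in marker tokens
# together with its predefined translation q, so the model can learn
# to copy q to the target side. Tag names are illustrative only.

def tag_source(src, p, q, open_tag="<p>", mid_tag="<q>", close_tag="</q>"):
    """Replace phrase p in the source sentence with a tagged span
    that also carries the predefined translation q."""
    if p not in src:
        return src
    tagged = f"{open_tag} {p} {mid_tag} {q} {close_tag}"
    return src.replace(p, tagged, 1)
```

Training on sentences tagged this way lets the model associate the marked source span with the injected target phrase without any change to the network architecture or the beam search procedure.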

Description

technical field [0001] The invention relates to the field of neural machine translation, and in particular to a method for integrating bilingual predefined translation pairs into a neural machine translation model. Background technique [0002] Machine translation (Machine Translation, MT) is an important area of natural language processing (NLP) that aims to use machines to translate one language into another. After years of development, MT has evolved from rule-based methods to statistical methods, and then to today's neural network-based neural machine translation (Neural Machine Translation, NMT). Like many other mainstream NLP tasks, NMT generally adopts a sequence-to-sequence structure (Sequence to sequence, seq2seq) consisting of an encoder and a decoder. The encoder encodes the source sentence into a vector representation, and the decoder then generates the corresponding translation word by word according to t...

Claims


Application Information

IPC(8): G06F17/28, G06N20/00
CPC: G06N20/00, G06F40/58
Inventor: 熊德意, 王涛
Owner SUZHOU UNIV