Scarce resource neural machine translation training method based on pre-training

A machine translation training technology, applied in the field of neural machine translation training, that addresses the poor translation quality of neural machine translation models under scarce bilingual data, avoids polysemy problems, simplifies the training process, and improves robustness.

Active Publication Date: 2020-05-19
Assignee: 沈阳雅译网络技术有限公司
9 Cites · 12 Cited by

AI Technical Summary

Problems solved by technology

[0008] In many real-world scenarios, machine translation systems lack sufficient bilingual corpora, which leads to poor translation quality from neural machine translation models. The technical problem to be solved by the present invention is to provide a pre-training-based neural machine translation training method for scarce resources. When bilingual corpora are insufficient, the method makes full use of monolingual corpora to pre-train a masked language model and other tasks, and then integrates the information extracted by the pre-trained model into the neural machine translation model, significantly improving the model's translation quality.




Embodiment Construction

[0052] The present invention will be further elaborated below in conjunction with the accompanying drawings.

[0053] The method of the invention optimizes the training process of scarce-resource machine translation by integrating knowledge from a pre-trained model. Without adding bilingual data, it uses massive monolingual data to pre-train a language model and integrates the pre-trained model's information into the neural machine translation model, aiming to reduce machine translation's dependence on bilingual corpora and to achieve high-quality translation in scarce-resource scenarios.
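The integration idea above can be illustrated with a minimal, self-contained sketch. All names and sizes below (lm_params, nmt_params, the 1000×64 matrices, the integrate helper) are hypothetical stand-ins, not the patent's actual implementation; the sketch only shows the core constraint that the NMT model's embedding and hidden layers must match the pre-trained language model's sizes so that its parameters can be copied in directly.

```python
import random

random.seed(0)

def rand_matrix(rows, cols):
    """Random Gaussian initialisation, standing in for real model weights."""
    return [[random.gauss(0.0, 0.02) for _ in range(cols)] for _ in range(rows)]

VOCAB, EMB, HID = 1000, 64, 64  # hypothetical sizes; the method only requires
                                # that embedding/hidden sizes match the LM's

# Stand-in for a language model pre-trained on massive monolingual data.
lm_params = {
    "embedding": rand_matrix(VOCAB, EMB),
    "hidden_w": rand_matrix(EMB, HID),
}

# Randomly initialised NMT model; embedding and hidden layers are sized
# to match the pre-trained LM so parameters can be transferred directly.
nmt_params = {
    "embedding": rand_matrix(VOCAB, EMB),
    "hidden_w": rand_matrix(EMB, HID),
    "decoder_w": rand_matrix(HID, VOCAB),  # NMT-only layer, stays random
}

def shape(m):
    return (len(m), len(m[0]))

def integrate(nmt, lm):
    """Overwrite matching NMT parameters with the pre-trained LM's values,
    checking that the layer sizes agree as the method requires."""
    for name, value in lm.items():
        assert shape(nmt[name]) == shape(value), f"size mismatch: {name}"
        nmt[name] = [row[:] for row in value]  # copy, don't alias
    return nmt

nmt_params = integrate(nmt_params, lm_params)
```

After integration, the shared layers carry knowledge distilled from monolingual data, while NMT-only layers (here, the hypothetical decoder projection) remain randomly initialised and are learned from the parallel corpus.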

[0054] The present invention proposes a pre-training-based machine translation training method for scarce resources, comprising the following steps:

[0055] 1) Construct a massive monolingual corpus, perform word segmentation and sub-word segmentation preprocessing, and use the monolingual corpus to pre-train a language model base...
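Sub-word segmentation in step 1 is commonly implemented with byte-pair encoding (BPE); the patent does not name a specific algorithm, so the toy implementation below is an illustrative assumption, with learn_bpe and segment as hypothetical helper names. It learns merge rules from word frequencies, then applies them to split an unseen word into sub-word units.

```python
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    """Learn BPE merge rules from a {word: frequency} dict.
    Words are represented as symbol tuples ending with '</w>'."""
    vocab = Counter()
    for word, freq in word_freqs.items():
        vocab[tuple(word) + ("</w>",)] = freq
    merges = []
    for _ in range(num_merges):
        # Count all adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for syms, freq in vocab.items():
            for a, b in zip(syms, syms[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the best merge everywhere in the vocabulary.
        new_vocab = Counter()
        for syms, freq in vocab.items():
            out, i = [], 0
            while i < len(syms):
                if i < len(syms) - 1 and (syms[i], syms[i + 1]) == best:
                    out.append(syms[i] + syms[i + 1]); i += 2
                else:
                    out.append(syms[i]); i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

def segment(word, merges):
    """Split a word into sub-word units using learned merge rules."""
    syms = list(word) + ["</w>"]
    for a, b in merges:
        out, i = [], 0
        while i < len(syms):
            if i < len(syms) - 1 and (syms[i], syms[i + 1]) == (a, b):
                out.append(syms[i] + syms[i + 1]); i += 2
            else:
                out.append(syms[i]); i += 1
        syms = out
    return syms
```

Sub-word units let the pre-trained language model and the NMT model share an open vocabulary, so rare words are decomposed into known pieces rather than mapped to an unknown token.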



Abstract

The invention discloses a pre-training-based scarce resource neural machine translation training method, comprising the following steps: constructing a massive monolingual corpus and performing word segmentation and sub-word segmentation preprocessing to obtain converged model parameters; constructing a parallel corpus and randomly initializing the parameters of a neural machine translation model, making the sizes of the word embedding layer and hidden layer of the neural machine translation model the same as those of the pre-trained language model; integrating the pre-trained model into the neural machine translation model; training the neural machine translation model on the parallel corpus so that the generated target sentences more closely match the reference translations, completing the training process; and feeding a source sentence input by the user into the neural machine translation model, which generates a translation result through greedy search or beam search. The method makes full use of the knowledge in monolingual data and, compared with a randomly initialized neural machine translation model, can significantly improve translation performance.
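The final decoding step mentions greedy search and beam search. The sketch below contrasts the two on a hypothetical toy next-token distribution (step_probs is an invented stand-in for the trained model, which in reality also conditions on the source sentence); it is not the patent's decoder, only an illustration of why beam search can find a higher-probability output than greedy search.

```python
import math

def step_probs(prefix):
    """Toy next-token distribution over a tiny vocabulary (hypothetical)."""
    table = {
        (): {"a": 0.55, "b": 0.45},
        ("a",): {"x": 0.6, "y": 0.4},
        ("b",): {"c": 1.0},
        ("b", "c"): {"<eos>": 1.0},
    }
    return table.get(tuple(prefix), {"<eos>": 1.0})

def greedy_decode(max_len=5):
    """Pick the single most probable token at each step."""
    out = []
    for _ in range(max_len):
        probs = step_probs(out)
        tok = max(probs, key=probs.get)
        if tok == "<eos>":
            break
        out.append(tok)
    return out

def beam_decode(beam_size=2, max_len=5):
    """Keep the beam_size best partial hypotheses by total log-probability."""
    beams = [([], 0.0)]  # (tokens, accumulated log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for toks, score in beams:
            for tok, p in step_probs(toks).items():
                cand = (toks + [tok], score + math.log(p))
                (finished if tok == "<eos>" else candidates).append(cand)
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
        if not beams:
            break
    finished += beams
    best = max(finished, key=lambda c: c[1])[0]
    return [t for t in best if t != "<eos>"]
```

On this toy distribution greedy decoding commits to "a" (probability 0.55) and ends with total probability 0.33, while beam search keeps the "b" hypothesis alive and finds "b c" with total probability 0.45, illustrating the local-optimum trap that beam search mitigates.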

Description

Technical Field

[0001] The invention relates to a neural machine translation training method, in particular to a pre-training-based scarce resource neural machine translation training method.

Background

[0002] Neural machine translation technology has made great progress. Compared with earlier rule-based and statistics-based machine translation models, neural machine translation achieves better translation quality and more fluent output. However, neural machine translation is extremely dependent on data. Given sufficient training data, i.e., bilingual corpora of the source and target languages, the translation model can achieve good translation quality; but when training data is scarce, the model cannot achieve the desired effect. Scarce-resource scenarios can generally be divided into language-data scarcity and domain-data scarcity. Languages with sufficient bilingual ...

Claims


Application Information

IPC(8): G06F40/58, G06N3/04, G06N3/08
CPC: G06N3/084, G06N3/048, G06N3/045, Y02D10/00
Inventors: 杜权, 朱靖波, 肖桐, 张春良
Owner: 沈阳雅译网络技术有限公司