Multi-mechanism attention merging multi-path neural machine translation method

A machine translation and attention technology, applied to neural learning methods, neural architectures, natural language translation, etc.; it addresses the problem that attention computed by different generation mechanisms is inconsistent, and achieves a good translation effect.

Pending Publication Date: 2021-02-05
KUNMING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

Different attention generation mechanisms produce inconsistent attention values for the same input, and the attention generated by any single mechanism cannot fully and accurately reflect the theoretical attention inherent in the language.



Examples


Embodiment 1

[0032] Embodiment 1: In this example, a German-English parallel corpus is used as the translation corpus, and the selected translation mechanisms are the CNN translation mechanism, the Transformer translation mechanism, and the Tree-Transformer translation mechanism.

[0033] As shown in Figures 1-2, the multi-path neural machine translation method with multi-mechanism merged attention comprises the following specific steps:

[0034] Model building process:

[0035] Step1. Download the German and English corpora from the website, and determine the multiple translation mechanisms to be used;

[0036] Step2. Preprocess the training corpus: use MOSES to perform tokenization, lowercasing and data cleaning on the bilingual corpus, keeping only sentence pairs shorter than 175 tokens, and then apply the BPE algorithm to segment all the preprocessed data into subword units (a minimal filtering sketch is given after these steps);

[0037] Step3. Generate the training set, validation set and test set: randomly ext...
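As a minimal illustration of the length filtering and lowercasing in Step2 above, the following Python sketch assumes whitespace-separated parallel files with hypothetical names train.de / train.en; MOSES tokenization and cleaning, as well as BPE segmentation (e.g. with the subword-nmt toolkit), would still be run separately and are not reproduced here.

```python
# Minimal corpus-cleaning sketch for Step2 (illustrative only).
# Assumes whitespace-tokenized parallel text files; file names are hypothetical.

MAX_LEN = 175  # keep only sentence pairs shorter than 175 tokens, as in Step2

def clean_parallel(src_path, tgt_path, out_src, out_tgt, max_len=MAX_LEN):
    with open(src_path, encoding="utf-8") as fs, \
         open(tgt_path, encoding="utf-8") as ft, \
         open(out_src, "w", encoding="utf-8") as out_s, \
         open(out_tgt, "w", encoding="utf-8") as out_t:
        for src, tgt in zip(fs, ft):
            src, tgt = src.strip().lower(), tgt.strip().lower()
            # drop empty lines and over-long sentence pairs
            if not src or not tgt:
                continue
            if len(src.split()) >= max_len or len(tgt.split()) >= max_len:
                continue
            out_s.write(src + "\n")
            out_t.write(tgt + "\n")

if __name__ == "__main__":
    clean_parallel("train.de", "train.en", "train.clean.de", "train.clean.en")
```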



Abstract

The invention relates to a multi-path neural machine translation method with multi-mechanism attention merging, and belongs to the field of natural language processing. In the invention, a CNN translation mechanism, a Transformer translation mechanism and a Tree-Transformer translation mechanism each independently generate their own attention values; the calculated attention values are weighted and accumulated, then aligned and normalized to form new attention values, which are transmitted to the Dec-Enc attention layer of the decoder, and all translation mechanisms complete the subsequent machine translation process to obtain a decoded key-value matrix. The decoded key-value matrices generated by the individual mechanisms are then weighted, stacked and normalized, and the target translation is generated through a linear transformation layer and a softmax layer. The multi-mechanism attention superposition and normalization process can effectively integrate the analysis capabilities of multiple algorithms, so that the formed attention is closer to the theoretical real attention, a better translation effect is obtained, and translation accuracy is effectively improved.
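As a rough sketch of the merging described in the abstract (not the patent's reference implementation), the snippet below uses NumPy with toy shapes and hypothetical mechanism weights: the attention matrices of the three mechanisms are weighted, accumulated and row-normalized with softmax to form the merged attention, and the same weighted-combination idea is applied to the decoded key-value matrices before a linear transformation and softmax produce token probabilities.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def merge_attention(attn_list, weights):
    """Weighted accumulation of per-mechanism attention matrices,
    followed by row-wise normalization to form the merged attention."""
    merged = sum(w * a for w, a in zip(weights, attn_list))
    return softmax(merged, axis=-1)

def merge_decoder_outputs(dec_list, weights):
    """Weighted stacking and normalization of the decoded key-value matrices."""
    merged = sum(w * d for w, d in zip(weights, dec_list))
    return merged / sum(weights)

# Toy example: 3 mechanisms (CNN, Transformer, Tree-Transformer),
# target length 4, source length 6, hidden size 8, vocabulary size 1000.
rng = np.random.default_rng(0)
attn_cnn, attn_trf, attn_tree = (softmax(rng.normal(size=(4, 6))) for _ in range(3))
mech_weights = [0.3, 0.4, 0.3]                  # hypothetical mechanism weights
merged_attn = merge_attention([attn_cnn, attn_trf, attn_tree], mech_weights)
# merged_attn would be fed to the decoder's Dec-Enc attention layer.

dec_outputs = [rng.normal(size=(4, 8)) for _ in range(3)]
merged_dec = merge_decoder_outputs(dec_outputs, mech_weights)
W_out = rng.normal(size=(8, 1000))              # toy linear projection to the vocabulary
token_probs = softmax(merged_dec @ W_out)       # softmax over the vocabulary
```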

Description

Technical field

[0001] The invention relates to a multi-path neural machine translation method with multi-mechanism merged attention, and belongs to the field of natural language processing.

Background technique

[0002] Machine translation refers to the process of using computers to translate sentences in one language (source-language sentences) into sentences with the same meaning in another language (target-language sentences), and has become an important research direction in the field of artificial intelligence.

[0003] In the prior art, Gehring et al. proposed a CNN translation mechanism, which fully utilizes convolutional neural networks to realize machine translation and uses a convolutional neural network as the working unit of both the encoder and the decoder, where the encoder and the decoder are each composed of stacks of multi-layer convolutional neural networks. On the encoding side, the input sequence is encoded using ...
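To make the background's description of the CNN translation mechanism concrete, the sketch below shows one encoder layer in the style of Gehring et al.'s convolutional sequence-to-sequence model: a 1-D convolution whose output is gated by a GLU and added back through a residual connection, with the full encoder being a stack of such layers. This is an illustrative PyTorch sketch under those assumptions, not the encoder defined in the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvEncoderLayer(nn.Module):
    """One convolutional encoder layer in the ConvS2S style:
    1-D convolution -> GLU gating -> residual connection."""
    def __init__(self, hidden_dim=256, kernel_size=3):
        super().__init__()
        # The convolution outputs 2 * hidden_dim channels so that GLU
        # can split them into values and gates.
        self.conv = nn.Conv1d(hidden_dim, 2 * hidden_dim, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, x):                         # x: (batch, seq_len, hidden_dim)
        residual = x
        h = self.conv(x.transpose(1, 2))          # -> (batch, 2*hidden, seq_len)
        h = F.glu(h, dim=1).transpose(1, 2)       # gate -> (batch, seq_len, hidden)
        return h + residual                       # residual connection

# The multi-layer encoder is simply a stack of such layers:
encoder = nn.Sequential(*[ConvEncoderLayer() for _ in range(4)])
src = torch.randn(2, 10, 256)                     # toy batch of embedded source tokens
enc_out = encoder(src)
```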


Application Information

IPC(8): G06F40/58, G06F40/211, G06F40/289, G06N3/04, G06N3/08
CPC: G06F40/58, G06F40/211, G06F40/289, G06N3/08, G06N3/045
Inventors: 范洪博, 郑棋
Owner: KUNMING UNIV OF SCI & TECH