
Input coding method for constructing modeling unit of neural machine translation model

A machine translation and model-construction technology, applied in the field of machine translation, that addresses problems such as the reduced readability of translation results and the out-of-vocabulary (unknown-word) problem of neural translation models

Pending Publication Date: 2021-09-14
南京汉智文科技有限公司

AI Technical Summary

Problems solved by technology

The appearance of out-of-vocabulary words greatly reduces the readability of translation results; this is the unknown-word (out-of-vocabulary) problem of neural translation models.




Detailed Description of the Embodiments

[0023] The present invention is described in detail below with reference to the accompanying drawings. The input encoding method for constructing the modeling unit of a neural machine translation model is based on an encoder-decoder architecture. The encoder is built entirely on the attention mechanism, without complicated recurrent or convolutional neural networks: it consists of six identical layers, each containing a multi-head self-attention network and a position-wise feed-forward network. The decoder likewise consists of six identical layers. It has the same structure as the LSTM-based decoder of the RNNSearch model, using one layer of LSTM to read the encoder's hidden-layer vectors and predict the target word sequence.
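As a rough illustration of the architecture in [0023], here is a minimal PyTorch sketch. The patent text is ambiguous about the decoder (six identical layers versus an RNNSearch-style LSTM), so the sketch implements the single-layer LSTM reading described above; all class names and sizes (d_model, n_heads, d_ff) are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch of the encoder-decoder in [0023]: a six-layer,
# attention-only encoder paired with an RNNSearch-style LSTM decoder.
# Hyperparameters and layer-norm/residual details are assumptions.
import torch
import torch.nn as nn

class EncoderLayer(nn.Module):
    def __init__(self, d_model=512, n_heads=8, d_ff=2048):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(  # position-wise feed-forward network
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        attn_out, _ = self.self_attn(x, x, x)  # multi-head self-attention
        x = self.norm1(x + attn_out)           # residual + norm (assumed detail)
        return self.norm2(x + self.ffn(x))

class Encoder(nn.Module):
    """Six identical layers; no recurrence or convolution."""
    def __init__(self, d_model=512, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(EncoderLayer(d_model) for _ in range(n_layers))

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x  # hidden-layer vectors read by the decoder

class LSTMDecoder(nn.Module):
    """RNNSearch-style decoder: one LSTM layer attends over encoder states."""
    def __init__(self, vocab_size, d_model=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.attn = nn.MultiheadAttention(d_model, 1, batch_first=True)
        self.lstm = nn.LSTM(2 * d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, prev_tokens, enc_states):
        emb = self.embed(prev_tokens)
        ctx, _ = self.attn(emb, enc_states, enc_states)  # attend to encoder
        h, _ = self.lstm(torch.cat([emb, ctx], dim=-1))
        return self.out(h)  # scores over the target word sequence
```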

[0024] Before the encoder, the word-based modeling unit of the typical attention-based neural machine translation model is reconstructed, and the modeling...



Abstract

The invention discloses an input coding method for constructing the modeling unit of a neural machine translation model. The method is based on an encoder-decoder structure with an attention mechanism and relates to the technical field of neural machine translation for languages without clear word boundaries. In-vocabulary and out-of-vocabulary words encountered by the translation model are processed separately, using entirely different coding schemes. For in-vocabulary words, the word representation is obtained by directly querying a word-vector table. For out-of-vocabulary words, every sentence containing them is split into its corresponding character sequence, word information is automatically synthesized from that character sequence by a bidirectional row-convolution module, and the synthesized word information is fed as input to the subsequent modules of the neural translation encoder.
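A minimal sketch of this hybrid encoding, assuming PyTorch: the in-vocabulary path is a plain embedding-table lookup, while a sentence containing out-of-vocabulary words falls back to its character sequence. Interpreting the "bidirectional row convolution" as a forward plus a reversed 1-D convolution over character embeddings is this sketch's assumption, not the patent's confirmed design; all names and dimensions are hypothetical.

```python
# Hedged sketch of the abstract's hybrid word/character input encoding.
# BiRowConv is one plausible reading of "bidirectional row convolution".
import torch
import torch.nn as nn

class BiRowConv(nn.Module):
    """Runs a 1-D convolution over the character sequence in both directions."""
    def __init__(self, d_char=256, d_word=512, kernel=3):
        super().__init__()
        self.fwd = nn.Conv1d(d_char, d_word // 2, kernel, padding=kernel // 2)
        self.bwd = nn.Conv1d(d_char, d_word // 2, kernel, padding=kernel // 2)

    def forward(self, chars):                      # chars: (batch, len, d_char)
        x = chars.transpose(1, 2)                  # -> (batch, d_char, len)
        f = self.fwd(x)                            # left-to-right pass
        b = self.bwd(x.flip(-1)).flip(-1)          # right-to-left pass
        return torch.cat([f, b], dim=1).transpose(1, 2)  # (batch, len, d_word)

class HybridInputEncoder(nn.Module):
    def __init__(self, word_vocab, char_vocab, d_char=256, d_word=512):
        super().__init__()
        self.word_vocab, self.char_vocab = word_vocab, char_vocab
        self.word_emb = nn.Embedding(len(word_vocab), d_word)
        self.char_emb = nn.Embedding(len(char_vocab), d_char)
        self.birowconv = BiRowConv(d_char, d_word)

    def encode_sentence(self, words):
        if all(w in self.word_vocab for w in words):
            # every word in-vocabulary: direct word-vector table lookup
            ids = torch.tensor([[self.word_vocab[w] for w in words]])
            return self.word_emb(ids)
        # OOV present: split the whole sentence into its character sequence
        chars = list("".join(words))  # natural for languages without word boundaries
        ids = torch.tensor([[self.char_vocab.get(c, 0) for c in chars]])
        return self.birowconv(self.char_emb(ids))  # synthesized word information
```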

Description

Technical Field

[0001] The invention belongs to the technical field of machine translation, and in particular to a word/character mixed-sequence input encoding method for constructing the modeling unit of a neural machine translation model.

Background Technique

[0002] A neural machine translation model with an attention mechanism usually takes words as its modeling unit and uses a neural network to perform the conversion from source language to target language. It must maintain a dictionary (a set of common words) at both the encoder and the decoder, used to index the source-language and target-language words respectively. Due to resource constraints, a dictionary can contain only a limited number of entries and cannot be expanded without limit. Words not in the dictionary are called out-of-vocabulary words and are usually represented by "UNK" (unknown word). The appearance of out-of-vocabulary words greatly reduces the readability of translation results; this is the unknown-word problem of the neural translation model.
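To make the out-of-vocabulary problem in [0002] concrete, here is a tiny, made-up illustration of dictionary indexing with a fixed vocabulary, where every word missing from the dictionary collapses to the single "UNK" token:

```python
# Hypothetical illustration of the OOV problem: the vocabulary and
# sentence are invented for the example, not taken from the patent.
UNK = 0
vocab = {"<unk>": UNK, "the": 1, "cat": 2, "sat": 3}  # tiny fixed dictionary

def to_ids(words):
    return [vocab.get(w, UNK) for w in words]

print(to_ids(["the", "cat", "sat", "quietly"]))  # -> [1, 2, 3, 0]; "quietly" becomes UNK
```

Because all distinct unknown words map to the same index, their identity is lost before translation even begins, which is the readability loss the invention aims to avoid.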


Application Information

IPC(8): G06F40/126, G06F40/284, G06F40/289, G06F40/30, G06F40/58, G06N3/04, G06N3/08
CPC: G06F40/126, G06F40/58, G06F40/30, G06F40/284, G06F40/289, G06N3/08, G06N3/044
Inventors: 袁仲达, 滕俊平
Owner: 南京汉智文科技有限公司