Hierarchical structure-based neural-network machine translation model

A hierarchical-structure neural-network technology in the field of neural-network machine translation, addressing problems such as the high computational cost of the attention mechanism, alignment divergence, and gradient explosion.

Inactive Publication Date: 2017-12-01
XIAMEN UNIV


Problems solved by technology

[0004] Because it is based on a recurrent neural network, the neural-network machine translation model faces two problems: 1) gradient explosion and gradient vanishing. Although existing variant models can alleviate this to some extent, it remains difficult to model long-distance context information when encoding long sequences; 2) the attention mechanism is computationally expensive and its alignments tend to diverge.

Method used



Examples


Specific embodiment

[0036] In the first step, word-level alignment information serves as the constraint for segmenting sentences into short clauses, and a classifier is trained on the resulting clause-segmented training data;
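The clause-segmentation constraint above can be sketched as follows. This is a minimal reconstruction, not the patent's exact algorithm: it assumes alignments are given as `(src_idx, tgt_idx)` pairs (e.g. parsed from GIZA++ output) and keeps a candidate split at a punctuation mark only when no alignment link crosses it, i.e. the two clauses translate monotonically.

```python
def is_monotone_split(alignments, split):
    """A split at source position `split` is monotone if every target word
    aligned to the left clause precedes every target word aligned to the
    right clause (no alignment link crosses the boundary)."""
    left_tgt = [t for s, t in alignments if s < split]
    right_tgt = [t for s, t in alignments if s >= split]
    if not left_tgt or not right_tgt:
        return True  # one side unaligned: no crossing evidence
    return max(left_tgt) < min(right_tgt)

def segment_by_punct(tokens, alignments, punct=(",", ";", ":")):
    """Split the source sentence at punctuation positions that satisfy
    the monotone-alignment constraint."""
    clauses, start = [], 0
    for i, tok in enumerate(tokens):
        if tok in punct and is_monotone_split(alignments, i + 1):
            clauses.append(tokens[start:i + 1])
            start = i + 1
    if start < len(tokens):
        clauses.append(tokens[start:])
    return clauses

src = ["he", "came", ",", "and", "he", "left"]
# toy alignment: target word order mirrors the source order
align = [(0, 0), (1, 1), (3, 2), (4, 3), (5, 4)]
print(segment_by_punct(src, align))
# [['he', 'came', ','], ['and', 'he', 'left']]
```

The clauses produced this way would then supply positive training examples for the clause classifier mentioned in the step.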

[0037] In the second step, a bottom-level recurrent neural network encodes the word-level information within each clause to obtain a semantic representation of the clause;

[0038] In the third step, a high-level recurrent neural network encodes the information across clauses to obtain an overall semantic representation of the sentence;
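Steps two and three together form a word-clause-sentence hierarchy. The following is a toy NumPy sketch of that hierarchical encoder, under the assumption of a plain GRU cell with randomly initialized weights and random stand-in word embeddings; a real implementation would use learned parameters and a deep-learning framework.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # toy embedding / hidden size

def gru_params(d):
    # one weight matrix per gate, acting on the concatenation [h; x]
    return {g: rng.normal(0, 0.1, (d, 2 * d)) for g in ("z", "r", "h")}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_encode(xs, p):
    """Run a GRU over a sequence of vectors; return the final hidden state."""
    h = np.zeros(D)
    for x in xs:
        hx = np.concatenate([h, x])
        z = sigmoid(p["z"] @ hx)                                # update gate
        r = sigmoid(p["r"] @ hx)                                # reset gate
        h_tilde = np.tanh(p["h"] @ np.concatenate([r * h, x]))  # candidate
        h = (1 - z) * h + z * h_tilde
    return h

bottom, top = gru_params(D), gru_params(D)

# toy sentence: three clauses of 4, 3, and 5 word embeddings
sentence = [[rng.normal(size=D) for _ in range(n)] for n in (4, 3, 5)]

# bottom-level GRU: one semantic vector per clause (step two)
clause_vecs = [gru_encode(clause, bottom) for clause in sentence]
# top-level GRU over the clause vectors: sentence representation (step three)
sentence_vec = gru_encode(clause_vecs, top)
print(sentence_vec.shape)  # (8,)
```

The point of the hierarchy is that each GRU only has to bridge short spans: the bottom GRU runs over a few words per clause, the top GRU over a few clauses per sentence, which is how the model sidesteps the long-sequence problem described in [0004].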

[0039] In the fourth step, during decoding, the bottom-level attention mechanism attends to the alignment of words within clauses while the high-level attention mechanism attends to the alignment of the clauses themselves; the resulting translation probability is used as the objective function to train the whole neural-network machine translation model.
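The two-level attention in this step can be sketched as below. This is our own simplified reconstruction: it uses dot-product scores instead of the learned additive scores the model would actually train, random stand-in encoder states, and it combines the per-clause word contexts by weighting them with the clause-level attention weights (an assumption, since the patent text does not spell out the combination).

```python
import numpy as np

rng = np.random.default_rng(1)
D = 8  # toy hidden size

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    """Dot-product attention: weights over `keys`, plus their weighted sum."""
    w = softmax(keys @ query)
    return w, w @ keys

# stand-in encoder outputs: word states per clause (bottom level)
word_states = [rng.normal(size=(n, D)) for n in (4, 3, 5)]
# one representation per clause (top level), here a simple mean
clause_states = np.stack([ws.mean(axis=0) for ws in word_states])

s = rng.normal(size=D)  # current decoder hidden state

# high-level attention: which clause is currently being translated
clause_w, _ = attend(s, clause_states)
# bottom-level attention inside each clause, gated by its clause weight
context = sum(cw * attend(s, ws)[1] for cw, ws in zip(clause_w, word_states))
print(context.shape)  # (8,)
```

The decoder would feed this context vector into its next-word distribution; the product of those word probabilities is the translation probability used as the training objective.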

[0040] T...



Abstract

The invention discloses a hierarchical structure-based neural-network machine translation model, relating to natural language processing based on deep learning. The word-alignment tool GIZA++ performs word alignment on parallel training sentence pairs, and the source-language sentences are then divided into monotonically translated clauses according to punctuation and the word-alignment information; the resulting clause data are used to train a clause classifier; the source-language sentences of the parallel sentence pairs are modeled hierarchically; and the target-language sentences of the parallel sentence pairs are decoded hierarchically. Sentences are divided into monotonically translated clauses, after which word-clause-sentence hierarchical modeling, attention, and decoding are carried out: a bottom-layer recurrent neural network (GRU) encodes the semantic representations of the clauses, an upper-layer recurrent neural network encodes sentence-level information, the bottom-layer attention mechanism handles word-level alignment inside the clauses, and the upper-layer attention mechanism handles clause-level alignment.

Description

technical field

[0001] The present invention relates to natural language processing based on deep learning, and in particular to a neural-network machine translation model based on hierarchical structure.

Background technique

[0002] Natural language processing is an important research direction of artificial intelligence within computer science. It studies how to enable effective communication between humans and computers using natural language, and is a subject integrating linguistics, computer science, and mathematics. Within it, neural machine translation is a very important task. Existing neural-network machine translation mainly comprises two recurrent neural networks and a contextual-semantic generation model based on the attention mechanism: one recurrent neural network (called the encoder) learns the semantic representation of the input sentence, while another recurrent neural network (called the decoder) combines contextual semantic representations generated b...
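The single-level attention mechanism of the standard encoder-decoder described in this background can be sketched as follows, using dot-product scores and random stand-in encoder states (a simplification; the classic model uses a learned additive score). This is the baseline the invention's two-level attention refines.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 8  # toy hidden size

# stand-in encoder output: one hidden state per source word
H = rng.normal(size=(6, D))
s = rng.normal(size=D)  # decoder state at the current target position

# attention: score each source state against the decoder state,
# normalize with softmax, take the weighted sum as the context vector
scores = H @ s
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()
context = alpha @ H
print(context.shape)  # (8,)
```

Because every decoder step scores every source word, this flat attention is what incurs the computational cost and alignment divergence cited in [0004]; the hierarchical model restricts word-level attention to within clauses.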

Claims


Application Information

IPC(8): G06F17/28; G06N3/02
CPC: G06F40/58; G06N3/02
Inventors: 苏劲松, 曾嘉莉, 尹永竞
Owner XIAMEN UNIV