Sequence labeling method based on multi-head self-attention mechanism

A technology relating to sequence labeling and attention mechanisms, applied in neural learning methods, computer components, natural language data processing, etc.

Pending Publication Date: 2021-02-19
STATE GRID TIANJIN ELECTRIC POWER +1

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to provide a sequence labeling method based on a multi-head self-attention mechanism.



Examples


Example Embodiment

[0069] Example 1

[0070] The present invention first uses a bidirectional long short-term memory network (BLSTM) to learn the contextual semantic features of the words in the text. Then, based on the hidden representations learned by the BLSTM, a multi-head self-attention mechanism models the semantic relationship between any two words in the text, yielding the global semantics that each word should attend to. To fully exploit the complementarity of local context semantics and global semantics, the present invention designs three feature fusion methods to fuse the two kinds of semantics, and based on the fused features a conditional random field (CRF) model predicts the label sequence.
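
The pipeline above can be summarized in a short PyTorch sketch. This is a minimal illustration under our own assumptions, not the patent's exact implementation: the class name AttnSequenceLabeler, all layer sizes, the concatenation-based fusion (just one plausible instance of the three fusion methods mentioned), and the use of the third-party pytorch-crf package for the CRF layer are ours.

    import torch
    import torch.nn as nn
    from torchcrf import CRF  # third-party package: pip install pytorch-crf

    class AttnSequenceLabeler(nn.Module):
        def __init__(self, vocab_size, emb_dim=100, hidden_dim=200, num_heads=4, num_tags=9):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Step 1: BLSTM encodes the local context semantics of each word.
            self.blstm = nn.LSTM(emb_dim, hidden_dim // 2,
                                 batch_first=True, bidirectional=True)
            # Step 2: multi-head self-attention relates any two words in the text,
            # producing the global semantics each word should attend to.
            self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
            # Step 3: fusion of local and global semantics; concatenation + projection
            # is one possible scheme (the patent mentions three fusion methods).
            self.fuse = nn.Linear(2 * hidden_dim, hidden_dim)
            self.emit = nn.Linear(hidden_dim, num_tags)
            # Step 4: CRF captures the dependencies between adjacent labels.
            self.crf = CRF(num_tags, batch_first=True)

        def _features(self, word_ids):
            local, _ = self.blstm(self.embed(word_ids))    # (B, N, H) local context semantics
            glob, _ = self.self_attn(local, local, local)  # (B, N, H) global semantics
            fused = torch.tanh(self.fuse(torch.cat([local, glob], dim=-1)))
            return self.emit(fused)                        # per-word tag scores (emissions)

        def loss(self, word_ids, tags, mask):
            # Negative CRF log-likelihood of the gold tag sequence.
            return -self.crf(self._features(word_ids), tags, mask=mask)

        def predict(self, word_ids, mask):
            # Viterbi decoding of the best label sequence.
            return self.crf.decode(self._features(word_ids), mask=mask)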

Example Embodiment

[0071] Example 2

[0072] The present invention mainly adopts deep learning technology and related natural language processing methods to realize sequence labeling tasks. To ensure normal operation of the system, the computer platform used should have no less than 8 GB of memory, no fewer than 4 CPU cores with a main frequency of no less than 2.6 GHz, a GPU environment, and a Linux operating system, with Python 3.6 or above, PyTorch 0.4 or above, and other necessary software installed.
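
As a small, hedged aid for the requirements above, the following Python snippet checks the runtime; the function name check_environment and the Linux-specific sysconf memory probe are our own choices, not part of the patent.

    import os
    import sys
    import torch

    def check_environment():
        # Thresholds mirror the requirements stated above.
        assert sys.version_info >= (3, 6), "Python 3.6 or above required"
        assert os.cpu_count() is not None and os.cpu_count() >= 4, \
            "at least 4 CPU cores required"
        # Total physical memory in GB (Linux-only sysconf names).
        mem_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024 ** 3
        assert mem_gb >= 8, "at least 8 GB of memory required"
        print("PyTorch:", torch.__version__)
        print("CUDA GPU available:", torch.cuda.is_available())

    check_environment()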

[0073] As shown in Figure 1, the sequence labeling method based on the multi-head self-attention mechanism provided by the present invention mainly includes the following steps, executed in sequence:

[0074] Step 1. Local context semantic coding: use a bidirectional long short-term memory network (BLSTM) to sequentially learn the local context semantic representation of the words in the text.

[0075] Step 1.1) Use the Stanford NLP toolkit to segment the input text to obtain the corresponding word sequence X = {x1, x2, ..., xN}.

Example Embodiment

[0093] Example 3

[0094] The sequence labeling method based on the multi-head self-attention mechanism mainly includes the following steps, executed in order:

[0095] Step 1. Local context semantic coding: use a bidirectional long short-term memory network (BLSTM) to sequentially learn the local context semantic representation of the words in the text.

[0096] Step 1.1, use the Stanford NLP toolkit to segment the input text to obtain the corresponding word sequence X = {x1, x2, ..., xN}.

[0097] For example, given the text "我昨天在天津参加了一场马拉松比赛" ("I took part in a marathon race in Tianjin yesterday"), word segmentation yields the sequence {"我", "昨天", "在", "天津", "参加", "了", "一场", "马拉松", "比赛"} ({"I", "yesterday", "in", "Tianjin", "took part in", "[perfective 了]", "one", "marathon", "race"}).
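
The patent only names "the Stanford NLP toolkit"; assuming its Python incarnation Stanza, the segmentation of this example could be reproduced roughly as follows (the exact token boundaries depend on the model version).

    import stanza

    # stanza.download("zh")  # one-time Chinese model download
    nlp = stanza.Pipeline(lang="zh", processors="tokenize")
    doc = nlp("我昨天在天津参加了一场马拉松比赛")
    words = [word.text for sent in doc.sentences for word in sent.words]
    print(words)
    # expected, roughly: ['我', '昨天', '在', '天津', '参加', '了', '一场', '马拉松', '比赛']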

[0098] Step 1.2, considering that the words in the text usually contain rich morphological features, such as prefix and suffix information, this step uses a bidirectional LSTM (BLSTM) structure over characters to encode, for each word xi in the word sequence, a corresponding character-level vector representation...
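
A hedged sketch of such a character-level encoder follows: a BLSTM runs over the characters of each word, and the final forward and backward hidden states are concatenated into the word's character-level vector, capturing prefix and suffix information. The class name CharBLSTM and all sizes are illustrative, not taken from the patent.

    import torch
    import torch.nn as nn

    class CharBLSTM(nn.Module):
        def __init__(self, num_chars, char_emb_dim=30, char_hidden=50):
            super().__init__()
            self.char_embed = nn.Embedding(num_chars, char_emb_dim)
            self.char_lstm = nn.LSTM(char_emb_dim, char_hidden,
                                     batch_first=True, bidirectional=True)

        def forward(self, char_ids):
            # char_ids: (num_words, max_word_len) character indices of one sentence's words.
            _, (h_n, _) = self.char_lstm(self.char_embed(char_ids))
            # h_n: (2, num_words, char_hidden); concatenate the last forward (prefix-aware)
            # and backward (suffix-aware) states into one vector per word.
            return torch.cat([h_n[0], h_n[1]], dim=-1)  # (num_words, 2 * char_hidden)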



Abstract

The invention discloses a sequence labeling method based on a multi-head self-attention mechanism, comprising the following steps. Step 1, local context semantic coding: sequentially learn the local context semantic representation of the words in a text using a BLSTM. Step 2, global semantic coding: based on the local context representations encoded in step 1, encode a global semantic representation of the words through a multi-head self-attention mechanism. Step 3, semantic feature fusion: fuse the local context representation encoded in step 1 with the global representation encoded in step 2, and take the fusion result as the input semantic feature of step 4. Step 4, sequence labeling: to fully account for the dependencies between labels in a sequence labeling task, use a CRF to predict the labels. Step 5, model training. Step 6, model inference. On top of the recurrent neural network, a multi-head self-attention mechanism is further introduced to learn a global semantic representation of the words, thereby improving the sequence labeling effect.
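
As an illustration of steps 5 and 6, one possible training step and inference call for the AttnSequenceLabeler sketch given under Example 1 is shown below; the batch tensors and hyperparameters are placeholders, not values from the patent.

    import torch

    model = AttnSequenceLabeler(vocab_size=20000, num_tags=9)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    word_ids = torch.randint(0, 20000, (2, 12))   # batch of 2 sentences, 12 words each
    tags = torch.randint(0, 9, (2, 12))           # gold label sequences
    mask = torch.ones(2, 12, dtype=torch.uint8)   # ByteTensor mask expected by pytorch-crf

    # Step 5, model training: minimize the negative CRF log-likelihood.
    optimizer.zero_grad()
    loss = model.loss(word_ids, tags, mask)
    loss.backward()
    optimizer.step()

    # Step 6, model inference: Viterbi-decode the best label sequence.
    model.eval()
    with torch.no_grad():
        best_paths = model.predict(word_ids, mask)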

Description

Technical field

[0001] The invention relates to the technical field of computer applications, and in particular to a sequence labeling method based on a multi-head self-attention mechanism.

Background technique

[0002] Sequence labeling is an important research topic in natural language processing. Its goal is to predict the label sequence corresponding to a given text sequence, and it mainly covers tasks such as named entity recognition (Named Entity Recognition, NER), text chunking (Text Chunking), part-of-speech tagging (Part-Of-Speech, POS), and opinion extraction (Opinion Extraction).

[0003] Most early sequence labeling methods were rule-based. They require the construction of rule templates and a large amount of expert knowledge, consume considerable manpower and material resources, and are hard to extend and port to other fields. For example, Wang Ning et al. used a rule-based method to manually build a knowledge base for financial company name recognition...


Application Information

IPC (8): G06F40/295; G06F40/30; G06F40/126; G06F16/35; G06K9/62; G06N3/04; G06N3/08
CPC: G06F40/295; G06F40/30; G06F40/126; G06F16/35; G06N3/084; G06N3/049; G06N3/045; G06F18/253
Inventor: 孟洁, 李妍, 刘晨, 张倩宜, 王梓蒴, 单晓怡, 李慕轩, 王林, 刘赫, 董雅茹
Owner: STATE GRID TIANJIN ELECTRIC POWER