
A sequence tagging model and method based on a fine-grained word representation model

A sequence tagging technology based on fine-grained word representation, applied in character and pattern recognition, instrumentation, computing, etc.

Active Publication Date: 2021-08-20
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

Although the Attention mechanism has made progress in NER tasks, how to effectively integrate its dynamic and global nature into character-level models still needs further exploration.


Image

  • A sequence tagging model and method based on a fine-grained word representation model


Embodiment Construction

[0065] The specific embodiments discussed are merely illustrative of implementations of the invention, and do not limit the scope of the invention. Embodiments of the present invention will be described in detail below in combination with technical solutions and accompanying drawings.

[0066] To represent the morphological information of words more accurately, the present invention designs Finger, a fine-grained word representation model based on the Attention mechanism. Combining Finger with the BiLSTM-CRF model for sequence tagging tasks achieves ideal results.
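As a rough illustration of an attention-pooled character-level word representation (not the patent's actual Finger parameterization; the query vector, scoring function, and dimensions here are invented for the sketch):

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def char_attention_repr(char_vecs, query):
    """Attention-pooled character representation of one word.

    char_vecs: (num_chars, d) matrix of character embeddings.
    query:     (d,) query vector (a learned parameter in a real model;
               fixed here purely for illustration).
    Returns a single (d,) vector: a weighted sum of character vectors,
    with weights computed globally over all characters of the word.
    """
    scores = char_vecs @ query   # one score per character
    weights = softmax(scores)    # global, dynamic weighting
    return weights @ char_vecs   # (d,) pooled representation

rng = np.random.default_rng(0)
chars = rng.normal(size=(5, 8))  # a 5-character word, 8-dim char embeddings
q = rng.normal(size=8)
v = char_attention_repr(chars, q)
```

Unlike CNN- or last-state-based character models, every character can contribute to the pooled vector, which is the "global and dynamic" property the description emphasizes.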

[0067] 1. Representation stage

[0068] In the representation stage, given a sentence of arbitrary length, the word vector representation and the character vector representation of each word are obtained by formulas (1)-(6), and the word vector and character vector of each word in the sequence are then concatenated.
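The concatenation step above can be sketched as follows; the lookup table, character-level function, and dimensions are illustrative placeholders, not the patent's formulas (1)-(6):

```python
import numpy as np

# Hypothetical toy word-embedding table; a real system would load
# pretrained embeddings and learn the character model jointly.
word_emb = {"rivers": np.ones(4)}

def char_level_vector(word, dim=3):
    # Placeholder standing in for the attention-based character
    # representation; here it just encodes the word length.
    return np.full(dim, len(word), dtype=float)

def represent(word):
    # Final representation: word vector concatenated with char vector.
    return np.concatenate([word_emb[word], char_level_vector(word)])

x = represent("rivers")   # shape (4 + 3,) = (7,)
```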

[0069] 2. Coding stage

[0070] In the enc...
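The encoding stage uses a BiLSTM. As a toy sketch of the bidirectional idea only, a simple (Elman) RNN stands in for LSTM cells, with dimensions invented for the example:

```python
import numpy as np

def rnn_pass(xs, W, U, b):
    """One directional pass of a simple (Elman) RNN over a sequence."""
    h = np.zeros(U.shape[0])
    out = []
    for x in xs:
        h = np.tanh(W @ x + U @ h + b)
        out.append(h)
    return out

def bidirectional_encode(xs, params_f, params_b):
    """Concatenate forward and backward hidden states per position,
    so each token's encoding sees both left and right context."""
    fwd = rnn_pass(xs, *params_f)
    bwd = rnn_pass(xs[::-1], *params_b)[::-1]
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

rng = np.random.default_rng(1)
d_in, d_h, T = 7, 5, 4
mk = lambda: (rng.normal(size=(d_h, d_in)),
              rng.normal(size=(d_h, d_h)),
              np.zeros(d_h))
seq = [rng.normal(size=d_in) for _ in range(T)]
H = bidirectional_encode(seq, mk(), mk())  # T vectors of size 2*d_h
```

The per-position encodings `H` would then be scored by the CRF layer, which models label-transition dependencies over the whole sequence.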



Abstract

The invention provides a sequence tagging model based on a fine-grained word representation model, which is used for sequence tagging tasks and belongs to the fields of computer application and natural language processing. The model structure mainly consists of three parts: a feature representation layer, a BiLSTM layer, and a CRF layer. When using this model for sequence tagging, a character-level word representation model, Finger, based on the attention mechanism is first proposed to fuse morphological information with the character information of words; Finger and the BiLSTM-CRF model then jointly complete the sequence tagging task. The method achieves an F1 score of 91.09% on the CoNLL 2003 dataset in an end-to-end manner without any feature engineering. Experiments show that the Finger model significantly improves the recall of the sequence tagging system and thereby the recognition ability of the model.

Description

Technical field

[0001] The invention belongs to the fields of computer application and natural language processing, and relates to a character-level model based on an attention mechanism and its application in sequence tagging tasks. The invention proposes a sequence tagging model based on a fine-grained word representation model. Its main innovation is a fine-grained, attention-based word representation model that describes the morphological information of words more accurately, globally, and dynamically; a sequence tagging model is then built on this word representation. The resulting model not only has strong sequence tagging ability, but also requires no feature engineering and offers good interpretability.

Background technique

[0002] Sequence tagging tasks such as Part-of-Speech Tagging and Named Entity Recognition (NER) are fundamental work in the field of natural language processing. Taking NER as an example, its mai...
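For the NER example above, sequence tagging output is commonly encoded per token; a minimal sketch, assuming the common BIO scheme (the tokens, tags, and helper function are illustrative, not from the patent):

```python
# Illustrative BIO-tagged sentence: B- opens an entity, I- continues it,
# O marks tokens outside any entity.
tokens = ["John", "lives", "in", "New", "York"]
tags   = ["B-PER", "O", "O", "B-LOC", "I-LOC"]

def extract_entities(tokens, tags):
    """Collect (entity_text, entity_type) spans from a BIO tag sequence."""
    ents, cur, typ = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if cur:
                ents.append((" ".join(cur), typ))
            cur, typ = [tok], tag[2:]
        elif tag.startswith("I-") and cur:
            cur.append(tok)
        else:
            if cur:
                ents.append((" ".join(cur), typ))
            cur, typ = [], None
    if cur:
        ents.append((" ".join(cur), typ))
    return ents

ents = extract_entities(tokens, tags)
# → [("John", "PER"), ("New York", "LOC")]
```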

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F40/30; G06K9/62
CPC: G06F40/30; G06F18/2415
Inventors: Zhang Shaowu, Lin Guanghe, Yang Liang, Lin Hongfei
Owner: DALIAN UNIV OF TECH