
Multi-task named entity recognition method combining text classification

A technology for named entity recognition and text classification, applied in the fields of neural learning methods, character and pattern recognition, and instruments, which addresses the problem of neural network models performing poorly when training data are scarce.

Active Publication Date: 2020-05-29
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

Due to the lack of training data in the biomedical field, neural network models often perform very poorly



Examples


Embodiment

[0140] Taking three public data sets of the cellular component group in the biomedical field (BioNLP13CG, BioNLP13PC and CRAFT) as examples, the above method is applied to these data sets for named entity recognition. The specific parameters and procedures in each step are as follows. Training the text classifier (a code sketch follows these steps):

[0141] 1. Each word in the input sentence is converted into a 128-dimensional word vector by the word embedding module. A sentence of length n can be expressed as x_{1:n} = [x_1; x_2; …; x_n];

[0142] 2. Convolution kernels of three sizes (3, 4 and 5) are used, with 100 kernels of each size; the feature vector constructed from a sentence of length n is denoted as c;

[0143] 3. The pooling layer takes the maximum value of each feature (max pooling);

[0144] 4. All pooled features are concatenated and fed into a fully connected network, and the Softmax function is used for classification to construct the text classifier. When training the text classifier, the batch size is 64, t...
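A minimal PyTorch sketch of the text classifier described in steps 1-4 (128-dimensional embeddings, kernel sizes 3/4/5 with 100 kernels each, max pooling, fully connected layer with Softmax). The vocabulary size, number of classes, sentence length, and use of ReLU are illustrative assumptions rather than values taken from the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    """CNN text classifier: embedding -> parallel convolutions -> max pooling -> softmax."""

    def __init__(self, vocab_size, num_classes, embed_dim=128,
                 kernel_sizes=(3, 4, 5), num_kernels=100):
        super().__init__()
        # Word embedding module: each word becomes a 128-dimensional vector.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One 1-D convolution per kernel size, 100 kernels each.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_kernels, k) for k in kernel_sizes
        )
        # Fully connected layer over the concatenated pooled features.
        self.fc = nn.Linear(num_kernels * len(kernel_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, n) -> embeddings transposed to (batch, embed_dim, n)
        x = self.embedding(token_ids).transpose(1, 2)
        # Max pooling over each feature map for every kernel size, then concatenate (feature c).
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        c = torch.cat(pooled, dim=1)
        # Return logits; softmax/cross-entropy is applied outside for classification.
        return self.fc(c)

# Illustrative usage with a batch size of 64, as in the embodiment.
model = TextCNN(vocab_size=30000, num_classes=3)
logits = model(torch.randint(0, 30000, (64, 40)))  # 64 sentences of length 40
probs = F.softmax(logits, dim=-1)
```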



Abstract

The invention discloses a multi-task named entity recognition method combining text classification. The method comprises the following steps: (1) constructing a text classifier with a convolutional neural network and using it to measure the similarity of texts; (2) selecting a suitable threshold and, for each data set of the auxiliary tasks, deciding whether it participates in updating the sharing-layer parameters by comparing the text classification result with the threshold; (3) concatenating the character vector of the text with a pre-trained word vector to form the input feature vector; (4) in the sharing layer, modeling the input feature vector of each word in the sentence with a bidirectional LSTM to learn the features common to all tasks; and (5) training the tasks in the task layer in turn, passing the output of the sharing layer to the bidirectional LSTM in the main-task or auxiliary-task private layer, decoding the labels of the whole sentence with a linear-chain conditional random field, and labeling the entities in the sentence. Experiments on data sets from multiple biomedical fields show that the method effectively improves named entity recognition in specific fields where corpora are difficult to acquire and labeling costs are high.
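A minimal PyTorch sketch of the architecture described in steps (3)-(5), assuming the third-party pytorch-crf package as the linear-chain CRF. The hidden size, character-vector dimension, task names, frozen word embeddings, and the update_shared flag used to gate sharing-layer updates (step (2)) are illustrative assumptions, not values specified by the patent.

```python
import torch
import torch.nn as nn
from torchcrf import CRF  # third-party pytorch-crf package, assumed installed

class MultiTaskBiLSTMCRF(nn.Module):
    """Sharing-layer BiLSTM plus a private BiLSTM and linear-chain CRF per task."""

    def __init__(self, word_vectors, tag_counts, char_dim=32, hidden_dim=128):
        super().__init__()
        # Step (3): character vectors are concatenated with pre-trained word vectors.
        self.word_emb = nn.Embedding.from_pretrained(word_vectors, freeze=True)
        input_dim = word_vectors.size(1) + char_dim
        # Step (4): the sharing layer models each word with a bidirectional LSTM.
        self.shared_lstm = nn.LSTM(input_dim, hidden_dim,
                                   bidirectional=True, batch_first=True)
        # Step (5): each task owns a private BiLSTM, an emission layer and a CRF.
        self.private_lstm = nn.ModuleDict({
            task: nn.LSTM(2 * hidden_dim, hidden_dim,
                          bidirectional=True, batch_first=True)
            for task in tag_counts
        })
        self.emit = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n) for task, n in tag_counts.items()
        })
        self.crf = nn.ModuleDict({
            task: CRF(n, batch_first=True) for task, n in tag_counts.items()
        })

    def forward(self, word_ids, char_vecs, task, tags=None, mask=None,
                update_shared=True):
        feats = torch.cat([self.word_emb(word_ids), char_vecs], dim=-1)
        shared_out, _ = self.shared_lstm(feats)
        if not update_shared:
            # Step (2): an auxiliary batch that falls below the classifier threshold
            # must not update the sharing-layer parameters, so gradients stop here.
            shared_out = shared_out.detach()
        private_out, _ = self.private_lstm[task](shared_out)
        emissions = self.emit[task](private_out)
        if tags is not None:
            # Training: negative CRF log-likelihood of the gold label sequence.
            return -self.crf[task](emissions, tags, mask=mask, reduction='mean')
        # Inference: Viterbi decoding of the whole sentence.
        return self.crf[task].decode(emissions, mask=mask)
```

In this sketch the caller runs the text classifier from the embodiment on each auxiliary-task batch, compares its output with the chosen threshold, and passes update_shared=False for batches that fall below it, so only the private BiLSTM and CRF of that auxiliary task are updated.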

Description

Technical field

[0001] The invention relates to natural language processing, and in particular to a multi-task named entity recognition method combining text classification.

Background

[0002] Natural Language Processing (NLP) is an interdisciplinary subject that integrates linguistics and computer science. Named Entity Recognition (NER) is a fundamental task in natural language processing that aims to identify proper nouns and meaningful quantitative phrases in natural language text and classify them. With the rise of information extraction and the concept of big data, named entity recognition has attracted increasing attention and has become an important component of natural language processing applications such as public opinion analysis, information retrieval, automatic question answering, and machine translation. How to automatically, accurately, and quickly identify named entities from massive Internet text has gradually become a hot topic in academia and industry. ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F40/216; G06F40/289; G06F40/295; G06N3/04; G06N3/08; G06K9/62
CPC: G06N3/08; G06N3/044; G06N3/045; G06F18/24
Inventors: 庄越挺, 浦世亮, 汤斯亮, 纪睿, 王凯, 吴飞
Owner: ZHEJIANG UNIV