
A method based on multi-step discriminant co-attention model for multi-label text classification

A text classification and multi-label technology, applied in text database clustering/classification, neural learning methods, biological neural network models, etc. It addresses the problems that error accumulation is not alleviated and that an early prediction error affects later predictions, thereby improving representation ability, alleviating error accumulation, and optimizing the training process.

Active Publication Date: 2020-05-15
SHANDONG UNIV

Problems solved by technology

However, this method does not alleviate the problem of error accumulation: a single prediction error will also affect later predictions.



Examples


Embodiment 1

[0095] A method based on a multi-step discriminant Co-Attention model for multi-label text classification, as shown in figure 1, includes the following steps:

[0096] (1) Label data preprocessing: the label sequence is divided into leading labels and to-be-predicted labels. Leading labels are labels that have already been predicted; to-be-predicted labels are new, not-yet-predicted labels. Information fusion is performed on the leading labels and the original text so that the data meet the requirements of multi-step discriminative multi-label classification;
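The split in step (1) can be sketched as follows. The prepend-a-label-token fusion shown here is an illustrative assumption; the patent's actual fusion happens via the attention mechanism described later.

```python
# Sketch of step (1): at each discrimination step, split the label sequence
# into leading labels (already predicted) and to-be-predicted labels, and
# fuse the leading labels with the original text.

def split_labels(label_seq, step):
    """Return (leading_labels, to_be_predicted) at a given prediction step."""
    return label_seq[:step], label_seq[step:]

def fuse_with_text(text_tokens, leading_labels):
    """Fuse leading-label information with the original text by prepending
    label tokens (a hypothetical fusion for illustration only)."""
    return ["<LBL:%s>" % l for l in leading_labels] + list(text_tokens)

labels = ["sports", "politics", "economy"]
text = ["the", "match", "was", "held", "today"]

leading, pending = split_labels(labels, 1)
fused = fuse_with_text(text, leading)
# leading == ["sports"], pending == ["politics", "economy"]
```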

[0097] (2) Word-vector training: word vectors are trained with the skip-gram model in word2vec, so that each word in the original text has a corresponding feature representation in the vector space, and are then used in the downstream tasks of the model;
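The core of skip-gram training in step (2) is predicting context words from a center word. A minimal sketch of how the (center, context) training pairs are generated; in practice one would train with a library such as gensim rather than by hand:

```python
# Sketch of skip-gram pair generation as used by word2vec: every word within
# a symmetric window around the center word becomes a prediction target.

def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs within a symmetric window."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["multi", "label", "text", "classification"], window=1)
# e.g. ("label", "multi") and ("label", "text") are both produced
```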

[0098] (3) Text feature extraction: the original text, after the word-vector training of step (2), is input into a bidirectional LSTM model for an encoding operation, extracting ...
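The bidirectional LSTM encoding of step (3) can be illustrated with a bare numpy forward pass. Real implementations would use a framework layer (e.g. `torch.nn.LSTM` with `bidirectional=True`); the weight shapes and initialization here are illustrative assumptions.

```python
import numpy as np

# Sketch of step (3): encode a word-vector sequence with a bidirectional LSTM
# by running one LSTM forward, one backward, and concatenating hidden states.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(xs, W, U, b, hidden):
    """Single-direction LSTM over xs; returns hidden states of shape (T, hidden)."""
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    outs = []
    for x in xs:
        z = W @ x + U @ h + b            # stacked gate pre-activations
        i, f, o, g = np.split(z, 4)      # input, forget, output, candidate
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        outs.append(h)
    return np.stack(outs)

def bilstm_encode(xs, params_f, params_b, hidden):
    """Concatenate forward and (time-reversed) backward hidden states per step."""
    hf = lstm_forward(xs, *params_f, hidden)
    hb = lstm_forward(xs[::-1], *params_b, hidden)[::-1]
    return np.concatenate([hf, hb], axis=1)   # shape (T, 2*hidden)

rng = np.random.default_rng(0)
T, d, hidden = 5, 8, 4
xs = rng.normal(size=(T, d))
make = lambda: (rng.normal(size=(4 * hidden, d)) * 0.1,
                rng.normal(size=(4 * hidden, hidden)) * 0.1,
                np.zeros(4 * hidden))
H = bilstm_encode(xs, make(), make(), hidden)
# H holds one 2*hidden feature vector per input word
```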

Embodiment 2

[0104] A method for multi-label text classification based on a multi-step discriminant Co-Attention model according to Embodiment 1, the difference being that step (4), feature combination, as shown in Figure 4, includes a mutual attention operation, a difference operation, and a cascade operation. The hidden-layer state vector h_N output by text feature extraction and the output sequence {w_1, w_2, ..., w_N} are input into the feature fusion module for the mutual attention, difference, and cascade operations. After the mutual attention operation between the output sequence {w_1, w_2, ..., w_N} and the leading-label feature sequence {l_1, l_2, ..., l_M}, two feature vectors with weight information, A_YS and A_SY, are obtained. A_YS represents the information in the original text corresponding to the leading labels; this information has no effect on predicting new labels, so it is deleted, that is, A_YS is removed from h_N by a difference operation, and h_N gets the ...
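The mutual attention, difference, and cascade operations above can be sketched as follows. The dot-product affinity scoring and mean pooling are illustrative assumptions; the patent does not fix these details in the excerpt.

```python
import numpy as np

# Sketch of step (4): co-attention between text features {w_1..w_N} and
# leading-label features {l_1..l_M}, then the difference and cascade
# (concatenation) operations applied to h_N.

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(W_seq, L_seq):
    """Return A_YS (label-attended text info) and A_SY (text-attended label info)."""
    S = W_seq @ L_seq.T                    # (N, M) affinity matrix
    A_YS = softmax(S.T, axis=1) @ W_seq    # (M, d): text info matching each leading label
    A_SY = softmax(S, axis=1) @ L_seq      # (N, d): label info matching each word
    return A_YS, A_SY

def fuse(h_N, A_YS, A_SY):
    """Difference: remove already-predicted information from h_N;
    cascade: concatenate the remainder with pooled text-attended label features."""
    diff = h_N - A_YS.mean(axis=0)         # pooled A_YS deleted from h_N
    return np.concatenate([diff, A_SY.mean(axis=0)])

rng = np.random.default_rng(1)
N, M, d = 6, 2, 4
W_seq, L_seq = rng.normal(size=(N, d)), rng.normal(size=(M, d))
h_N = rng.normal(size=d)
A_YS, A_SY = co_attention(W_seq, L_seq)
v = fuse(h_N, A_YS, A_SY)                  # final encoding vector of size 2*d
```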


Abstract

The present invention relates to a method for multi-label text classification based on a multi-step discriminant Co-Attention model. Based on the algorithm-adaptation approach, a mutual attention mechanism between the original text information and the leading labels lets the leading labels filter information during text encoding, which optimizes the training process, while the attention of the original text content on the leading labels further alleviates the error accumulation caused by a single wrong prediction. For the characteristics of the multi-label text classification task, the invention adopts a strategy of differential fusion and cascaded fusion of feature vectors. The difference operation highlights the original text information on which the labels to be predicted depend and optimizes the supervisory role of label information, yielding a final encoding vector that is both comprehensive and discriminative. Simultaneous modeling of original text information, leading-label information, and to-be-predicted-label information is thereby realized.

Description

Technical field

[0001] The invention relates to a method for multi-label text classification based on a multi-step discriminant Co-Attention model, and belongs to the technical field of text classification.

Background technique

[0002] With the development of artificial intelligence technology represented by deep artificial neural networks, traditional text classification technology performs well and has been widely used in practical applications. To further improve the user experience of text classification tasks, multi-label text classification has gradually come into view, and many researchers have carried out extensive and in-depth exploration in this field.

[0003] In research and application, multi-label classification tasks have many commonalities with, and essential differences from, traditional multi-class tasks. Compared with the single-label text classification task, accor...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F16/35; G06K9/62; G06N3/04; G06N3/08
CPC: G06F16/355; G06N3/049; G06N3/08; G06N3/045; G06F18/2414
Inventor: 李玉军, 马浩洋, 马宝森, 李泽强, 邓媛洁
Owner SHANDONG UNIV