
Multivariate feature fusion Chinese text classification method based on attention neural network

A technology combining feature fusion and neural networks, applied to neural learning methods, biological neural network models, text database clustering/classification, and the like. It addresses the problem that existing approaches do not fully combine the advantages of the three algorithms, and achieves improved classification accuracy and a better ability to recognize Chinese text categories.

Active Publication Date: 2022-03-01
HAINAN NORMAL UNIV

AI Technical Summary

Problems solved by technology

Existing research and applications have shown that LSTM (long short-term memory) networks are suited to learning the long-term dependencies between language units in a sentence, while CNNs (convolutional neural networks) are suited to learning local sentence features; however, current research has not fully combined the advantages of the three algorithms.
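For readers unfamiliar with this distinction, the following minimal PyTorch sketch (not part of the patent) shows how the two layer types behave on the same embedded sentence: a 1-D convolution produces one feature vector per local token window, while an LSTM carries context along the whole sequence. All tensor sizes are arbitrary illustrative values.

```python
# Generic illustration only, not the patented model.
import torch
import torch.nn as nn

batch, seq_len, embed_dim = 2, 50, 128
sentence = torch.randn(batch, seq_len, embed_dim)   # stand-in for word embeddings

conv = nn.Conv1d(embed_dim, 100, kernel_size=3)     # looks at 3-token local windows
local_features = conv(sentence.transpose(1, 2))     # (2, 100, 48): one vector per window

lstm = nn.LSTM(embed_dim, 100, batch_first=True)
context_features, _ = lstm(sentence)                 # (2, 50, 100): each step sees all earlier steps
```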




Embodiment Construction

[0046] Exemplary embodiments of the present invention will be described below with reference to the accompanying drawings. It should be understood that the specific embodiments described here are intended only to explain the invention, not to limit it. In addition, it should be noted that, for convenience of description, the drawings show only some, but not all, of the structures related to the embodiments; some parts in the drawings are omitted, enlarged, or reduced and do not represent the actual product dimensions.

[0047] The corpus used in this example was organized and produced by the Natural Language Processing Group of the International Database Center, Department of Computer and Technology, Fudan University. The main preprocessing workflow is shown in Figure 1. The corpus contains 9,833 Chinese documents divided into 20 categories. 60% of the corp...
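As a rough illustration only: the excerpt is cut off after "60% of the corp...", so the split proportions beyond the 60% figure are unknown. The sketch below assumes a 60/40 train/test split, and uses jieba word segmentation as a stand-in for the unspecified preprocessing of Figure 1; both choices are assumptions, not details from the patent.

```python
# Hypothetical preprocessing/split sketch for the 9,833-document, 20-category corpus.
import jieba
from sklearn.model_selection import train_test_split

def preprocess(documents):
    """Segment each raw Chinese document into space-joined words (assumed step)."""
    return [" ".join(jieba.lcut(doc)) for doc in documents]

def split_corpus(documents, labels):
    # documents: list of raw texts; labels: their 20-way category ids
    texts = preprocess(documents)
    # 60% train / 40% test is an assumption; the excerpt only mentions the 60% figure.
    return train_test_split(texts, labels, train_size=0.6,
                            stratify=labels, random_state=42)
```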



Abstract

The invention discloses a multi-feature fusion Chinese text classification method based on an Attention neural network, belonging to the field of natural language processing. To further improve the accuracy of Chinese text classification, the method fuses three CNN paths to fully mine the characteristics of text data at three different convolution-kernel granularities, and fuses an LSTM path to capture the interconnections within the text data. In particular, by fusing the proposed Attention model, relatively important data features play a greater role in recognizing Chinese text categories, thereby improving the model's ability to identify them. Experimental results show that, under the same experimental conditions, the Chinese text classification accuracy of the proposed model is significantly higher than that of a CNN model, an LSTM model, and their combination, so the method is well suited to Chinese text classification tasks with demanding accuracy requirements.
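The abstract does not give exact hyperparameters, so the following PyTorch sketch is only one plausible reading of the described architecture: three parallel CNN paths with different (assumed) kernel sizes, a bidirectional LSTM path, a simple learned attention that weights the four pooled feature vectors, and a 20-way classifier. The kernel sizes, hidden dimensions, and the exact attention form are assumptions, not the patented design.

```python
# One possible sketch of the multi-feature fusion classifier; hyperparameters assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiFeatureFusionClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_classes=20,
                 kernel_sizes=(3, 4, 5), num_filters=100, lstm_hidden=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Three parallel CNN paths, one per convolution-kernel granularity (sizes assumed)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k, padding=k // 2) for k in kernel_sizes
        )
        # LSTM path for long-range dependencies between language units
        self.lstm = nn.LSTM(embed_dim, lstm_hidden, batch_first=True, bidirectional=True)
        feature_dim = num_filters                      # each path is pooled to a fixed-size vector
        self.lstm_proj = nn.Linear(2 * lstm_hidden, feature_dim)
        # Learned attention weights over the four fused feature vectors (assumed form)
        self.attn = nn.Linear(feature_dim, 1)
        self.classifier = nn.Linear(feature_dim, num_classes)

    def forward(self, token_ids):                      # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)                  # (batch, seq_len, embed_dim)
        conv_in = x.transpose(1, 2)                    # (batch, embed_dim, seq_len)
        # Max-pool each CNN path over time to one feature vector
        paths = [F.relu(conv(conv_in)).max(dim=2).values for conv in self.convs]
        lstm_out, _ = self.lstm(x)                     # (batch, seq_len, 2*lstm_hidden)
        paths.append(self.lstm_proj(lstm_out.max(dim=1).values))
        feats = torch.stack(paths, dim=1)              # (batch, 4, feature_dim)
        weights = torch.softmax(self.attn(feats), dim=1)
        fused = (weights * feats).sum(dim=1)           # attention-weighted feature fusion
        return self.classifier(fused)                  # unnormalized 20-way scores
```

Given a batch of padded token-id tensors of shape (batch, seq_len), the module returns unnormalized scores over the 20 categories, which would then be trained with a standard cross-entropy loss.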

Description

Technical field
[0001] The invention relates to the field of natural language processing, and in particular to a multi-feature fusion Chinese text classification method based on an Attention neural network.
Background technique
[0002] Chinese text classification is an important means of efficiently managing and mining the massive amount of Chinese text information on the Internet, and it is an important research direction in natural language processing. Since the 1990s, many researchers have applied various statistical and machine learning methods to automatic text classification, such as the support vector machine (SVM), the AdaBoost algorithm, the naive Bayes algorithm, the KNN algorithm, and logistic regression. In recent years, with the rapid development of deep learning and various neural network models, text classification methods based on deep learning have attracted close attention and research from academia and industry. Some typical neural network models, such as long short-term memo...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F16/35, G06K9/62, G06N3/04, G06N3/08
CPC: G06F16/35, G06N3/08, G06N3/048, G06F18/25
Inventors: 谢金宝, 侯永进, 殷楠楠, 谢桂芬, 王玉静, 梁新涛
Owner: HAINAN NORMAL UNIV