
Multi-feature-fusion Chinese-text classification method based on Attention neural network

A feature fusion and neural network technology, applied to neural learning methods, biological neural network models, text database clustering/classification, and so on, which addresses the problem that the advantages of the three algorithms have not been fully combined, and achieves improved classification accuracy and improved recognition ability.

Active Publication Date: 2018-08-28
HAINAN NORMAL UNIV
Cites: 5 | Cited by: 76

AI Technical Summary

Problems solved by technology

Existing research and applications have shown that LSTM (long short-term memory network) is suitable for learning long-range dependencies between language units in a sentence, while CNN (convolutional neural network) is suitable for learning a sentence's local features; however, current research has not fully combined the advantages of the three algorithms.
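As a toy illustration (not taken from the patent, with arbitrary dimensions), the following PyTorch snippet contrasts the two feature types on the same embedded sentence: a Conv1d filter only ever sees a local n-gram window, while an LSTM carries hidden state across the entire sequence.

import torch
import torch.nn as nn

batch, seq_len, embed_dim = 4, 50, 128
emb = torch.randn(batch, seq_len, embed_dim)        # an embedded sentence batch

conv = nn.Conv1d(embed_dim, 100, kernel_size=3)     # local 3-gram feature detector
local_feats = conv(emb.transpose(1, 2))             # (4, 100, 48): one activation per 3-word window

lstm = nn.LSTM(embed_dim, 100, batch_first=True)    # long-range dependency model
seq_states, _ = lstm(emb)                           # (4, 50, 100): a hidden state at every position

print(local_feats.shape, seq_states.shape)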




Embodiment Construction

[0046] Exemplary embodiments of the present invention will be described below with reference to the accompanying drawings. It should be understood that the specific embodiments described here are only used to explain the embodiments of the present invention, not to limit them. In addition, it should be noted that, for convenience of description, the drawings show only some rather than all of the structures related to the embodiments of the present invention; some parts in the drawings are omitted, enlarged, or reduced and do not represent the actual product size.

[0047] The corpus used in this example was organized and produced by the Natural Language Processing Group of the International Database Center of the Department of Computer and Technology, Fudan University. The main preprocessing workflow is shown in Figure 1. The corpus contains 9833 Chinese documents divided into 20 categories. Use 60% of the corp...
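A minimal sketch of the split described above, assuming the corpus is held in two hypothetical lists, documents and labels (the patent uses 9833 Fudan-corpus documents in 20 categories), with 60% reserved for training. Word segmentation with jieba is an assumption on my part, since the exact preprocessing steps are truncated here.

import jieba
from sklearn.model_selection import train_test_split

# Placeholder data; in the patent this would be the 9833 Fudan corpus documents
# and their 20 category labels.
documents = ["这是一个示例文档。", "另一个示例文档。"] * 10
labels = [0, 1] * 10

def segment(text):
    # Split a raw Chinese document into space-separated tokens.
    return " ".join(jieba.lcut(text))

segmented = [segment(doc) for doc in documents]
train_x, test_x, train_y, test_y = train_test_split(
    segmented, labels, train_size=0.6, stratify=labels, random_state=42
)
print(len(train_x), len(test_x))   # 60% training, 40% held out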



Abstract

The invention discloses a multi-feature-fusion Chinese-text classification method based on an Attention neural network, and belongs to the field of natural language processing. To further improve the accuracy of Chinese-text classification, the method fuses three CNN paths to fully exploit text features at three different convolution-kernel granularities; fuses an LSTM path to capture the interconnections among the text data; and, in particular, merges a proposed Attention algorithm model so that relatively important data features play a greater role in recognizing Chinese text classes, thereby improving the model's recognition ability. Experimental results show that, compared with a CNN model, an LSTM model, and a combination of the two under the same experimental conditions, the proposed model achieves significantly higher Chinese-text classification accuracy and is better suited to Chinese-text classification applications with high accuracy requirements.
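The sketch below is one possible PyTorch reading of the architecture described in the abstract: three parallel Conv1d paths with different kernel sizes, one LSTM path, and a simple feature-level attention applied to the fused vector before classification. The class name, layer sizes, last-hidden-state pooling, and exact attention formulation are illustrative assumptions, not the patent's parameters.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiFeatureFusionClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_classes=20,
                 kernel_sizes=(2, 3, 4), num_filters=100, lstm_hidden=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Three CNN paths, one per convolution-kernel granularity.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes]
        )
        # One LSTM path for long-range dependencies between language units.
        self.lstm = nn.LSTM(embed_dim, lstm_hidden, batch_first=True)
        fused_dim = num_filters * len(kernel_sizes) + lstm_hidden
        # Feature-level attention: lets relatively important fused features
        # play a greater role before the final classification layer.
        self.attn = nn.Linear(fused_dim, fused_dim)
        self.fc = nn.Linear(fused_dim, num_classes)

    def forward(self, token_ids):                     # token_ids: (batch, seq_len)
        emb = self.embedding(token_ids)               # (batch, seq_len, embed_dim)
        conv_in = emb.transpose(1, 2)                 # Conv1d expects (batch, channels, seq_len)
        conv_feats = [F.relu(c(conv_in)).max(dim=2).values for c in self.convs]
        lstm_out, _ = self.lstm(emb)                  # (batch, seq_len, lstm_hidden)
        fused = torch.cat(conv_feats + [lstm_out[:, -1, :]], dim=1)
        weights = torch.softmax(self.attn(fused), dim=1)
        return self.fc(fused * weights)               # class logits

model = MultiFeatureFusionClassifier(vocab_size=50000)
logits = model(torch.randint(1, 50000, (8, 200)))     # 8 padded sentences of length 200
print(logits.shape)                                   # torch.Size([8, 20])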

Description

Technical Field

[0001] The invention relates to the field of natural language processing, and in particular to a multi-feature-fusion Chinese text classification method based on an Attention neural network.

Background Technique

[0002] Chinese text classification is an important means of efficiently managing and mining the massive amount of Chinese text information on the Internet, and it is an important research direction in natural language processing. Since the 1990s, many researchers have applied various statistical and machine learning methods to automatic text classification, such as the support vector machine (SVM), the AdaBoost algorithm, the naive Bayes algorithm, the KNN algorithm, and logistic regression. In recent years, with the rapid development of deep learning and various neural network models, text classification methods based on deep learning have attracted close attention and research from academia and industry. Some typical neural network models, such as long short-term memo...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F17/30G06K9/62G06N3/04G06N3/08
CPCG06F16/35G06N3/08G06N3/048G06F18/25
Inventor 谢金宝侯永进殷楠楠谢桂芬王玉静梁新涛
Owner HAINAN NORMAL UNIV