A Neural Network Text Classification Method Fused with Multiple Knowledge Graphs

A knowledge graph and neural network technology, applied in the fields of natural language processing and data mining, which addresses problems such as insufficient coverage, noise, and degraded modeling, achieving reliable, accurate and robust classification and improved understanding of text semantics

Active Publication Date: 2021-11-02
FUZHOU UNIV
Cites: 13, Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] At present, there are relatively few studies that enhance the semantic modeling of deep neural networks with the help of knowledge graphs, and these studies are still relatively rough in combining and matching the information in the knowledge graphs with the training-set text. This can easily introduce knowledge graph information that is irrelevant to the training-set text, creating noise and harming the modeling.
In addition, most current research only considers modeling over a single knowledge graph, which may not cover enough of the training-set text. The information in different knowledge graphs can complement each other, so compared with a single knowledge graph, multiple knowledge graphs can cover more of the training-set text content.



Examples


Embodiment Construction

[0054] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0055] Figure 1 is an implementation flowchart of the neural network text classification method fused with multiple knowledge graphs of the present invention. As shown in Figure 1, the method includes the following steps:

[0056] Step A: Input the text in the training set into the long short-term memory network to obtain the context vector of the text. This specifically includes the following steps:

[0057] Step A1: For any text D, perform word segmentation, and use a word embedding tool to convert the words in the text into word vectors. The calculation formula is as follows:

[0058] v=W·v'

[0059] Among them, each word in the text is randomly initialized as a d′-dimensional real-valued vector v′; W is the word embedding matrix, W ∈ R^{d×d′}, which is obtained from a large-scale corpus trained with a neural network...
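
Read literally, Step A1 maps each randomly initialized d′-dimensional vector v′ to a d-dimensional word vector v through the embedding matrix W, and Step A then feeds the resulting word-vector sequence into a long short-term memory network to obtain the context vector. Below is a minimal PyTorch sketch under those assumptions; the concrete dimensions, the single-layer LSTM, and the use of the final hidden state as the context vector are illustrative choices, not details fixed by the patent text.

import torch
import torch.nn as nn

class ContextEncoder(nn.Module):
    """Sketch of Steps A / A1: project word vectors with W (v = W · v')
    and encode the sequence with an LSTM to obtain a context vector."""

    def __init__(self, vocab_size, d_prime=100, d=300, hidden=256):
        super().__init__()
        # v': randomly initialized d'-dimensional vectors, one per word
        self.v_prime = nn.Embedding(vocab_size, d_prime)
        # W in R^{d x d'}: word embedding matrix (pretrained on a large corpus in practice)
        self.W = nn.Linear(d_prime, d, bias=False)
        self.lstm = nn.LSTM(d, hidden, batch_first=True)

    def forward(self, token_ids):              # token_ids: (batch, seq_len)
        v = self.W(self.v_prime(token_ids))    # v = W · v'  -> (batch, seq_len, d)
        outputs, (h_n, _) = self.lstm(v)
        context = h_n[-1]                      # (batch, hidden) context vector
        return outputs, context

# Usage with a toy batch of token ids
encoder = ContextEncoder(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 12))
_, ctx = encoder(tokens)
print(ctx.shape)                               # torch.Size([2, 256])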



Abstract

The invention relates to a neural network text classification method that fuses multiple knowledge graphs, comprising the following steps: input the text in the training set into a long short-term memory (LSTM) network to obtain the context vector of the text; extract entities from each text in the training set and perform entity matching in the knowledge graphs; for each knowledge graph, calculate the attention weights of each matched entity and each relation under the context vector to obtain the overall entity vector and overall relation vector of the text, and from them the fact triple vector; calculate the attention weights of the fact triple vectors of the knowledge graphs to obtain the text representation vector, input it to the fully connected layer of the neural network, and use a classifier to compute the probability that each text belongs to each category in order to train the network; finally, use the trained deep neural network model to predict the category of the text to be classified. This method improves the model's understanding of text semantics and can classify text content more reliably, accurately and robustly.
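
The pipeline summarized above can be illustrated with a small PyTorch sketch. The dot-product attention form, the concatenation of the overall entity and relation vectors into the fact triple vector, and the concatenation of the context vector with the attended knowledge representation before the fully connected layer are assumptions made for illustration; the patent's exact formulas may differ.

import torch
import torch.nn as nn
import torch.nn.functional as F

def attend(context, items, proj):
    """Dot-product attention over `items` (n, d_item) under `context` (d_ctx),
    with a learned projection so the two vector spaces are comparable."""
    scores = items @ proj(context)          # (n,) attention scores
    weights = F.softmax(scores, dim=0)      # attention weights
    return weights @ items                  # weighted sum -> (d_item,)

class MultiKGFusionClassifier(nn.Module):
    """Illustrative reading of the abstract: per knowledge graph, attend over the
    matched entity and relation embeddings under the text's context vector, build
    a fact triple vector, attend over the triple vectors of all graphs, and
    classify with a fully connected layer plus softmax."""

    def __init__(self, d_ctx, d_kg, num_classes):
        super().__init__()
        self.ent_proj = nn.Linear(d_ctx, d_kg, bias=False)
        self.rel_proj = nn.Linear(d_ctx, d_kg, bias=False)
        self.triple_proj = nn.Linear(d_ctx, 2 * d_kg, bias=False)
        self.fc = nn.Linear(d_ctx + 2 * d_kg, num_classes)

    def forward(self, context, kg_entities, kg_relations):
        # kg_entities / kg_relations: lists (one entry per graph) of (n_i, d_kg) tensors
        triples = []
        for ents, rels in zip(kg_entities, kg_relations):
            e = attend(context, ents, self.ent_proj)    # overall entity vector
            r = attend(context, rels, self.rel_proj)    # overall relation vector
            triples.append(torch.cat([e, r]))           # fact triple vector, (2*d_kg,)
        triples = torch.stack(triples)                  # (num_graphs, 2*d_kg)
        kg_repr = attend(context, triples, self.triple_proj)
        text_repr = torch.cat([context, kg_repr])       # text representation vector
        return F.softmax(self.fc(text_repr), dim=-1)    # per-class probabilities

# Usage with toy shapes: two knowledge graphs with 5/3 matched entities and 4/2 relations
model = MultiKGFusionClassifier(d_ctx=256, d_kg=100, num_classes=4)
ctx = torch.randn(256)
ents = [torch.randn(5, 100), torch.randn(3, 100)]
rels = [torch.randn(4, 100), torch.randn(2, 100)]
print(model(ctx, ents, rels))                           # four probabilities summing to 1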

Description

Technical Field

[0001] The invention relates to the technical fields of natural language processing and data mining, and in particular to a neural network text classification method that fuses multiple knowledge graphs.

Background Technique

[0002] Text classification (text categorization) is an important basis for information retrieval and text mining; its main task is to determine the category of a text according to its content, given a predefined set of category labels. Text classification is widely applied in natural language processing and understanding, information organization and management, content filtering, and other fields. In recent years, the research idea of using deep learning to build language models has matured, greatly improving the quality of text features. Some scholars first proposed a sentence classification model based on convolutional neural networks, which extracts features from ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F16/35; G06F16/36; G06K9/62
CPC: G06F18/2413; G06F18/24147
Inventors: 陈羽中, 张伟智, 郭昆, 林剑
Owner: FUZHOU UNIV