
Semi-supervised network representation learning model based on hierarchical attention mechanism

A network representation learning model technology, applied in special data processing applications, unstructured text data retrieval, semantic analysis, and related fields.

Pending Publication Date: 2020-02-11
ELECTRIC POWER SCI & RES INST OF STATE GRID TIANJIN ELECTRIC POWER CO +1

AI Technical Summary

Problems solved by technology

[0009] A search of published patent documents found no published patent document identical to this patent application.




Embodiment Construction

[0087] The present invention will be further described in detail below through specific examples. The following examples are merely descriptive, not restrictive, and do not limit the protection scope of the present invention.

[0088] The present invention mainly adopts theories and methods from natural language processing and network representation learning to perform representation learning on paper citation network data. To ensure that the model can be trained and tested, the computer platform used should be equipped with no less than 8 GB of memory and no fewer than 4 CPU cores, and should have Python 3.6, the TensorFlow framework, and the other necessary programming environments installed.
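As a minimal sketch of the platform check described above (assuming a standard Python environment in which the psutil package is available alongside TensorFlow; psutil is not mentioned in the patent and is used here only for the memory check), the stated requirements can be verified before training:

# Minimal environment check for the stated requirements:
# >= 8 GB of memory, >= 4 CPU cores, Python 3.6+, TensorFlow installed.
import os
import sys

import psutil
import tensorflow as tf

def check_platform():
    mem_gb = psutil.virtual_memory().total / (1024 ** 3)
    cpu_cores = os.cpu_count() or 0
    assert mem_gb >= 8, "at least 8 GB of memory is required"
    assert cpu_cores >= 4, "at least 4 CPU cores are required"
    assert sys.version_info >= (3, 6), "Python 3.6 or later is required"
    print("memory: %.1f GB, cores: %d, tensorflow: %s"
          % (mem_gb, cpu_cores, tf.__version__))

if __name__ == "__main__":
    check_platform()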

[0089] As shown in Figure 2, the semi-supervised network representation learning method based on the hierarchical attention mechanism provided by the present invention includes the following steps, performed in sequence:

[0090] Step 1) Input the te...



Abstract

The invention relates to a semi-supervised network representation learning model based on a hierarchical attention mechanism. The semi-supervised network representation learning model is characterized by comprising the following steps: 1) word-level semantic coding; 2) sentence-level semantic encoding; 3) node text representation; 4) obtaining of a node structure representation vector and a node representation vector; and 5) introduction of a node label under the semi-supervised framework. Based on a hierarchical attention mechanism, the semi-supervised network representation learning model learns text representations of network nodes, introduces node label information under the semi-supervised framework, finally obtains high-quality representation vectors of the nodes, and improves the performance of downstream tasks (node classification and link prediction).
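As an illustration of how the five steps above could fit together, the following is a minimal sketch of a hierarchical attention text encoder combined with a structure embedding and a label-supervised head, written with the tf.keras API. All layer types, dimensions, and the masking of unlabeled nodes are assumptions made for illustration; this is not the exact architecture claimed by the patent, and any unsupervised structure (link) loss is omitted for brevity.

import tensorflow as tf
from tensorflow.keras import layers

class AttentionPool(layers.Layer):
    # Additive attention pooling over the time axis (used at both the
    # word level and the sentence level of the hierarchy).
    def __init__(self, units):
        super().__init__()
        self.proj = layers.Dense(units, activation="tanh")
        self.score = layers.Dense(1, use_bias=False)

    def call(self, x):                                  # x: (batch, steps, dim)
        weights = tf.nn.softmax(self.score(self.proj(x)), axis=1)
        return tf.reduce_sum(weights * x, axis=1)       # (batch, dim)

def build_model(vocab_size, num_nodes, num_classes,
                max_sents=10, max_words=30, emb_dim=128, hid=64):
    # Steps 1)-2): word-level then sentence-level semantic encoding with attention.
    words_in = tf.keras.Input(shape=(max_words,), dtype="int32")
    w = layers.Embedding(vocab_size, emb_dim)(words_in)
    w = layers.Bidirectional(layers.GRU(hid, return_sequences=True))(w)
    sent_encoder = tf.keras.Model(words_in, AttentionPool(hid)(w))

    doc_in = tf.keras.Input(shape=(max_sents, max_words), dtype="int32")
    s = layers.TimeDistributed(sent_encoder)(doc_in)
    s = layers.Bidirectional(layers.GRU(hid, return_sequences=True))(s)
    text_vec = AttentionPool(hid)(s)                    # step 3): node text representation

    # Step 4): node structure representation from a lookup table, fused with the text vector.
    node_in = tf.keras.Input(shape=(), dtype="int32")
    struct_vec = layers.Embedding(num_nodes, 2 * hid)(node_in)
    node_vec = layers.Concatenate()([text_vec, struct_vec])

    # Step 5): node-label supervision; unlabeled nodes can be excluded at fit()
    # time by giving them sample_weight=0 (semi-supervised masking).
    logits = layers.Dense(num_classes)(node_vec)
    model = tf.keras.Model([doc_in, node_in], logits)
    model.compile(optimizer="adam",
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  weighted_metrics=["accuracy"])
    return model

At training time, each example would pair a node's padded sentence/word index tensor with its node id; labels of unlabeled nodes carry zero sample weight so that only labeled nodes contribute to the classification loss.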

Description

Technical Field

[0001] The invention belongs to the technical field of computer applications and relates to a semi-supervised network representation learning model, in particular to a semi-supervised network representation learning model based on a hierarchical attention mechanism.

Background Technique

[0002] Networks are an efficient way to organize different kinds of information in the real world. With the development of information technology, a large amount of data with network structure has accumulated on the Internet. The analysis of these network-structured data is of great significance to the development of various industries. The primary task of analyzing network-structured data is to comprehensively utilize the feature information in the network to represent each network node as a low-dimensional, dense vector, a process also known as network representation learning. After the representation vectors of network nodes are obtained, they can be input into existing algorithms to complete...
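For example (a hedged sketch, assuming scikit-learn is available; the embedding matrix and labels below are random placeholders rather than data from the invention), learned node representation vectors can be fed directly into an existing classifier to perform node classification:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Placeholder inputs: one learned representation vector per network node,
# plus one class label per node (values here are random for illustration).
embeddings = np.random.rand(2708, 256)
labels = np.random.randint(0, 7, size=2708)

X_tr, X_te, y_tr, y_te = train_test_split(embeddings, labels,
                                          test_size=0.5, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("micro-F1 on held-out nodes:",
      f1_score(y_te, clf.predict(X_te), average="micro"))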


Application Information

IPC (8): G06F16/33; G06F16/35; G06F40/30
CPC: G06F16/3344; G06F16/35
Inventors: 姚宗强, 崇志强, 刘杰, 徐福华, 周作静, 马世乾, 杨晓静, 郭悦, 尚学军, 王伟臣, 邓君怡, 李国栋, 霍现旭, 王旭东, 黄志刚, 吕金炳, 张文政, 张津沛, 苏立伟
Owner ELECTRIC POWER SCI & RES INST OF STATE GRID TIANJIN ELECTRIC POWER CO