
A semi-supervised network representation learning algorithm based on deep compression self-encoder

A self-encoder and network representation technology, applied in the field of network representation learning, which addresses the limitations of existing algorithms and achieves the effect of improving generalization ability.

Inactive Publication Date: 2019-01-08
SOUTHEAST UNIV
Cited by: 14

AI Technical Summary

Problems solved by technology

[0004] In order to overcome the deficiencies in the prior art, the present invention provides a semi-supervised network representation learning algorithm based on a deep compression autoencoder, which addresses the limitations of existing network representation learning algorithms.

Method used



Examples


Embodiment 1

[0053] This embodiment runs on a 64-bit Windows 10 platform. The CPU is an Intel E3 1231V3 with 32 GB of RAM, and the GPU is an NVIDIA GTX 970 with 3.5 GB of usable memory. All algorithms are written in Python. The basic configuration is shown in Table 2:

[0054] Table 2 Experimental environment configuration

[0055]

[0056] The data set used is shown in Table 3:

[0057] Table 3 Data set

[0058]

[0059] In this embodiment, the test set occupies 10% to 90% of the total data set. After the node representation vectors of the network are obtained, these vectors are used as input data, and multi-class logistic regression is used to classify the nodes. The evaluation metric is the micro-averaged F1 score (Micro-F1); each reported result is the average of 10 classification runs, and each baseline value is the best result reported in the corresponding paper. The baseline algorithms used for comparison are as follows:

[0060] ●De...
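The evaluation protocol described in [0059] can be sketched as follows. This is a minimal illustration, not the patent's code: the random embeddings and labels are placeholders standing in for learned node vectors and ground-truth classes.

```python
# Sketch of the evaluation in [0059]: multi-class logistic regression on
# node embeddings, Micro-F1 averaged over 10 random splits, at several
# test-set ratios. Embeddings/labels below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(200, 16))   # placeholder node representation vectors
labels = rng.integers(0, 4, size=200)     # placeholder class labels

def micro_f1_over_runs(X, y, test_ratio, runs=10):
    """Average Micro-F1 of multi-class logistic regression over `runs` splits."""
    scores = []
    for seed in range(runs):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_ratio, random_state=seed, stratify=y)
        clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        scores.append(f1_score(y_te, clf.predict(X_te), average="micro"))
    return float(np.mean(scores))

for ratio in (0.1, 0.5, 0.9):
    print(f"test ratio {ratio:.0%}: Micro-F1 = "
          f"{micro_f1_over_runs(embeddings, labels, ratio):.3f}")
```

With real embeddings, the same loop sweeps the test-set ratio from 10% to 90% as in the experiment.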



Abstract

The invention discloses LSDNE (Labeled Structural Deep Network Embedding), a semi-supervised network representation learning algorithm based on a deep compression self-encoder. The method comprises the following steps: building the model; pre-training on the input data with a deep belief network (DBN) to obtain initial values of the model parameters, taking the adjacency matrix and Laplacian matrix of the network as inputs; encoding the network with a deep compression self-encoder to obtain the global structure of each node; applying Laplacian eigenmaps to obtain the local structural features of the nodes; using an SVM classifier to classify the nodes with known labels and optimize the whole model; and using the Adam optimizer to obtain the final node representations. The invention can simultaneously use the structural information of the network and the label information of the nodes for network representation learning, and because a deep learning model is used, the learned node representations perform better on label classification tasks than existing algorithms. The deep compression self-encoder also reduces over-fitting, giving the model better generalization performance.
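The joint objective outlined in the abstract can be sketched in a few lines. This is a highly simplified PyTorch illustration under stated assumptions: the layer sizes, the loss weights `alpha` and `beta`, and the softmax classifier (standing in for the patent's SVM component) are all illustrative choices, not the patent's exact model.

```python
# Simplified sketch of an LSDNE-style objective: autoencoder reconstruction
# (global structure) + Laplacian-eigenmap term (local structure) + a
# supervised loss on labeled nodes, jointly optimized with Adam.
# All sizes, weights, and the toy graph are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
n, d, k, c = 30, 8, 4, 3                       # nodes, hidden dim, embed dim, classes
A = (torch.rand(n, n) < 0.2).float()           # toy adjacency matrix
A = ((A + A.T) > 0).float(); A.fill_diagonal_(0)
D = torch.diag(A.sum(1)); L = D - A            # unnormalized graph Laplacian
y = torch.randint(0, c, (n,))                  # toy labels
labeled = torch.arange(n // 2)                 # semi-supervised: half the nodes labeled

enc = nn.Sequential(nn.Linear(n, d), nn.ReLU(), nn.Linear(d, k))
dec = nn.Sequential(nn.Linear(k, d), nn.ReLU(), nn.Linear(d, n))
clf = nn.Linear(k, c)                          # stand-in for the SVM component
opt = torch.optim.Adam([*enc.parameters(), *dec.parameters(),
                        *clf.parameters()], lr=1e-2)

alpha, beta = 0.1, 1.0                         # hypothetical loss weights
for step in range(200):
    opt.zero_grad()
    Z = enc(A)                                 # encode adjacency rows -> embeddings
    recon = ((dec(Z) - A) ** 2).mean()         # global structure: reconstruction loss
    local = torch.trace(Z.T @ L @ Z) / n       # local structure: tr(Z^T L Z)
    sup = nn.functional.cross_entropy(clf(Z[labeled]), y[labeled])
    loss = recon + alpha * local + beta * sup
    loss.backward()
    opt.step()
```

The patent additionally pre-trains the parameters with a DBN, which is omitted here; the sketch only shows how the three loss terms can share one set of embeddings `Z`.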

Description

Technical field [0001] The present invention relates to network representation learning, in particular to LSDNE (Labeled Structural Deep Network Embedding), a semi-supervised network representation learning algorithm that uses label data and is based on a deep learning model. Background technique [0002] Network representation learning represents a high-dimensional network in a low-dimensional vector space, aiming to obtain and retain the structural information of the network. The structure of a network can be applied to various tasks, such as community discovery, edge prediction, and node classification. However, traditional network analysis techniques have limitations. First, the complexity of traditional methods is too high for them to be applied to modern large-scale networks; second, in modern large-scale networks, the link relationships between nodes are complicated and difficult to observe, which makes it difficult for traditional analysis methods to give a solution to q...
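The two matrices that the model takes as inputs can be made concrete on a toy network. This is a small self-contained illustration (the 4-node graph is made up), showing the adjacency matrix A and the unnormalized Laplacian L = D - A that the Laplacian-eigenmap step operates on.

```python
# Build the two model inputs for a toy 4-node undirected network:
# the adjacency matrix A and the unnormalized graph Laplacian L = D - A.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # toy undirected edge list
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1                          # symmetric: undirected graph
D = np.diag(A.sum(axis=1))                         # degree matrix
L = D - A                                          # graph Laplacian (rows sum to 0)
print(L)
```

Every row of L sums to zero, and L is positive semi-definite, which is what makes the Laplacian-eigenmap term tr(Z^T L Z) a valid smoothness penalty on the embeddings.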

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N99/00, G06K9/62
CPC: G06F18/2411, G06F18/214
Inventor: 何洁月, 武文茂
Owner: SOUTHEAST UNIV