
Dynamic network embedded link prediction method based on variational auto-encoder

A variational autoencoder and dynamic network technology, applied to neural learning methods, biological neural network models, neural architectures, etc.; it addresses the problems that existing methods do not explain the essence of network evolution, ignore network evolution, and have narrow application scenarios.

Active Publication Date: 2021-03-05
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0004] However, in the real world, networks often evolve dynamically. Methods that focus only on a single static snapshot tend to ignore this evolution, which narrows their application scenarios.
Based on this, some representation learning methods for dynamic networks pay closer attention to network evolution, so their classification and link prediction performance in dynamic scenarios surpasses that of traditional static methods. These dynamic methods remedy the shortcomings of static methods in describing network change, but they still do not explain the essential mechanism of network evolution, nor do they uncover the essential connection between the original network and the hidden vectors.

Method used



Embodiment Construction

[0032] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention.

[0033] Referring to Figures 1-2, a dynamic network embedding link prediction method based on a variational autoencoder includes the following steps:

[0034] S1: Obtain academic network or social network data streams over a period of time, and perform data preprocessing at a certain fine-grained time granularity;

[0035] Specifically, the network over the period is first divided into several slices, so that the network can be expressed as G = {G^1, G^2, ..., G^T}, where G is the collection of all network slices and T is the number of slices. The network topology of each slice forms an undirected graph, in which the nodes of the graph are the individual users of the network...
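The slicing in step S1 can be sketched as follows. This is an illustrative preprocessing sketch, not the patent's exact procedure; the function name, the uniform-width time windows, and the toy edge stream are all assumptions:

```python
# Sketch of step S1: split a timestamped edge stream into T network slices
# and build an undirected adjacency matrix for each slice G^t of
# G = {G^1, ..., G^T}. Uniform window widths are an assumption.
def slice_network(edges, t_start, t_end, num_slices, num_nodes):
    """edges: iterable of (u, v, timestamp); returns T adjacency matrices."""
    width = (t_end - t_start) / num_slices
    slices = [[[0] * num_nodes for _ in range(num_nodes)]
              for _ in range(num_slices)]
    for u, v, ts in edges:
        if not (t_start <= ts < t_end):
            continue  # drop interactions outside the observed period
        t = min(int((ts - t_start) // width), num_slices - 1)
        slices[t][u][v] = slices[t][v][u] = 1  # undirected graph
    return slices

# toy stream: 3 users, two time slices over the interval [0, 1)
edges = [(0, 1, 0.1), (1, 2, 0.4), (0, 2, 0.7)]
A = slice_network(edges, 0.0, 1.0, num_slices=2, num_nodes=3)
```

Each matrix in `A` then serves as the input network data for one time slice in step S2.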



Abstract

The invention discloses a dynamic network embedding link prediction method based on a variational autoencoder, comprising the following steps: S1, obtaining academic network or social network data streams over a period of time and performing data preprocessing at a certain fine time granularity; S2, establishing an encoding-decoding framework with a variational autoencoder and encoding the network data of each time slice to obtain a low-dimensional vector; and S3, applying a self-attention constraint to the low-dimensional vector output for each node, using the neighborhoods between nodes in the current network. By mining the latent relationship between the original network and the hidden vectors, the method uncovers the core mechanism of dynamic network evolution, effectively utilizes the spatial and temporal information of the network, overcomes the shortcomings of traditional static methods, and constrains the intermediate hidden vectors generated by the autoencoder.
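The encoding in step S2 can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the patent's trained model: the linear encoder weights, the latent dimension `d`, and the inner-product decoder (in the style of variational graph autoencoders) are all illustrative choices:

```python
# Minimal sketch of step S2: a variational autoencoder maps each node's
# adjacency row to a low-dimensional latent vector z = mu + sigma * eps
# (the reparameterization trick), and a decoder scores links via inner
# products. Weights here are random, untrained placeholders.
import numpy as np

rng = np.random.default_rng(0)

def encode(A, W_mu, W_logvar):
    mu = A @ W_mu                        # mean of q(z | A)
    logvar = A @ W_logvar                # log-variance of q(z | A)
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps  # reparameterization trick
    return z, mu, logvar

def decode(z):
    # predicted link probability for every node pair
    return 1.0 / (1.0 + np.exp(-(z @ z.T)))

A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])  # one slice G^t
d = 2                                    # latent dimension (assumed)
W_mu = rng.standard_normal((3, d)) * 0.1
W_logvar = rng.standard_normal((3, d)) * 0.1
z, mu, logvar = encode(A, W_mu, W_logvar)
A_hat = decode(z)                        # scores used for link prediction
```

In a full implementation the weights would be trained by maximizing the evidence lower bound (reconstruction term plus KL divergence), and the resulting `z` vectors would feed the self-attention constraint of step S3.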

Description

Technical field

[0001] The invention relates to the technical field of deep learning model methods, and in particular to a dynamic network embedding link prediction method based on a variational autoencoder.

Background technique

[0002] Complex systems can be represented as networks, such as protein networks, social networks, communication networks, and co-author networks. The subtasks of network analysis usually include node classification, community discovery, link prediction, and recommendation systems. Network representation learning, or network embedding, reduces the complexity of the original network and improves the efficiency of downstream tasks by mapping the high-dimensional original network into a low-dimensional space, while representation learning on a dynamic network can use the network's historical information to predict its evolution at the next moment.

[0003] Representation learning for complex networks can map a high-dimensio...
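The self-attention constraint of step S3 can likewise be sketched. This is an illustration of one plausible formulation (scaled dot-product attention masked to each node's neighborhood), not the patent's exact constraint; the function name and the toy embeddings are assumptions:

```python
# Sketch of step S3: scaled dot-product self-attention restricted to each
# node's neighborhood, so a node's low-dimensional vector is constrained
# by the vectors of its neighbors in the current network slice.
import numpy as np

def neighborhood_attention(Z, A):
    """Z: (n, d) node embeddings; A: (n, n) adjacency of the current slice."""
    n, d = Z.shape
    scores = (Z @ Z.T) / np.sqrt(d)          # attention logits
    mask = (A + np.eye(n)) > 0               # attend only to self + neighbors
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ Z                       # neighborhood-constrained vectors

A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])  # current slice
Z = np.array([[1., 0.], [0., 1.], [1., 1.]])              # toy embeddings
Z_att = neighborhood_attention(Z, A)
```

Masking non-neighbors to negative infinity before the softmax zeroes their attention weights, so each output vector is a convex combination of the node's own embedding and those of its neighbors.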

Claims


Application Information

IPC(8): G06N3/08; G06N3/04
CPC: G06N3/082; G06N3/049; G06N3/044; G06N3/045
Inventor: 荆鑫
Owner: TIANJIN UNIV