
A multi-modal deep network embedding method for fusing structure and attribute information

A technology that fuses attribute information with a deep network model, applied in the field of complex network analysis, addressing problems such as the lack of a clear objective guaranteeing which network structures and properties existing methods preserve.

Status: Inactive · Publication Date: 2019-02-22
SHANGHAI JIAO TONG UNIV
Cites: 0 · Cited by: 12

AI Technical Summary

Problems solved by technology

The deep models used in such methods have strong representation learning ability and can capture the highly nonlinear structure of a network. However, the underlying assumption, that a node's neighborhood structure can be described by sampling its neighbors through random walks, lacks a clear objective and therefore provides no guarantee of which network structures and properties are actually preserved as similar in the embedding space.
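For context, the random-walk neighbor sampling criticized above (as used by DeepWalk-style prior methods) can be illustrated roughly as follows. This is only a sketch; the graph, walk length, and sampling routine are illustrative assumptions, not part of the invention.

```python
import random
import networkx as nx

def random_walk(graph: nx.Graph, start, walk_length: int = 10, seed: int = 0):
    """Sample a fixed-length random walk from `start`; DeepWalk-style
    embedding methods treat such walks as the node's 'neighborhood'."""
    rng = random.Random(seed)
    walk = [start]
    for _ in range(walk_length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        if not neighbors:
            break
        walk.append(rng.choice(neighbors))
    return walk

# Toy usage: exactly which structural properties such walks preserve is
# the point the passage above says these methods leave unclear.
G = nx.karate_club_graph()
print(random_walk(G, start=0))
```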



Examples


Embodiment Construction

[0102] The following describes several preferred embodiments of the present invention with reference to the accompanying drawings, so as to make the technical content clearer and easier to understand. The present invention can be embodied in many different forms of embodiments, and the protection scope of the present invention is not limited to the embodiments mentioned herein.

[0103] The invention provides a multi-modal deep network embedding method for fusing structure and attribute information, comprising the following steps:

[0104] Step 1: let t denote the iteration index, with initial value t = 0;

[0105] Step 2: perform preprocessing calculations on the original structure information and attribute information of each node i to obtain the high-order structural feature y_i^s(1) and the attribute feature y_i^a(1) (an illustrative sketch follows these steps);

[0106] Specifically, step 2 includes:

[0107] Step 2.1. Establish an adjacency matrix describing the original structure information of the network in the ...
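As referenced in Step 2 above, the following is a minimal sketch of one possible preprocessing. The patent text visible here is truncated, so the exact computation is not confirmed; the choice of an accumulated k-step transition matrix as the "high-order structural feature" and a row-normalized attribute matrix as the "attribute feature" are assumptions for illustration only.

```python
import numpy as np

def preprocess(adj: np.ndarray, attrs: np.ndarray, order: int = 2):
    """Illustrative preprocessing for Step 2 (assumed, not confirmed):
    y_i^s(1) is row i of an accumulated 1..k-step transition matrix,
    y_i^a(1) is the row-normalized attribute vector of node i."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    trans = adj / deg                       # 1-step transition probabilities
    high_order = np.zeros_like(trans)
    power = np.eye(adj.shape[0])
    for _ in range(order):                  # accumulate 1..order step proximities
        power = power @ trans
        high_order += power
    attr_norm = attrs / np.maximum(attrs.sum(axis=1, keepdims=True), 1e-12)
    return high_order, attr_norm

# Toy usage on a 3-node path graph with 2 binary attributes per node
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.array([[1, 0], [1, 1], [0, 1]], dtype=float)
Y_s, Y_a = preprocess(A, X)
print(Y_s.shape, Y_a.shape)
```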



Abstract

The invention discloses a multi-modal deep network embedding method for fusing structure and attribute information, and relates to the technical field of complex network analysis. The method comprises the steps of establishing a network adjacency matrix and an attribute matrix, performing preprocessing, serially inputting structural features and attribute features into an encoder and a decoder, outputting a reconstructed adjacency matrix and attribute matrix, and iteratively updating the parameters, finally taking the output of the encoder as the final node representation. Based on deep learning, the invention overcomes the shortcoming that existing shallow linear methods have difficulty describing the highly nonlinear structure of a network; it maps the nodes of the network into a low-dimensional embedding space while effectively preserving the structural and attribute characteristics of the nodes.
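To make the abstract's pipeline concrete, below is a rough sketch of an autoencoder that reconstructs both the adjacency matrix and the attribute matrix and keeps the encoder output as the node representation. The layer sizes, the concatenation-based fusion, and the equal weighting of the two reconstruction losses are assumptions for illustration, not the patented design.

```python
import torch
import torch.nn as nn

class MultiModalAutoencoder(nn.Module):
    """Sketch of a fused structure+attribute autoencoder (assumed design)."""

    def __init__(self, n_nodes: int, n_attrs: int, embed_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_nodes + n_attrs, 256), nn.ReLU(),
            nn.Linear(256, embed_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(embed_dim, 256), nn.ReLU(),
            nn.Linear(256, n_nodes + n_attrs),
        )
        self.n_nodes = n_nodes

    def forward(self, struct_feat, attr_feat):
        z = self.encoder(torch.cat([struct_feat, attr_feat], dim=1))  # node embedding
        recon = self.decoder(z)
        return z, recon[:, :self.n_nodes], recon[:, self.n_nodes:]

# Toy training step: reconstruct both matrices, keep the encoder output
# as the final node representation once training converges.
n, d = 50, 10
A, X = torch.rand(n, n), torch.rand(n, d)
model = MultiModalAutoencoder(n, d)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
z, rec_A, rec_X = model(A, X)
loss = nn.functional.mse_loss(rec_A, A) + nn.functional.mse_loss(rec_X, X)
loss.backward()
opt.step()
print(z.shape)  # (50, 64) node embeddings
```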

Description

Technical Field

[0001] The invention relates to the technical field of complex network analysis, in particular to a multi-modal deep network embedding method that fuses structure and attribute information.

Background Technique

[0002] Network embedding, also known as network representation learning, represents network data in a form suitable as input to machine learning algorithms and is crucial to many complex network analysis tasks, such as node label classification and link prediction. Today's real networks are large in scale. Network embedding methods use low-dimensional vectors to represent the nodes of a network while maintaining their original characteristics, which effectively reduces the storage space of the network and the computational complexity of subsequent network analysis tasks.

[0003] According to a literature search of the prior art, most network embedding methods can be divided into network embedding methods based on stru...
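As the background notes, the learned node vectors serve as input to downstream machine learning tasks such as node label classification. The short sketch below shows that usage pattern with synthetic embeddings and labels; in practice the embeddings would come from a network embedding method such as the one described here, and the classifier choice is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical embeddings and labels, used only to illustrate the workflow.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(200, 64))   # 200 nodes, 64-dim vectors
labels = rng.integers(0, 3, size=200)     # 3 node classes

X_tr, X_te, y_tr, y_te = train_test_split(
    embeddings, labels, test_size=0.3, random_state=0
)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("node-classification accuracy:", clf.score(X_te, y_te))
```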


Application Information

IPC(8): G06N3/08
CPC: G06N3/082
Inventor: 潘理, 郑聪惠, 吴鹏
Owner: SHANGHAI JIAO TONG UNIV