
Node classification method, model training method and device

A node classification and model training technology in the Internet field, addressing the problems of high computational cost and heavy computing-resource consumption, and achieving the effect of reducing computational overhead and saving computing resources.

Pending Publication Date: 2019-03-12
TENCENT TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

However, in order to update the feature information of each node, GCN must traverse all neighbor nodes of every node in a single feature computation, resulting in high computational cost and excessive consumption of computing resources.
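To illustrate the cost described above, the following is a minimal sketch (not the patent's own code) of a single full-graph GCN propagation step, H' = ReLU(Â H W). The NumPy implementation, the variable names, and the symmetric normalization are assumptions for illustration only.

```python
# Minimal sketch of one full-graph GCN layer, illustrating why every node's
# update must visit all of its neighbours. Names and normalisation are assumed.
import numpy as np

def gcn_layer(adjacency: np.ndarray, features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """One propagation step H' = ReLU(A_hat @ H @ W) over the whole graph."""
    n = adjacency.shape[0]
    a_hat = adjacency + np.eye(n)                 # add self-loops
    deg = a_hat.sum(axis=1)                       # node degrees
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))      # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt      # symmetric normalisation
    # The product below aggregates over every neighbour of every node,
    # so one feature update scales with the full edge set of the graph.
    return np.maximum(a_norm @ features @ weights, 0.0)
```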




Embodiment Construction

[0058] Embodiments of the present invention provide a node classification method, a model training method, and a device. For a large-scale graph, training can be performed based on only a part of the nodes: each iteration computes only a subset of the nodes in the graph rather than traversing every node, which greatly reduces computational overhead and saves computing resources.

[0059] The terms "first", "second", "third", "fourth", etc. (if any) in the description, claims and drawings of the present invention are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It is to be understood that data so used are interchangeable under appropriate circumstances, such that the embodiments of the invention described herein can, for example, be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprising" and "having", as well as any variations thereof...



Abstract

The invention discloses a model training method comprising the following steps: acquiring a node subset to be trained from a node set to be trained, the number of nodes in the node subset being smaller than the number of nodes in the node set; determining a node feature-vector subset according to the node subset to be trained; determining a predicted class probability subset according to the target node subset and the node feature-vector subset, the predicted class probabilities corresponding to the target nodes; determining a second model parameter according to the predicted class probability subset and the first model parameter; and training a node classification model according to the second model parameter. The invention also discloses a node classification method and a server. For a large-scale graph, the invention can train based on only a part of the nodes, computing only a part of the nodes in the graph in each iteration without traversing every node in the graph, thereby greatly reducing computational overhead and saving computing resources.
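The following is a minimal sketch of one training iteration following the steps listed above, assuming a plain softmax classifier over per-node feature vectors; the sampler, batch size, learning rate, and function names are illustrative assumptions rather than the patent's concrete implementation.

```python
# Sketch of one iteration: node subset -> feature-vector subset ->
# predicted class probabilities -> second (updated) model parameters.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_step(node_ids, features, labels, first_params, batch_size=256, lr=0.01):
    # 1. Acquire a node subset to be trained (smaller than the full node set).
    subset = rng.choice(node_ids, size=min(batch_size, len(node_ids)), replace=False)
    # 2. Determine the node feature-vector subset for those target nodes only.
    x = features[subset]                       # (batch, dim)
    y = labels[subset]                         # (batch,) integer class ids
    # 3. Predicted class probabilities, one row per target node.
    probs = softmax(x @ first_params)          # (batch, num_classes)
    # 4. Second model parameters from the prediction subset and the first
    #    parameters (plain cross-entropy gradient step, as an illustration).
    grad = x.T @ (probs - np.eye(probs.shape[1])[y]) / len(subset)
    second_params = first_params - lr * grad
    return second_params
```

Repeating such iterations over freshly sampled node subsets trains the node classification model without ever traversing all nodes of the graph in a single update.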

Description

Technical Field [0001] The present invention relates to the field of Internet technology, and in particular to a node classification method, a model training method and a device. Background [0002] With the development of machine learning technology, image classification tasks have made remarkable progress, and related techniques have been applied to many scenarios such as autonomous driving, security, and games. Similar to images, graphs are also a common type of data, for example social networks, knowledge graphs, and drug molecular structures. Unlike images, there is still relatively little research on using machine learning methods to classify graph nodes. [0003] At present, for large-scale graphs, the integration of graph node information requires dedicated iterative computation. See figure 1, a schematic diagram of the graph convolutional network (GCN) classification model for graph node classification...

Claims


Application Information

IPC(8): G06K9/62
CPC: G06F18/2415; G06F18/214; G06F18/213; G06N3/08; G06N3/045; G06F18/2323; G06F18/24143; Y02D30/70; G06F18/24147; G06N7/01
Inventors: Wenbing Huang (黄文炳), Yu Rong (荣钰), Junzhou Huang (黄俊洲)
Owner: TENCENT TECH (SHENZHEN) CO LTD