
A Gradient-Based Graph Adversarial Example Generation Method by Adding False Nodes for Document Classification

A technique for adversarial examples and document classification, applied in the field of artificial-intelligence information security, addressing problems such as attacks that are difficult to carry out in practice, permissions that are difficult to obtain, and misleading the classification results of target nodes.

Active Publication Date: 2021-06-29
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

In the graph-data domain, current research misleads the classification results of target nodes by adding or deleting existing edges or by modifying node features.
But this approach may be difficult to implement in real scenarios. For example, in a social network, deleting or adding an edge between two users may require the login permissions of those users, which are hard to obtain in practice.



Embodiment Construction

[0027] The present invention is described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to aid understanding of the invention and impose no limitation on it.

[0028] The overall flow of the method of the present invention is shown in Figure 1.

[0029] Given graph data (A, X) with Y classes of labels in total, and a well-trained graph node classification model M, first input the graph data into model M and compute the classification result for each node. The correctly classified nodes constitute the target node set V. For each node v in V, every possible attack target label y is assigned (a target label is any incorrect class label) to form an attack target (v, y); these pairs constitute the attack target set O, with |O| = (Y-1) * |V|, where |·| denotes the size of a set. For example, for 3-class graph data, the size of the attack target set is twice the size of the target node set.
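As a minimal sketch of this step (function and variable names are hypothetical, not from the patent), the attack target set can be built from the model's predictions as follows:

```python
def build_attack_targets(predictions, labels, num_classes):
    """Build the attack target set O.

    predictions, labels: per-node predicted and true class indices.
    Correctly classified nodes form the target node set V; each node in V
    is paired with every incorrect label, so |O| = (num_classes - 1) * |V|.
    """
    target_nodes = [i for i, (p, y) in enumerate(zip(predictions, labels))
                    if p == y]                      # correctly classified nodes
    return [(v, y) for v in target_nodes
            for y in range(num_classes) if y != labels[v]]

# 3-class toy example: all four nodes correctly classified,
# so |O| = (3 - 1) * 4 = 8, twice the size of V.
O = build_attack_targets([0, 1, 2, 0], [0, 1, 2, 0], 3)
```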



Abstract

The invention discloses a gradient-based method for generating graph adversarial examples by adding fake nodes, including: (1) obtain the original graph data and the graph node classification model, and construct an attack target set; add fake nodes to the original graph data to obtain an adversarial example; (2) select an attack target (v*, y*) from the attack target set; (3) input the current adversarial example into the classification model and compute the loss with the loss function; (4) compute the gradient of the loss with respect to the input adjacency matrix, and in the row corresponding to the fake node select the node corresponding to the element with the largest gradient value, connect it to the added fake node, and obtain a new adversarial example; (5) input the new adversarial example into the classification model; if the classification result is y*, the new adversarial example is the generated adversarial example; otherwise, jump back to step (3). With the present invention, the generated adversarial examples can effectively affect the classification results of graph deep learning models and are more feasible in practical applications.
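Steps (3) through (5) above can be sketched as an iterative loop. The sketch below is an assumption-laden illustration, not the patent's implementation: `model_predict` and `loss_grad_A` stand in for the trained classification model M and its loss gradient (in practice these would come from an autodiff framework), and the toy model at the bottom is purely for demonstration.

```python
import numpy as np

def add_fake_node_attack(A, X, fake_idx, target, model_predict, loss_grad_A,
                         max_steps=20):
    """Connect a fake node (row `fake_idx` of A) edge by edge until the
    attacked node v is classified as the target label y.

    model_predict(A, X) -> per-node predicted labels (stand-in for model M)
    loss_grad_A(A, X, v, y) -> gradient of the attack loss w.r.t. A
    """
    A = A.copy()
    v, y = target
    for _ in range(max_steps):
        if model_predict(A, X)[v] == y:      # step (5): success check
            return A
        grad = loss_grad_A(A, X, v, y)       # steps (3)-(4): loss gradient
        row = grad[fake_idx].copy()          # row of the fake node
        row[fake_idx] = -np.inf              # forbid a self-loop
        row[A[fake_idx] > 0] = -np.inf       # skip already-connected nodes
        j = int(np.argmax(row))              # largest-gradient candidate
        A[fake_idx, j] = A[j, fake_idx] = 1  # step (4): connect fake node
    return A

# Toy sanity check (hypothetical 4-node graph, node 3 is the fake node):
# the stand-in model labels a node with class 1 once the fake node links to it.
A0 = np.zeros((4, 4), dtype=int)
X0 = np.eye(4)

def predict(A, X):
    return [1 if A[3, i] else 0 for i in range(4)]

def grad_fn(A, X, v, y):
    g = np.zeros_like(A, dtype=float)
    g[3, v] = 1.0                            # steepest entry: edge (fake, v)
    return g

A_adv = add_fake_node_attack(A0, X0, fake_idx=3, target=(0, 1),
                             model_predict=predict, loss_grad_A=grad_fn)
```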

Description

Technical field

[0001] The present invention belongs to the field of artificial-intelligence information security technology, and in particular relates to a gradient-based graph adversarial example generation method that adds fake nodes.

Background technique

[0002] A graph is a structure constructed from a number of given points and lines connecting pairs of points, and is often used to describe a specific relationship between certain things: a point represents a thing, and a line between two points indicates a relationship between the two things. A graph G is an ordered pair (V, E), where V is called the vertex set, i.e., the set of all vertices in the graph, and E is called the edge set, i.e., the set of all edges between vertices. Simply put, a vertex represents a thing, and an edge represents a relationship between things. In addition, an attribute graph may be represented as an ordered pair (A, X), where A is referred to as the adjac...
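As a minimal illustration of the attribute-graph representation (A, X) described above (toy values, not taken from the patent): A is the adjacency matrix and X holds one feature row per vertex.

```python
import numpy as np

# Toy attribute graph: 3 vertices, undirected edges (0,1) and (1,2),
# and a 2-dimensional feature vector per vertex.
A = np.array([[0, 1, 0],   # adjacency matrix: A[i, j] = 1 iff edge (i, j)
              [1, 0, 1],
              [0, 1, 0]])
X = np.array([[1.0, 0.0],  # node feature matrix: row i = features of vertex i
              [0.0, 1.0],
              [0.5, 0.5]])

assert (A == A.T).all()    # undirected graph: symmetric adjacency
```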

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06N 3/04, G06N 3/08
CPC: G06N 3/084, G06N 3/045
Inventors: 李莹, 陈裕, 尹建伟, 邓水光
Owner: ZHEJIANG UNIV