Adversarial attack method and system for generating uniformly distributed perturbations with pulses as probabilities

An adversarial technology based on uniformly distributed perturbations, applied to neural learning methods, biological neural network models, instruments, etc. It addresses problems such as the large number of pixels, the severe limitations of generated samples that do not account for adversarial attacks, and insufficient utilization, achieving the effects of reduced complexity, improved imperceptibility, and improved performance.

Pending Publication Date: 2022-03-01
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

Research into current related work found that existing adversarial attack methods include traditional PGD and FGSM attacks, as well as newer attack strategies that add, remove, or invert spikes in the pulse sequence; however, these methods do not take into account the adversarial…


Image

Smart Image Click on the blue labels to locate them in the text.
Viewing Examples
Smart Image
  • Register attack method and system for generating uniformly distributed disturbance by taking pulse as probability
  • Register attack method and system for generating uniformly distributed disturbance by taking pulse as probability
  • Register attack method and system for generating uniformly distributed disturbance by taking pulse as probability


Embodiment Construction

[0074] The present invention is further described below in conjunction with the accompanying drawings and embodiments.

[0075] Referring to figure 1, the present invention provides an adversarial attack method that uses pulses as probabilities to generate uniformly distributed perturbations. As shown in figure 1, the method includes the following steps:

[0076] Step A: First preprocess the input image X, then use Poisson coding in the encoder F_enc of the spiking neural network to convert it into a sequence in pulse data format γ = (γ1, γ2, …, γT), where T is the simulation time step;

[0077] Step A1: Perform max-min normalization on the input data X of the spiking neural network, that is, normalize each original pixel value to x ∈ [0, 1];

[0078] Step A2: Set a time step T and, at each of the T time steps, emit a pulse with probability p = x, thereby generating a pulse sequence γ of length T.
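Steps A1 and A2 amount to Bernoulli-sampled (Poisson rate) coding: after normalization, each pixel fires at each time step with probability equal to its intensity. A minimal sketch in NumPy (the function name `poisson_encode` is illustrative, not from the patent):

```python
import numpy as np

def poisson_encode(x, T, rng=None):
    """Convert a normalized image x (values in [0, 1]) into a binary
    spike train of T time steps: at each step, each pixel fires with
    probability equal to its intensity (p = x)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    # One Bernoulli draw per pixel per time step.
    return (rng.random((T,) + x.shape) < x).astype(np.uint8)

# Example: a 2x2 "image" after max-min normalization.
img = np.array([[0.0, 1.0],
                [0.25, 0.9]])
spikes = poisson_encode(img, T=100, rng=0)
print(spikes.shape)         # (100, 2, 2)
# The empirical firing rate over the T steps approximates each
# pixel's intensity, which is how the image is recoverable from γ.
print(spikes.mean(axis=0))
```

A pixel with value 0 never fires and a pixel with value 1 fires at every step, so longer simulation windows T give a more faithful rate-coded representation at the cost of more computation.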

[0079] Step B: Based on the...



Abstract

The invention relates to an adversarial attack method for generating uniformly distributed perturbations with pulses as probabilities, comprising the following steps: preprocess a data set and convert it into the corresponding pulse sequences through Poisson coding; construct a substitute network corresponding to the data set and use it as the original classifier; input the pulse sequence produced by the original classifier's encoding, together with the original sample, into a filter to screen out the optimal pulses; input the selected pulses and the original image into an adversarial-sample generation network, which, based on the original classifier, generates noise obeying a uniform distribution with the pulses as probabilities, thereby producing an initial adversarial sample; and iteratively optimize this sample to obtain the final adversarial sample. The invention provides security-defense researchers with a low-cost tool for generating adversarial attacks on spiking neural networks.
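The abstract's noise-generation step can be sketched as follows. This is only an assumed reading: the function name `pulse_guided_uniform_noise`, the perturbation budget `eps`, and the specific choice of using each pixel's firing rate as its perturbation probability are illustrative, not taken from the patent text.

```python
import numpy as np

def pulse_guided_uniform_noise(x, spikes, eps=8 / 255, rng=None):
    """Sketch: treat the per-pixel firing rate of the encoded spike
    train as the probability that a pixel is perturbed, and draw the
    perturbation itself from a uniform distribution in [-eps, eps]."""
    rng = np.random.default_rng(rng)
    rate = spikes.mean(axis=0)                    # firing probability per pixel
    mask = rng.random(x.shape) < rate             # perturb with prob = rate
    delta = rng.uniform(-eps, eps, size=x.shape)  # uniformly distributed noise
    return np.clip(x + mask * delta, 0.0, 1.0)    # keep a valid image

# Demo with a toy image and a hand-made spike train:
# pixel 0 fires at every step, pixel 1 never fires.
x = np.array([[0.5, 0.0]])
spikes = np.stack([np.array([[1, 0]])] * 10)
x_adv = pulse_guided_uniform_noise(x, spikes, rng=0)
```

Under this reading, pixels that never fire are left untouched, which concentrates the uniformly distributed perturbation on high-activity pixels and keeps the overall change to the image small.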

Description

technical field

[0001] The invention relates to the fields of computer image recognition and malicious attacks, and in particular to an adversarial attack method and system for generating uniformly distributed perturbations with pulses as probabilities.

Background technique

[0002] The spiking neural network (SNN), as the third generation of neural networks, not only has broad prospects in brain-like simulation, but also offers the advantages of low power consumption and high performance in neuromorphic computing systems. In the field of image recognition, SNNs have demonstrated recognition performance no worse than that of deep neural networks (DNNs), and, because they simulate the processing style of the human brain, their energy consumption is far lower than that of DNNs. However, studies have shown that although SNNs are more robust than DNNs, carefully constructed adversarial samples, produced by adding perturbations invisible to the human eye to the original image, can still…

Claims


Application Information

IPC(8): G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/049, G06N3/084, G06N3/047, G06F18/2415, G06F18/214
Inventors: 刘西蒙, 林璇威, 董晨, 程栋
Owner FUZHOU UNIV