
Adversarial attack and defense method and system based on prediction correction and stochastic step size optimization

A prediction-correction and adversarial-attack technology, applied in machine learning and computing models, that addresses the problems that existing methods cannot guarantee an optimal perturbation magnitude for the generated adversarial samples and therefore cannot accurately evaluate the robustness of machine learning models or the effectiveness of adversarial defense methods, achieving a higher attack success rate and improved model robustness.

Publication status: Inactive
Publication date: 2021-06-25
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

This is mainly caused by two factors. First, because deep neural networks are complex and highly nonlinear, the loss value of the adversarial sample generated by adding a perturbation does not necessarily increase strictly along the gradient direction. Second, the step size of each iteration determines the magnitude of the perturbation, yet in practice neither a fixed step size nor an adaptive step size can guarantee the optimal perturbation magnitude, i.e., the perturbation for which the generated adversarial sample attains the largest loss value.

Therefore, existing techniques cannot accurately evaluate the robustness of machine learning models or the effectiveness of adversarial defense methods.
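The step-size issue can be seen in a toy one-dimensional sketch (PyTorch); the loss function, starting point, and step sizes below are illustrative assumptions and are not taken from the patent:

```python
import torch

# Toy nonlinear "loss" and a starting point; purely illustrative.
x = torch.tensor([0.7], requires_grad=True)
loss_fn = lambda z: torch.sin(3.0 * z) + 0.5 * z

loss_fn(x).sum().backward()
direction = x.grad.sign()          # gradient-ascent (attack) direction

fixed = 0.3                        # a fixed step size
for a in (fixed, 0.225, 0.15, 0.075):
    with torch.no_grad():
        print(f"step {a:.3f} -> loss {loss_fn(x + a * direction).item():.4f}")
# The fixed step (0.3) overshoots the local maximum of this nonlinear loss,
# while several smaller steps reach a larger loss value, so a single fixed
# step size does not maximize the loss of the perturbed sample.
```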



Examples


Embodiment

[0037] As shown in Figure 1, this embodiment performs adversarial attack and defense based on prediction correction and a random step size optimization strategy, and mainly involves the following two techniques. 1) Adversarial attack based on prediction correction and random step size optimization: the adversarial sample generated by an existing method is treated as a predicted sample, and the current perturbation is corrected using the gradient of the loss function with respect to the predicted sample. At the same time, a random step size is introduced when generating adversarial samples; the loss values of the samples obtained with the fixed step size and with the random step size are compared, and the sample with the larger loss value is selected as the adversarial sample. 2) Defense based on prediction correction and random step size optimization: the adversarial samples generated by the above attack are used to conduct adversarial training on the machine learning model, thereby improving its robustness against various adversarial attacks.
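A minimal PyTorch sketch of one iteration of the attack described above, assuming an L-infinity constraint, inputs in [0, 1], and a cross-entropy loss; the function names, clipping scheme, and the hyperparameters eps and alpha are illustrative assumptions rather than the patent's exact formulation:

```python
import torch
import torch.nn.functional as F

def attack_step(model, x_adv, x_orig, y, eps=8/255, alpha=2/255):
    def grad_of_loss(x):
        # Gradient of the classification loss with respect to the input.
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), y)
        return torch.autograd.grad(loss, x)[0]

    def clip(x):
        # Keep the perturbation inside the eps ball and the valid pixel range.
        return torch.clamp(torch.min(torch.max(x, x_orig - eps), x_orig + eps), 0, 1)

    # 1) Prediction: take an ordinary gradient-ascent (PGD-style) step.
    x_pred = clip(x_adv + alpha * grad_of_loss(x_adv).sign())
    # 2) Correction: re-evaluate the gradient at the predicted sample and use
    #    it to correct the update direction for the current perturbation.
    g_corr = grad_of_loss(x_pred)

    # 3) Candidate with the fixed step size vs. candidate with a random step size.
    x_fixed = clip(x_adv + alpha * g_corr.sign())
    rand_alpha = alpha * torch.rand(1, device=x_adv.device)
    x_rand = clip(x_adv + rand_alpha * g_corr.sign())

    # 4) Keep whichever candidate attains the larger loss (batch-level here;
    #    a per-sample comparison is an equally valid reading).
    with torch.no_grad():
        loss_fixed = F.cross_entropy(model(x_fixed), y)
        loss_rand = F.cross_entropy(model(x_rand), y)
    return (x_fixed if loss_fixed >= loss_rand else x_rand).detach()
```

In a full attack this step would be iterated for a fixed number of rounds starting from the original (or randomly perturbed) input.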



Abstract

The invention discloses an adversarial attack and defense method and system based on prediction correction and stochastic step size optimization. The method comprises the following steps: inputting a training data set and a machine learning model; training the machine learning model on the input training data set; judging whether the loss function has converged; if the loss function has not converged, generating adversarial samples with the adversarial attack based on prediction correction and stochastic step size optimization, and using the adversarial samples together with the original data as the training data set to train the machine learning model until the loss function converges, thereby obtaining a trained machine learning model; and if the loss function has converged, directly outputting the result. Because the adversarial samples are generated by this adversarial attack, a higher attack success rate can be achieved under the same perturbation constraint, so the method can be used to evaluate the performance of a machine learning model and the effectiveness of an adversarial defense method; adversarial training of the machine learning model on the generated adversarial samples allows various adversarial attacks to be effectively resisted and improves the robustness of the model.
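A minimal sketch of the train-until-convergence loop described in the abstract, assuming a PyTorch classifier; the optimizer, the convergence test, and the generate_adversarial() helper (which could, for example, iterate the attack step sketched earlier) are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def adversarial_training(model, loader, generate_adversarial,
                         epochs=10, loss_tol=1e-3, lr=1e-3):
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    prev_loss = float("inf")
    for _ in range(epochs):
        running = 0.0
        for x, y in loader:
            # Generate adversarial samples against the current model.
            x_adv = generate_adversarial(model, x, y)
            # Train on the original data and the adversarial samples together.
            inputs, targets = torch.cat([x, x_adv]), torch.cat([y, y])
            loss = F.cross_entropy(model(inputs), targets)
            opt.zero_grad()
            loss.backward()
            opt.step()
            running += loss.item()
        running /= len(loader)
        if abs(prev_loss - running) < loss_tol:   # treat the loss as converged
            break
        prev_loss = running
    return model
```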

Description

Technical field

[0001] The invention relates to the field of artificial intelligence and machine learning, and in particular to an adversarial attack and defense method and system based on prediction correction and random step size optimization.

Background technique

[0002] As deep learning has achieved remarkable results in many fields such as data mining, computer vision, natural language processing, and autonomous driving, the robustness and stability of deep neural networks have attracted more and more attention. However, recent studies have confirmed that almost all machine learning models are vulnerable to adversarial examples. An attacker can obtain an adversarial sample by adding a small perturbation to an original input sample; to a human observer the perturbed adversarial sample and the original sample have the same category or attributes, yet the adversarial sample misleads the neural network model into producing a wrong prediction, which raises serious security issues...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N20/00
CPC: G06N20/00
Inventor: 黄方军, 万晨
Owner: SUN YAT SEN UNIV