Adversarial sample generation method and device

An adversarial-sample generation and iteration technology, applied in the computer field, which addresses the problems of weak attacks that are easily blocked and achieves a stronger attack with a better attack effect.

Active Publication Date: 2020-10-02
ALIPAY (HANGZHOU) INFORMATION TECH CO LTD


Problems solved by technology

[0003] In the generation methods of adversarial samples in the prior art, small high-frequency perturbations are often added to the original image to generate adversarial samples.




Example Embodiment

[0051] The following describes the solutions provided in this specification with reference to the drawings.

[0052] Figure 1 is a schematic diagram of an implementation scenario of an embodiment disclosed in this specification. The scenario involves the generation of adversarial examples. Referring to Figure 1, an image recognition model is used to classify an input image. The original image belongs to category A. After interference is added to the original image, an adversarial sample is obtained. Because the interference is relatively small and imperceptible to the human eye, the adversarial sample still belongs to category A in the eyes of a human observer; however, when the adversarial sample is input into the image recognition model, the recognition result of the model is category B. This attack method, which deliberately adds interference to input samples so that the model gives an incorrect output with high confidence, is called an adversarial attack. ...
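As a concrete illustration of the scenario above, the following sketch checks whether a small perturbation flips a classifier's prediction from one category to another. It assumes a PyTorch image classifier `model` and a caller-supplied perturbation tensor; this is only an illustrative check, not the generation method claimed in this specification.

```python
# Illustrative sketch (assumption: a PyTorch classifier); checks whether a
# small, visually imperceptible perturbation changes the predicted category.
import torch

def is_adversarial(model: torch.nn.Module,
                   original: torch.Tensor,      # shape (1, C, H, W), pixels in [0, 1]
                   perturbation: torch.Tensor,  # same shape as `original`
                   eps: float = 8 / 255) -> bool:
    """Return True if the perturbed image is classified differently from the original."""
    model.eval()
    with torch.no_grad():
        label_original = model(original).argmax(dim=1)
        # Clamp the perturbation so it stays small enough to be imperceptible.
        adv = (original + perturbation.clamp(-eps, eps)).clamp(0.0, 1.0)
        label_adv = model(adv).argmax(dim=1)
    return bool((label_original != label_adv).item())
```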



Abstract

Embodiments of the invention provide an adversarial sample generation method and device. The method comprises the steps of: obtaining a to-be-enhanced current adversarial sample in the current iteration; performing, in the decreasing direction of the target loss function, a first preset geometric deformation on the current adversarial sample to obtain a deformed image; executing a second, pixel-by-pixel update on the deformed image to obtain a first adversarial sample; performing a third, pixel-by-pixel update on the current adversarial sample to obtain a second adversarial sample; determining, of the first adversarial sample and the second adversarial sample, the one with the smaller corresponding loss value as the updated adversarial sample; when the iteration stopping condition is met, taking the updated adversarial sample as the final adversarial sample; and when the iteration stopping condition is not met, performing the next round of iteration based on the updated adversarial sample. The generated adversarial sample has a stronger attack capability, so that targeted defense can be realized.
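A minimal sketch of the iteration described above follows. The helpers `geometric_deform`, `pixelwise_update`, `loss_fn`, and `stop_condition` are hypothetical placeholders for the preset geometric deformation, the pixel-by-pixel updating, the target loss function, and the iteration stopping condition; their concrete forms are given by the embodiments, not by this sketch.

```python
# Hedged sketch of one reading of the claimed iteration: in each round, build two
# candidates (deform-then-update, and update-only) and keep the one with the
# smaller target loss as the updated adversarial sample.
import torch

def generate_adversarial(x0: torch.Tensor,
                         loss_fn,            # target loss; smaller = stronger attack
                         geometric_deform,   # hypothetical: deform along the loss-decreasing direction
                         pixelwise_update,   # hypothetical: pixel-by-pixel update step
                         stop_condition,     # hypothetical: iteration stopping condition
                         max_iters: int = 50) -> torch.Tensor:
    adv = x0.clone()                                     # current adversarial sample
    for _ in range(max_iters):
        deformed = geometric_deform(adv, loss_fn)        # first: preset geometric deformation
        first = pixelwise_update(deformed, loss_fn)      # second: update the deformed image
        second = pixelwise_update(adv, loss_fn)          # third: update the current sample
        # Keep whichever candidate attains the smaller loss value.
        adv = first if loss_fn(first) < loss_fn(second) else second
        if stop_condition(adv):                          # stop when the condition is met
            break
    return adv                                           # final adversarial sample
```

Carrying the lower-loss candidate into the next round is what drives the target loss downward across iterations, which is the stated source of the stronger attack effect.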

Description

Technical field
[0001] One or more embodiments of this specification relate to the computer field, and in particular to a method and an apparatus for generating an adversarial example.
Background technique
[0002] With the large-scale application of image recognition models, attacks against image recognition models emerge one after another. It is necessary to keep research up to date in order to discover potential attack methods and prevent dangers before they occur. Among the many attack methods, the adversarial attack is a new type of attack with strong attack capability. Adversarial attacks obtain adversarial samples by intentionally adding interference to input samples, and these adversarial samples cause the image recognition model to give a wrong output with high confidence.
[0003] In the generation methods of adversarial samples in the prior art, small high-frequency perturbations are often added to the original image to generate adversarial samples. Such adversarial samples...
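For context on the prior art referred to in paragraph [0003], a representative way to add a small high-frequency perturbation is a single-step gradient-sign update (FGSM-style); the specification does not name a specific prior method, so this is only an assumed example.

```python
# Assumed prior-art-style perturbation (FGSM-like, not the claimed method):
# move every pixel a small step in the direction that increases the model's loss.
import torch
import torch.nn.functional as F

def gradient_sign_perturb(model: torch.nn.Module,
                          image: torch.Tensor,       # shape (1, C, H, W), pixels in [0, 1]
                          true_label: torch.Tensor,  # shape (1,)
                          eps: float = 8 / 255) -> torch.Tensor:
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # The sign of the gradient yields a small, high-frequency perturbation.
    return (image + eps * image.grad.sign()).clamp(0.0, 1.0).detach()
```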


Application Information

IPC(8): G06F21/55; G06K9/00; G06K9/62
CPC: G06F21/55; G06V40/168; G06F18/214
Inventors: 傅驰林, 黄启印, 周俊, 张晓露
Owner: ALIPAY (HANGZHOU) INFORMATION TECH CO LTD