
Backdoor attack method of video analysis neural network model

A backdoor attack method for video analysis neural network models, applied in the field of neural network security. It addresses the problem that, in existing methods, the original features of a video interfere with the model and reduce the success rate of neural network backdoor attacks; the method achieves a higher attack success rate, strong practical operability, and reduced interference with the backdoor.

Active Publication Date: 2020-06-09
FUDAN UNIV

AI Technical Summary

Problems solved by technology

In current backdoor attack methods, under these harsh conditions the original features of the video strongly interfere with the model, making it difficult for the model to capture the information of the backdoor pattern, which greatly reduces the success rate of backdoor attacks on video neural networks.



Embodiment Construction

[0048] Step 1: Pre-train a clean model. Given the attacker's dataset D and a neural network model structure NN, train NN on D using the conventional training method for that structure (different model structures require different training procedures) to obtain a well-trained clean model M, such that M achieves normal prediction accuracy on the clean dataset.
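The pre-training step above can be sketched as follows. This is a minimal illustration only: a toy softmax classifier trained by gradient descent stands in for the network NN, and the random data stands in for the attacker's dataset D; the patent assumes whatever conventional training method matches the chosen architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the attacker's dataset D: 200 flattened samples, 3 classes.
X = rng.normal(size=(200, 64)).astype(np.float32)
y = rng.integers(0, 3, size=200)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_clean_model(X, y, n_classes=3, lr=0.1, epochs=50):
    """Conventional training loop producing the clean model M (here: a linear
    softmax classifier trained with full-batch gradient descent)."""
    W = np.zeros((X.shape[1], n_classes), dtype=np.float32)
    onehot = np.eye(n_classes, dtype=np.float32)[y]
    for _ in range(epochs):
        probs = softmax(X @ W)
        grad = X.T @ (probs - onehot) / len(X)  # cross-entropy gradient
        W -= lr * grad
    return W

W = train_clean_model(X, y)
acc = (softmax(X @ W).argmax(axis=1) == y).mean()
print(f"clean training accuracy: {acc:.2f}")
```

In practice M would be a video model (e.g. a 3D CNN) trained with the recipe appropriate to its architecture; only the overall shape of the step is shown here.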

[0049] Step 2: Initialize the backdoor pattern. Specify the size, shape, and position of the backdoor within the video frame, i.e., the mask of the backdoor pattern. Then assign legal initial pixel values inside the backdoor region using random initialization, constant initialization, Gaussian initialization, uniform initialization, or a similar method, yielding the initial backdoor pattern trigger.
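The mask and trigger initialization might look like this sketch. The frame geometry, the bottom-right patch location, and the patch size are illustrative assumptions, not values from the patent; three of the initialization schemes the step lists are shown side by side.

```python
import numpy as np

FRAME_H, FRAME_W, CHANNELS = 112, 112, 3  # assumed frame geometry
PATCH = 16                                # assumed backdoor patch size

# mask: 1 inside the backdoor region, 0 elsewhere (here: bottom-right corner).
mask = np.zeros((FRAME_H, FRAME_W, CHANNELS), dtype=np.float32)
mask[-PATCH:, -PATCH:, :] = 1.0

# trigger: legal initial pixel values (in [0, 1]) inside the backdoor region,
# via the initialization schemes the step enumerates.
rng = np.random.default_rng(42)
trigger_uniform = rng.uniform(0.0, 1.0, size=mask.shape).astype(np.float32) * mask
trigger_gaussian = np.clip(
    rng.normal(0.5, 0.2, size=mask.shape), 0.0, 1.0
).astype(np.float32) * mask
trigger_constant = np.full(mask.shape, 0.5, dtype=np.float32) * mask

trigger = trigger_uniform  # pick one scheme as the initial backdoor pattern
print(trigger.shape, float(mask.sum()))
```

Multiplying each candidate by the mask keeps the trigger zero outside the backdoor region, so only the chosen patch ever carries pattern pixels.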

[0050] Step 3: Insert a backdoor pattern into the original video. Take out the video samples in the dat...
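Although the step text is truncated here, a common way to blend a trigger into video frames (as in BadNets-style data poisoning) is a per-pixel mask blend. The clip length, frame size, trigger values, and target label below are illustrative assumptions, not the patent's exact procedure.

```python
import numpy as np

def poison_video(video, mask, trigger):
    """Blend the backdoor pattern into every frame of the clip:
    poisoned = (1 - mask) * frame + mask * trigger."""
    return (1.0 - mask) * video + mask * trigger

rng = np.random.default_rng(0)
T, H, W, C = 8, 112, 112, 3            # assumed clip length and frame size
video = rng.uniform(0, 1, size=(T, H, W, C)).astype(np.float32)

mask = np.zeros((H, W, C), dtype=np.float32)
mask[-16:, -16:, :] = 1.0              # same bottom-right patch as in Step 2
trigger = np.full((H, W, C), 0.9, dtype=np.float32) * mask

poisoned = poison_video(video, mask, trigger)
target_label = 0                       # attack target value (assumed)
print(poisoned.shape)
```

The (H, W, C) mask broadcasts across the time axis, so one mask/trigger pair poisons every frame of the clip while pixels outside the patch are left untouched.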


Abstract

The invention belongs to the technical field of neural network security, and particularly relates to a backdoor attack method for a video analysis neural network model. For the more severe backdoor attack implementation environments of video, such as a high sample dimension, a high frame resolution, and a sparse data set, a video backdoor pollution-sample construction framework is used to carry out a backdoor attack on a video analysis neural network model. The framework comprises three parts: a task-oriented, model-high-sensitivity video backdoor pattern generation algorithm; a feature-fuzzy, model-low-sensitivity anti-noise video sample generation algorithm; and a pollution-sample generation and attack algorithm. Gradient information is introduced from two aspects, the backdoor pattern and the original sample, to establish an association between the attack target value and the backdoor pattern. The method provided by the invention has the advantages of a high attack success rate, high secrecy, good robustness, and good expansibility, and generalizes very well across video analysis neural network models.

Description

Technical field [0001] The invention belongs to the technical field of neural network security, and in particular relates to a backdoor attack method for a video analysis neural network model. Background technique [0002] Deep neural networks are currently widely used in image recognition, natural language processing, video analysis, and other fields. Despite this great success, studies in recent years have found that neural networks are extremely vulnerable to backdoor attacks due to their low transparency and poor interpretability, which poses a serious security risk for face recognition, autonomous driving, and medical diagnosis. A backdoor attack is a kind of dataset-pollution attack: the attacker inserts a backdoor pattern into the victim's training-set samples in a specific way; when the victim trains, an association is established between the backdoor pattern and the attack target value, and the model remembers this ba...

Claims


Application Information

IPC (IPC(8)): G06N3/08; G06K9/00; G06K9/62
CPC: G06N3/08; G06V20/40; G06F18/214; G06N3/048; G06N3/044; G06N3/045; Y02T10/40; G06F21/55; G06F2221/034; G06N7/01
Inventors: Jiang Yugang (姜育刚), Zhao Shihao (赵世豪)
Owner FUDAN UNIV