Adversarial example generation method based on belief attack and salient region perturbation limitation
An adversarial example generation and region perturbation technology, applied in character and pattern recognition, biological neural network models, instruments, etc. It solves problems such as perceptible perturbations and low transferability, achieving the effects of improving visual quality, improving transferability, and reducing the adversarial perturbation.
Embodiment Construction
[0050] The present invention is described in further detail below in conjunction with the embodiments:
[0051] As shown in Figure 1, the main content of the present invention is an adversarial example generation method based on belief attack and salient region perturbation limitation, which can be used to detect vulnerabilities of a DNN model and serve as an evaluation index of DNN model security, thereby improving the robustness and safety of the DNN model.
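The salient region perturbation limitation named above is not detailed in this excerpt; the following is a minimal sketch of one plausible reading, restricting a bounded perturbation to a saliency mask before adding it to the image (the function and parameter names are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def limit_perturbation(image, delta, saliency_mask, eps):
    """Clip the perturbation to [-eps, eps], zero it outside the salient
    region, then add it to the image and clip to the valid pixel range.

    image, delta:   float arrays of the same shape, values in [0, 1]
    saliency_mask:  array of 0/1 weights marking the salient region
    eps:            L-infinity bound on the perturbation
    """
    delta = np.clip(delta, -eps, eps) * saliency_mask
    return np.clip(image + delta, 0.0, 1.0)
```

Confining the perturbation to salient pixels leaves the non-salient background untouched, which is one way such a limit could improve the visual quality of the adversarial example.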
[0052] In order to make the technical method of the present invention clear, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments.
[0053] Step S1: Provide the original images, which serve as the training data of the DNN model.
[0054] In this embodiment, the original images come from the ImageNet validation set, from which 1000 images of different categories are selected, almost all of which can be correctly classified by the ...
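The selection in [0054], keeping only validation images that the model already classifies correctly, one per class, can be sketched as follows (a hedged illustration; `predict`, the sample format, and the per-class cap are assumptions, not specified in the patent):

```python
def select_correctly_classified(predict, samples, num_classes):
    """Pick at most one correctly classified sample per class.

    predict:     callable mapping an image to its predicted label
    samples:     iterable of (image, label) pairs
    num_classes: number of distinct classes to collect
    """
    chosen = {}
    for image, label in samples:
        # Keep the first sample of each class that the model gets right.
        if label not in chosen and predict(image) == label:
            chosen[label] = (image, label)
        if len(chosen) == num_classes:
            break
    return list(chosen.values())
```

Filtering out misclassified images first is standard practice when evaluating attacks: an image the model already gets wrong needs no adversarial perturbation.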