How does the Memorization of Neural Networks Impact Adversarial Robust Models?
Abstract
Supplementary Material
- Download (2.88 MB)
Information
Published In
- General Chairs:
- Ambuj Singh,
- Yizhou Sun,
- Program Chairs:
- Leman Akoglu,
- Dimitrios Gunopulos,
- Xifeng Yan,
- Ravi Kumar,
- Fatma Ozcan,
- Jieping Ye
Publisher
Association for Computing Machinery
New York, NY, United States
Qualifiers
- Research-article
Funding Sources
- NSF
- Army Research Office (ARO)