May 26, 2023, 1:19 a.m. | Salah Ghamizi, Jingfeng Zhang, Maxime Cordy, Mike Papadakis, Masashi Sugiyama, Yves Le Traon

cs.CR updates on arXiv.org

While leveraging additional training data is a well-established way to improve
adversarial robustness, it incurs the unavoidable costs of data collection and
the heavy computation needed to train models. To mitigate these costs, we propose
Guided Adversarial Training (GAT), a novel adversarial training technique that
exploits auxiliary tasks under a limited set of training data. Our approach
extends single-task models into multi-task models during the min-max
optimization of adversarial training, and drives the loss optimization with a
regularization of the gradient curvature …
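The abstract describes two moving parts: the min-max loop of adversarial training (inner maximization crafts a perturbation, outer minimization updates the model) and a multi-task extension in which an auxiliary head shares features with the main task. The paper's exact formulation, including the gradient-curvature regularizer, is not reproduced here; the following is only an illustrative NumPy sketch of the min-max multi-task idea, using a toy linear model, an FGSM-style inner step, and hypothetical data and hyperparameters.

```python
import numpy as np

# Illustrative sketch (NOT the paper's method): two linear heads on shared
# inputs, trained with an FGSM-style inner maximization followed by an
# outer minimization of the combined main + auxiliary loss. All names,
# data, and hyperparameters below are made up for demonstration.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: main labels y and a related auxiliary attribute y_aux.
X = rng.normal(size=(64, 5))
w_true = rng.normal(size=5)
y = (X @ w_true > 0).astype(float)
y_aux = (X[:, 0] > 0).astype(float)

w_main = np.zeros(5)   # main-task head
w_aux = np.zeros(5)    # auxiliary-task head
eps, lr = 0.1, 0.5     # perturbation budget and learning rate (arbitrary)

for step in range(200):
    # Inner maximization: FGSM step, ascending the main loss w.r.t. inputs.
    p = sigmoid(X @ w_main)
    grad_x = np.outer(p - y, w_main)      # d(BCE)/dX for a logistic head
    X_adv = X + eps * np.sign(grad_x)

    # Outer minimization: descend the joint multi-task loss on X_adv.
    p_main = sigmoid(X_adv @ w_main)
    p_aux = sigmoid(X_adv @ w_aux)
    w_main -= lr * (X_adv.T @ (p_main - y)) / len(y)
    w_aux -= lr * (X_adv.T @ (p_aux - y_aux)) / len(y)

clean_acc = float(np.mean((sigmoid(X @ w_main) > 0.5) == y.astype(bool)))
print(f"clean accuracy: {clean_acc:.2f}")
```

In GAT the auxiliary task is meant to guide the shared representation when training data is scarce; here the two heads only share the input, which is the simplest stand-in for that coupling.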

