March 15, 2023, 1:10 a.m. | Fu Wang, Yanghao Zhang, Yanbin Zheng, Wenjie Ruan

cs.CR updates on arXiv.org

Adversarial training is an effective but time-consuming way to train robust
deep neural networks that can withstand strong adversarial attacks. In
response to this inefficiency, we propose Dynamic Efficient Adversarial
Training (DEAT), which gradually increases the number of adversarial
iterations during training. We demonstrate that the magnitude of the gradient
correlates with the curvature of the trained model's loss landscape, so it can
reflect the effect of adversarial training. Therefore, based on the magnitude
of the gradient, we propose a general acceleration strategy, …
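
The truncated abstract already outlines the mechanism: begin adversarial training with a small number of inner-loop attack iterations and add more as the magnitude of the gradient (used as a proxy for the curvature of the loss landscape) suggests the current budget is no longer sufficient. The following is a minimal PyTorch sketch of that idea, not the authors' method: it assumes a standard L-infinity PGD inner loop, and the function names (pgd_attack, deat_like_training), the grow_factor parameter, and the rule for when to add an iteration are illustrative placeholders, since the paper's exact acceleration criterion is cut off in this excerpt.

# Sketch of a DEAT-like schedule: grow the PGD step count when the running
# parameter-gradient magnitude exceeds the magnitude recorded at the last
# increase. The threshold rule is a hypothetical stand-in.
import torch
import torch.nn.functional as F


def pgd_attack(model, x, y, eps, alpha, steps):
    """Standard L-inf PGD: `steps` iterations of signed-gradient ascent."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()


def deat_like_training(model, loader, optimizer, epochs,
                       eps=8 / 255, alpha=2 / 255, grow_factor=1.0):
    steps = 1                # start cheap: a single adversarial iteration
    ref_grad_norm = None     # gradient magnitude at the last increase
    for _ in range(epochs):
        epoch_grad_norm, batches = 0.0, 0
        for x, y in loader:
            x_adv = pgd_attack(model, x, y, eps, alpha, steps)
            optimizer.zero_grad()
            loss = F.cross_entropy(model(x_adv), y)
            loss.backward()
            # Track the parameter-gradient magnitude, which the abstract
            # links to the curvature of the loss landscape.
            epoch_grad_norm += sum(p.grad.norm().item()
                                   for p in model.parameters()
                                   if p.grad is not None)
            batches += 1
            optimizer.step()
        avg_norm = epoch_grad_norm / max(batches, 1)
        if ref_grad_norm is None:
            ref_grad_norm = avg_norm
        elif avg_norm > grow_factor * ref_grad_norm:
            steps += 1       # hypothetical trigger for adding an iteration
            ref_grad_norm = avg_norm
    return model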
