Clip Body and Tail Separately: High Probability Guarantees for DPSGD with Heavy Tails
May 29, 2024, 4:12 a.m. | Haichao Sha, Yang Cao, Yong Liu, Yuncheng Wu, Ruixuan Liu, Hong Chen
cs.CR updates on arXiv.org (arxiv.org)
Abstract: Differentially Private Stochastic Gradient Descent (DPSGD) is widely used to preserve training-data privacy in deep learning: it first clips the gradients to a predefined norm and then injects calibrated noise into the training procedure. Existing DPSGD works typically assume the gradients follow sub-Gaussian distributions and design various clipping mechanisms to optimize training performance. However, recent studies have shown that gradients in deep learning exhibit a heavy-tail phenomenon, that is, the tails of the …
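As background for the abstract, the sketch below shows the standard DPSGD update it describes: clip each per-sample gradient to a fixed norm, average, and inject calibrated Gaussian noise. This is a minimal illustrative sketch of the textbook mechanism, not the paper's separate body/tail clipping scheme; the function name, parameters, and noise calibration are assumptions for illustration.

```python
import numpy as np

def dpsgd_step(per_sample_grads, params, clip_norm, noise_multiplier, lr, rng):
    """One hypothetical DPSGD update (sketch, not the paper's method).

    Clips each per-sample gradient to clip_norm, averages the clipped
    gradients, adds Gaussian noise calibrated to the clipping norm
    (the sensitivity), and takes a gradient step.
    """
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    batch_size = len(clipped)
    mean_grad = np.mean(clipped, axis=0)
    # Noise standard deviation scales with clip_norm and shrinks with batch size.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / batch_size,
                       size=mean_grad.shape)
    return params - lr * (mean_grad + noise)

# Usage sketch with synthetic gradients.
rng = np.random.default_rng(0)
params = np.zeros(10)
grads = [rng.normal(size=10) for _ in range(32)]
params = dpsgd_step(grads, params, clip_norm=1.0, noise_multiplier=1.1,
                    lr=0.1, rng=rng)
```

Note the sub-Gaussian assumption the abstract questions enters through the clipping step: with heavy-tailed gradients, a single clip_norm either truncates informative tail gradients or forces a large noise scale, which is the tension motivating separate treatment of body and tail.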