March 3, 2023, 2:10 a.m. | Xin Gu, Gautam Kamath, Zhiwei Steven Wu

cs.CR updates on arXiv.org

Differentially private stochastic gradient descent (DP-SGD) privatizes model training
by injecting noise into each iteration, where the noise magnitude increases
with the number of model parameters. Recent works suggest that this noise can be
reduced by leveraging public data for private machine learning: gradients are
projected onto a lower-dimensional subspace prescribed by the public data before
noise is added. However, given a choice of public datasets, it is not a priori
clear which one may be most appropriate for the private task. We give an algorithm …
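The excerpt cuts off before describing the paper's algorithm, so that gap stays as-is. The sketch below only illustrates the projection idea the abstract summarizes, under the assumption that the public subspace is the span of the top-k singular vectors of public per-example gradients; the names `public_subspace`, `projected_dp_sgd_step`, `clip_norm`, and `noise_mult` are illustrative, not from the paper.

```python
import numpy as np

def public_subspace(public_grads, k):
    # public_grads: (n_public, d) matrix of per-example gradients on public
    # data. The top-k right singular vectors span the projection subspace.
    _, _, vt = np.linalg.svd(public_grads, full_matrices=False)
    return vt[:k]  # shape (k, d), rows orthonormal

def projected_dp_sgd_step(params, private_grads, basis, clip_norm,
                          noise_mult, lr, rng):
    # Clip each per-example gradient to bound its sensitivity.
    norms = np.linalg.norm(private_grads, axis=1, keepdims=True)
    clipped = private_grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    # Project the summed gradient into the k-dimensional public subspace.
    coords = basis @ clipped.sum(axis=0)  # shape (k,)
    # Gaussian noise is added in k dimensions instead of d.
    coords = coords + rng.normal(0.0, noise_mult * clip_norm, size=coords.shape)
    # Map back to parameter space and take an averaged gradient step.
    noisy_grad = (basis.T @ coords) / len(private_grads)
    return params - lr * noisy_grad

# Toy usage with synthetic gradients.
rng = np.random.default_rng(0)
d, k = 1000, 20
basis = public_subspace(rng.normal(size=(200, d)), k)
params = np.zeros(d)
params = projected_dp_sgd_step(params, rng.normal(size=(64, d)), basis,
                               clip_norm=1.0, noise_mult=1.1, lr=0.1, rng=rng)
```

Because the Gaussian noise lives in k dimensions rather than d, its expected norm scales like √k instead of √d, which is the noise reduction the abstract refers to.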

