Feb. 3, 2023, 2:10 a.m. | Lingxiao Wang, Bargav Jayaraman, David Evans, Quanquan Gu

cs.CR updates on arXiv.org arxiv.org

While many solutions for privacy-preserving convex empirical risk
minimization (ERM) have been developed, privacy-preserving nonconvex ERM
remains a challenge. We study nonconvex ERM, which takes the form of minimizing
a finite-sum of nonconvex loss functions over a training set. We propose a new
differentially private stochastic gradient descent algorithm for nonconvex ERM
that achieves strong privacy guarantees efficiently, and provide a tight
analysis of its privacy and utility guarantees, as well as its gradient
complexity. Our algorithm reduces gradient complexity …
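The paper's exact algorithm (and its variance-reduction details) is truncated above, but the general template it builds on, differentially private SGD with per-example gradient clipping and Gaussian noise, can be sketched as follows. The loss, step sizes, and noise level here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def dp_sgd(grad_fn, theta0, data, *, lr=0.1, clip=1.0, sigma=0.1,
           batch_size=10, steps=300, seed=0):
    """Generic DP-SGD template: clip each per-example gradient to L2
    norm <= clip, add Gaussian noise scaled by sigma * clip, and take
    an averaged gradient step. A sketch, not the paper's algorithm."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    n = len(data)
    for _ in range(steps):
        batch = data[rng.choice(n, size=batch_size, replace=False)]
        # Per-example gradients, clipped so each has L2 norm <= clip.
        grads = np.stack([grad_fn(theta, x) for x in batch])
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / clip)
        # Gaussian noise calibrated to the clipping bound.
        noise = rng.normal(0.0, sigma * clip, size=theta.shape)
        theta = theta - lr * (grads.sum(axis=0) + noise) / batch_size
    return theta

# Toy nonconvex per-example loss f(theta; x) = 1 - cos(theta - x),
# whose gradient is sin(theta - x); data clusters near zero.
data = np.linspace(-0.3, 0.3, 50).reshape(-1, 1)
theta = dp_sgd(lambda th, x: np.sin(th - x), np.array([2.5]), data)
```

The clipping step bounds each example's influence on the update, which is what makes the Gaussian noise sufficient for a differential privacy guarantee; the paper's contribution concerns tightening the privacy/utility analysis and reducing the gradient complexity of this kind of scheme.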

