How to Make the Gradients Small Privately: Improved Rates for Differentially Private Non-Convex Optimization
Feb. 20, 2024, 5:11 a.m. | Andrew Lowy, Jonathan Ullman, Stephen J. Wright
cs.CR updates on arXiv.org
Abstract: We provide a simple and flexible framework for designing differentially private algorithms to find approximate stationary points of non-convex loss functions. Our framework is based on using a private approximate risk minimizer to "warm start" another private algorithm for finding stationary points. We use this framework to obtain improved, and sometimes optimal, rates for several classes of non-convex loss functions. First, we obtain improved rates for finding stationary points of smooth non-convex empirical loss functions. …
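The framework described in the abstract has two phases: a private approximate risk minimizer produces a "warm start," and a second private algorithm searches for an approximate stationary point from that initialization. A minimal sketch of the idea, with noisy gradient descent standing in for both phases (the paper's actual algorithms and its noise calibration are not reproduced here; `sigma`, the step counts, and the toy loss are all illustrative assumptions):

```python
import numpy as np

def noisy_gd(grad, w0, steps, lr, sigma, rng):
    """Noisy gradient descent: Gaussian noise added to each gradient step.
    In a real DP deployment, sigma must be calibrated to (eps, delta)
    via the gradient sensitivity; here it is a fixed illustrative value."""
    w = w0.copy()
    for _ in range(steps):
        g = grad(w) + sigma * rng.standard_normal(w.shape)
        w = w - lr * g
    return w

def warm_start_private_search(grad, w0, rng,
                              phase1_steps=200, phase2_steps=200,
                              lr=0.1, sigma=0.01):
    # Phase 1 (assumption: any DP approximate risk minimizer would do):
    # produce a warm start that already has small excess risk.
    w_warm = noisy_gd(grad, w0, phase1_steps, lr, sigma, rng)
    # Phase 2: run a private stationary-point search from the warm start,
    # so it begins in a region where gradients are already small.
    return noisy_gd(grad, w_warm, phase2_steps, lr, sigma, rng)

# Toy non-convex loss: f(w) = (||w||^2 - 1)^2, gradient 4(||w||^2 - 1)w.
grad = lambda w: 4.0 * (w @ w - 1.0) * w

rng = np.random.default_rng(0)
w = warm_start_private_search(grad, np.array([2.0, 0.0]), rng)
print(np.linalg.norm(grad(w)))  # gradient norm at the returned point
```

The split matters because the second phase's utility guarantee typically depends on the quality of its starting point; privately paying for a good initialization once can improve the overall rate, which is the mechanism the abstract's "warm start" refers to.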