Private Stochastic Optimization in the Presence of Outliers: Optimal Rates for (Non-Smooth) Convex Losses and Extension to Non-Convex Losses. (arXiv:2209.07403v2 [cs.LG] UPDATED)
Oct. 19, 2022, 2:20 a.m. | Andrew Lowy, Meisam Razaviyayn
cs.CR updates on arXiv.org arxiv.org
We study differentially private (DP) stochastic optimization (SO) with data
containing outliers and loss functions that are (possibly) not Lipschitz
continuous. To date, the vast majority of work on DP SO assumes that the loss
is uniformly Lipschitz over data (i.e. stochastic gradients are uniformly
bounded over all data points). While this assumption is convenient, it is often
unrealistic: in many practical problems, the loss function may not be uniformly
Lipschitz. Even when the loss function is Lipschitz continuous, the …
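To make the bounded-gradient assumption concrete: standard DP-SGD enforces a gradient bound artificially by clipping each per-sample gradient before adding Gaussian noise, even when the loss itself is not Lipschitz. The sketch below is a minimal illustration of that general technique, not the algorithm proposed in this paper; the function name and parameters are illustrative.

```python
import numpy as np

def dp_sgd_step(w, per_sample_grads, clip_norm, noise_mult, lr, rng):
    """One DP-SGD step: clip each per-sample gradient to `clip_norm`,
    average, add Gaussian noise scaled by `noise_mult`, and update `w`."""
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Rescale so every per-sample gradient has norm <= clip_norm,
        # which bounds each sample's influence even for non-Lipschitz losses.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Gaussian mechanism: noise standard deviation is proportional to the
    # clipped sensitivity (clip_norm / batch size) times the noise multiplier.
    noise = rng.normal(0.0, noise_mult * clip_norm / len(per_sample_grads),
                       size=avg.shape)
    return w - lr * (avg + noise)
```

With `noise_mult = 0` the step reduces to ordinary SGD on clipped gradients, which makes the clipping effect easy to inspect: an outlier gradient of norm 10 contributes no more than a gradient of norm 1 when `clip_norm = 1`.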
Jobs in InfoSec / Cybersecurity
SOC 2 Manager, Audit and Certification
@ Deloitte | US and CA Multiple Locations
Security Engineering Professional
@ Nokia | India
Cyber Intelligence Exercise Planner
@ Peraton | Fort Gordon, GA, United States
Technical Lead, HR Systems Security
@ Sun Life | Sun Life Wellesley
SecOps Manager
@ WTW | Thane, Maharashtra, India
Digital Marketing Tenders Consultant (Consultant Appels d'Offres Marketing Digital)
@ Numberly | Paris, France