Private Stochastic Optimization in the Presence of Outliers: Optimal Rates for (Non-Smooth) Convex Losses and Extension to Non-Convex Losses. (arXiv:2209.07403v1 [cs.LG])
Sept. 16, 2022, 1:20 a.m. | Andrew Lowy, Meisam Razaviyayn
cs.CR updates on arXiv.org arxiv.org
We study differentially private (DP) stochastic optimization (SO) with data
containing outliers and loss functions that are not Lipschitz continuous. To
date, the vast majority of work on DP SO assumes that the loss is Lipschitz
(i.e., stochastic gradients are uniformly bounded), and the resulting error
bounds scale with the Lipschitz parameter of the loss. While this assumption is convenient,
it is often unrealistic: in many practical problems where privacy is required,
data may contain outliers or be unbounded, causing some …
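The bounded-gradient assumption the abstract describes is often enforced artificially in practice by per-sample gradient clipping before noise addition, as in standard DP-SGD. A minimal illustrative sketch of that idea on a squared loss (the loss, clip threshold, noise scale, and step size here are assumptions for illustration, not the paper's algorithm or its privacy analysis):

```python
import numpy as np

def dp_sgd(X, y, clip=1.0, noise_mult=1.0, lr=0.05, epochs=5, seed=0):
    """Sketch of DP-SGD with per-sample gradient clipping.

    Clipping each per-sample gradient to norm `clip` forces bounded
    sensitivity even when the loss is not Lipschitz; Gaussian noise
    calibrated to `clip` is then added (Gaussian mechanism). All
    hyperparameters are illustrative placeholders.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Per-sample gradient of the squared loss (x_i @ w - y_i)^2 / 2.
            g = (X[i] @ w - y[i]) * X[i]
            norm = np.linalg.norm(g)
            if norm > clip:
                g = g * (clip / norm)  # enforce ||g|| <= clip
            # Gaussian noise scaled to the clip threshold.
            g = g + rng.normal(0.0, noise_mult * clip, size=d)
            w = w - lr * g
    return w
```

Without the clipping step, a single outlier with a huge gradient would force the noise scale (and hence the error) to grow with the worst-case gradient norm, which is the failure mode motivating the abstract's non-Lipschitz setting.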
Jobs in InfoSec / Cybersecurity
SOC 2 Manager, Audit and Certification
@ Deloitte | US and CA Multiple Locations
Security Architect - Hardware
@ Intel | IND - Bengaluru
Elastic Consultant
@ Elastic | Spain
OT Cybersecurity Specialist
@ Emerson | Abu Dhabi, United Arab Emirates
Security Operations Program Manager
@ Kaseya | Miami, Florida, United States
Senior Security Operations Engineer
@ Revinate | Vancouver