Can Stochastic Gradient Langevin Dynamics Provide Differential Privacy for Deep Learning? (arXiv:2110.05057v3 [cs.LG], updated)
Feb. 3, 2022, 2:20 a.m. | Guy Heller, Ethan Fetaya
cs.CR updates on arXiv.org
Bayesian learning via Stochastic Gradient Langevin Dynamics (SGLD) has been
suggested for differentially private learning. While previous research provides
differential privacy bounds for SGLD at the initial steps of the algorithm or
when close to convergence, the question of what differential privacy guarantees
can be made in between remains unanswered. This interim region is of great
importance, especially for Bayesian neural networks, as it is hard to guarantee
convergence to the posterior. This paper shows that using SGLD might result …
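The SGLD update the abstract refers to can be sketched as follows. This is a minimal illustration of the generic SGLD step (gradient step on the log-posterior plus Gaussian noise scaled to the step size), not the paper's specific construction or privacy analysis; `grad_log_post` is a placeholder for a (possibly minibatch) stochastic gradient of the log-posterior.

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    """One SGLD update: half a gradient step on the log-posterior
    plus Gaussian noise with variance equal to the step size."""
    noise = rng.normal(0.0, np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * grad_log_post(theta) + noise

# Example: sampling from a standard normal posterior,
# where the gradient of the log-density is simply -theta.
rng = np.random.default_rng(0)
theta = np.zeros(3)
for _ in range(1000):
    theta = sgld_step(theta, lambda t: -t, 0.01, rng)
```

The injected Gaussian noise is what makes the iterates a (biased) MCMC sampler rather than plain SGD, and it is also the source of the privacy guarantees discussed in the abstract: the noise partially masks any individual example's contribution to the gradient.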