Federated Unlearning: How to Efficiently Erase a Client in FL? (arXiv:2207.05521v1 [cs.LG])
July 13, 2022, 1:20 a.m. | Anisa Halimi, Swanand Kadhe, Ambrish Rawat, Nathalie Baracaldo
cs.CR updates on arXiv.org arxiv.org
With privacy legislation empowering users with the right to be forgotten, it
has become essential to make a model forget about some of its training data. We
explore the problem of removing any client's contribution in federated learning
(FL). During FL rounds, each client performs local training to learn a model
that minimizes the empirical loss on their private data. We propose to perform
unlearning at the client (to be erased) by reversing the learning process,
i.e., training a model …
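The abstract describes unlearning a client by reversing its local training, i.e., running gradient ascent on that client's empirical loss instead of descent. A minimal toy sketch of that idea, using a 1-D linear model with hand-computed gradients; the norm cap on the unlearned weight is an assumption standing in for whatever bounding the paper actually uses, and all names here are illustrative, not the authors' code:

```python
# Toy sketch of client-level unlearning by "reversing the learning process":
# the erased client runs gradient ASCENT on its own local loss.
# Model: scalar w, loss 0.5 * mean((w*x - y)^2) over the client's data.

def grad(w, data):
    # gradient of 0.5 * mean((w*x - y)^2) with respect to w
    return sum((w * x - y) * x for x, y in data) / len(data)

def local_train(w, data, lr=0.1, steps=30):
    # standard FL local step: minimize the empirical loss
    for _ in range(steps):
        w -= lr * grad(w, data)
    return w

def unlearn(w, data, lr=0.1, steps=50, cap=5.0):
    # reversed training: maximize the local loss; the cap keeps the
    # unlearned model bounded (an assumption for this sketch)
    for _ in range(steps):
        w += lr * grad(w, data)
        w = max(-cap, min(cap, w))
    return w

client_data = [(1.0, 2.0), (2.0, 4.0)]   # data consistent with w = 2
w_trained = local_train(0.0, client_data)  # converges near w = 2
w_unlearned = unlearn(w_trained, client_data)
```

After unlearning, the model's loss on the erased client's data is large again, which is the intended effect: the client's contribution has been driven out of the model rather than merely masked.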
More from arxiv.org / cs.CR updates on arXiv.org
Jobs in InfoSec / Cybersecurity
Security Engineer
@ Celonis | Munich, Germany
Security Engineer, Cloud Threat Intelligence
@ Google | Reston, VA, USA; Kirkland, WA, USA
IT Security Analyst
@ EDAG Group | Fulda, Hessen, DE, 36037
Scrum Master / Agile Project Manager for Information Security (Temporary)
@ Guidehouse | Lagunilla de Heredia
Waste Incident Responder (Tanker Driver)
@ Severn Trent | Derby, England, GB
Risk Vulnerability Analyst w/Clearance - Colorado
@ Rothe | Colorado Springs, CO, United States