July 13, 2022, 1:20 a.m. | Anisa Halimi, Swanand Kadhe, Ambrish Rawat, Nathalie Baracaldo

cs.CR updates on arXiv.org arxiv.org

With privacy legislation empowering users with the right to be forgotten, it
has become essential to make a model forget about some of its training data. We
explore the problem of removing any client's contribution in federated learning
(FL). During FL rounds, each client performs local training to learn a model
that minimizes the empirical loss on their private data. We propose to perform
unlearning at the client (to be erased) by reversing the learning process,
i.e., training a model …
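For intuition, here is a minimal sketch of what "reversing the learning process" at the client to be erased could look like: gradient ascent on that client's local empirical loss, implemented as an ordinary local-training loop with the sign of the loss flipped. The function name, the cross-entropy objective, and the hyperparameters below are illustrative assumptions, not the paper's exact procedure (the abstract above is truncated).

```python
import torch
import torch.nn as nn

def local_unlearn(model: nn.Module, loader, lr: float = 0.01, epochs: int = 1):
    """Sketch: reverse local FL training by *maximizing* the client's empirical loss."""
    loss_fn = nn.CrossEntropyLoss()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            # Negate the loss so a standard optimizer step performs gradient ascent.
            loss = -loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return model
```

In practice such an ascent step is typically constrained or regularized so the model does not degenerate, but the specific safeguards used by the authors are not recoverable from the truncated abstract.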
