May 13, 2024, 4:11 a.m. | Jie Xu, Karthikeyan Saravanan, Rogier van Dalen, Haaris Mehmood, David Tuckey, Mete Ozay

cs.CR updates on arXiv.org

arXiv:2405.06368v1 Announce Type: cross
Abstract: Federated learning (FL) allows clients in an Internet of Things (IoT) system to collaboratively train a global model without sharing their local data with a server. However, clients' contributions to the server can still leak sensitive information. Differential privacy (DP) addresses such leakage by providing formal privacy guarantees, with mechanisms that add randomness to the clients' contributions. This randomness, however, makes it infeasible to train large transformer-based models, which are common in modern IoT systems. In this work, …
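The DP mechanism the abstract refers to can be sketched as follows: each client clips its model update to bound its sensitivity, then adds calibrated Gaussian noise before sending the update to the server. This is a minimal illustration of the general technique (DP-SGD-style privatization), not the paper's specific method; the function and parameter names (`privatize_update`, `clip_norm`, `noise_multiplier`) are illustrative assumptions.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a client's model update and add Gaussian noise.

    Illustrative sketch of the generic DP mechanism described in the
    abstract; not the method proposed in the paper itself.
    """
    rng = rng or np.random.default_rng()
    # Clip to bound each client's contribution (its L2 sensitivity).
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    # Add Gaussian noise scaled to the clipping bound; larger
    # noise_multiplier means stronger privacy, noisier updates.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise
```

For large transformer models, `update` has millions of coordinates, so the injected noise overwhelms the signal at practical privacy budgets, which is the obstacle the paper sets out to address.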

