Feb. 23, 2023, 2:10 a.m. | Antonious M. Girgis, Suhas Diggavi

cs.CR updates on arXiv.org arxiv.org

We study differentially private distributed optimization under communication
constraints. A server running SGD aggregates the clients' local gradients for
model updates using distributed mean estimation (DME). We develop a
communication-efficient private DME scheme based on the recently developed
multi-message shuffled (MMS) privacy framework. We analyze the proposed scheme
and show that it achieves the order-optimal
privacy-communication-performance tradeoff, resolving an open question posed
in [1]: whether shuffled models can improve the tradeoff obtained by Secure
Aggregation. This also resolves an …
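To make the DME setting concrete, here is a minimal sketch of the generic pipeline the abstract describes: each client clips its gradient, adds local noise, and quantizes it into a short message; the server decodes the (shuffled) messages and averages them. This is an illustrative toy, not the paper's MMS scheme — the function names, noise mechanism, and quantizer below are all assumptions for illustration.

```python
import numpy as np

def client_message(grad, clip=1.0, noise_std=0.5, levels=16, rng=None):
    """One client's report: clip the gradient, add local privacy noise,
    then quantize to a few bits per coordinate (hypothetical parameters)."""
    rng = rng or np.random.default_rng()
    g = grad * min(1.0, clip / (np.linalg.norm(grad) + 1e-12))  # norm clipping
    g = g + rng.normal(0.0, noise_std, size=g.shape)            # local noise
    # Uniform quantization over a range wide enough for clipped value + noise.
    lo, hi = -(clip + 3 * noise_std), clip + 3 * noise_std
    q = np.clip(np.round((g - lo) / (hi - lo) * (levels - 1)), 0, levels - 1)
    return q.astype(np.int8), (lo, hi, levels)

def server_mean(messages, codec):
    """Server decodes and averages the client messages; with a shuffler in
    between, the server sees the multiset of messages without client IDs."""
    lo, hi, levels = codec
    decoded = [lo + m / (levels - 1) * (hi - lo) for m in messages]
    return np.mean(decoded, axis=0)
```

Averaging over many clients concentrates around the true mean gradient, while each individual message is both low-bandwidth (here 4 bits per coordinate) and noised; the paper's contribution is showing how to make this privacy-communication-accuracy tradeoff order-optimal in the MMS model.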

