Multi-Message Shuffled Privacy in Federated Learning. (arXiv:2302.11152v1 [cs.LG])
cs.CR updates on arXiv.org
We study differentially private distributed optimization under communication
constraints. A server using SGD for optimization aggregates the client-side
local gradients for model updates using distributed mean estimation (DME). We
develop a communication-efficient private DME, using the recently developed
multi-message shuffled (MMS) privacy framework. We analyze our proposed DME
scheme to show that it achieves the order-optimal
privacy-communication-performance tradeoff, resolving an open question posed in
[1]: whether shuffled models can improve on the tradeoff obtained with Secure
Aggregation. This also resolves an …
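To make the setting concrete, the following is a minimal, hypothetical sketch of the multi-message shuffle pipeline for distributed mean estimation: each client emits several noised copies of its local gradient, a trusted shuffler permutes all messages to break the client-message linkage, and the server averages the shuffled messages. The Gaussian noise and the function names here are illustrative assumptions, not the paper's actual communication-efficient mechanism.

```python
import random

def client_messages(grad, k=3, sigma=0.1):
    # Multi-message: each client emits k independently noised copies of its
    # local gradient. Plain Gaussian noise is used here only for illustration;
    # the MMS mechanism in the paper uses discrete, bandwidth-limited encodings.
    return [[g + random.gauss(0.0, sigma) for g in grad] for _ in range(k)]

def shuffle_and_estimate(all_msgs, n_clients, k):
    # The shuffler uniformly permutes the pooled messages, hiding which
    # message came from which client (the source of privacy amplification).
    random.shuffle(all_msgs)
    dim = len(all_msgs[0])
    # The server estimates the mean gradient by averaging all messages.
    return [sum(m[i] for m in all_msgs) / (n_clients * k) for i in range(dim)]

# Toy run: 5 clients, 2-dimensional local gradients.
random.seed(0)
grads = [[1.0, -1.0], [0.5, 0.0], [2.0, 1.0], [-0.5, 0.5], [1.0, 0.5]]
msgs = [m for g in grads for m in client_messages(g)]
est = shuffle_and_estimate(msgs, n_clients=5, k=3)
```

Averaging over all n·k shuffled messages keeps the estimator unbiased while the shuffling, rather than per-message noise alone, carries most of the privacy guarantee in the MMS framework.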