Gradient Sparsification for Efficient Wireless Federated Learning with Differential Privacy. (arXiv:2304.04164v1 [cs.DC])
cs.CR updates on arXiv.org
Federated learning (FL) enables distributed clients to collaboratively train
a machine learning model without sharing raw data with each other. However, it
suffers from the leakage of private information through the models clients upload.
In addition, as the model size grows, training latency increases due to limited
transmission bandwidth, and model performance degrades when differential
privacy (DP) protection is applied. In this paper, we propose a gradient
sparsification empowered FL framework over wireless channels, in order to
improve training efficiency without sacrificing …
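The abstract combines two standard mechanisms: gradient sparsification (to cut uplink bandwidth) and DP noise (to protect the uploaded update). A minimal client-side sketch of that combination, assuming top-k magnitude sparsification and the Gaussian mechanism (all names and parameters here are illustrative, not the paper's actual algorithm):

```python
import numpy as np

def sparsify_top_k(grad, k):
    """Keep only the k largest-magnitude entries; zero out the rest.

    Transmitting (index, value) pairs for k entries instead of the full
    vector is what reduces uplink bandwidth.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

def gaussian_dp_update(grad, clip_norm, sigma, rng):
    """Clip the gradient to L2-norm clip_norm, then add Gaussian noise.

    Clipping bounds each client's sensitivity; the noise scale
    sigma * clip_norm follows the usual Gaussian-mechanism recipe.
    """
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, sigma * clip_norm, size=grad.shape)

rng = np.random.default_rng(0)
g = rng.normal(size=100)                    # a client's local gradient
g_sparse = sparsify_top_k(g, k=10)          # 90% of entries zeroed
g_private = gaussian_dp_update(g_sparse, clip_norm=1.0, sigma=0.5, rng=rng)
```

The tension the paper studies is visible even in this sketch: a larger k transmits more information per round (faster convergence, more bandwidth), while a larger sigma strengthens privacy but degrades the aggregated model.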