Privacy Amplification via Compression: Achieving the Optimal Privacy-Accuracy-Communication Trade-off in Distributed Mean Estimation. (arXiv:2304.01541v1 [stat.ML])
cs.CR updates on arXiv.org
Privacy and communication constraints are two major bottlenecks in federated learning (FL) and analytics (FA). We study the optimal accuracy of mean and frequency estimation (canonical models for FL and FA respectively) under joint communication and $(\varepsilon, \delta)$-differential privacy (DP) constraints. We show that in order to achieve the optimal error under $(\varepsilon, \delta)$-DP, it is sufficient for each client to send $\Theta\left(n \min\left(\varepsilon, \varepsilon^2\right)\right)$ bits for FL and $\Theta\left(\log\left(n \min\left(\varepsilon, \varepsilon^2\right)\right)\right)$ bits for FA to the server, …
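To make these asymptotic bit budgets concrete, here is a minimal sketch (not taken from the paper) that simply evaluates the two $\Theta$-expressions from the abstract for a few sample values of $n$ and $\varepsilon$. The function names, the choice of base-2 logarithm, and the sample values are illustrative assumptions; hidden constants and any $\delta$-dependent factors are ignored.

```python
import math

def fl_bits(n: int, eps: float) -> float:
    """Order-of-magnitude per-client bits for FL (mean estimation):
    Theta(n * min(eps, eps^2)), constants ignored."""
    return n * min(eps, eps * eps)

def fa_bits(n: int, eps: float) -> float:
    """Order-of-magnitude per-client bits for FA (frequency estimation):
    Theta(log(n * min(eps, eps^2))), constants ignored.
    Base-2 log and the floor at 2 are illustrative choices."""
    return math.log2(max(n * min(eps, eps * eps), 2.0))

if __name__ == "__main__":
    # Hypothetical (n, eps) settings, chosen only to show the scaling.
    for n, eps in [(10_000, 0.5), (10_000, 4.0), (1_000_000, 1.0)]:
        print(f"n={n:>9}, eps={eps:>4}: "
              f"FL ~ {fl_bits(n, eps):,.0f} bits, "
              f"FA ~ {fa_bits(n, eps):,.1f} bits")
```

The qualitative takeaway already visible in the formulas is that the per-client budget grows linearly in $n \min(\varepsilon, \varepsilon^2)$ for FL but only logarithmically in the same quantity for FA.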