Amplitude-Varying Perturbation for Balancing Privacy and Utility in Federated Learning. (arXiv:2303.04274v1 [cs.LG])
cs.CR updates on arXiv.org
While preserving the privacy of federated learning (FL), differential privacy
(DP) inevitably degrades FL's utility (i.e., accuracy), because the DP noise
added to model updates perturbs the model. Existing studies have considered
only noise with a constant root-mean-square amplitude, overlooking the
opportunity to adjust the amplitude to alleviate the noise's adverse effects.
This paper presents a new DP perturbation mechanism with a time-varying noise
amplitude that protects the privacy of FL while retaining the
capability …
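The idea of varying the noise amplitude over training rounds can be illustrated with a minimal sketch. This is not the paper's mechanism (the abstract is truncated and gives no formula); it assumes a standard Gaussian DP perturbation of clipped model updates, with a hypothetical linearly decaying noise schedule `sigma_schedule` so that later, more refined updates receive less perturbation:

```python
import numpy as np

def perturb_update(update, sigma, clip_norm=1.0, rng=None):
    """Clip a model update to clip_norm and add Gaussian noise of std sigma."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, sigma, size=update.shape)

def sigma_schedule(t, total_rounds, sigma0=1.0, sigma_min=0.1):
    """Hypothetical schedule: noise amplitude decays linearly
    from sigma0 at round 0 to sigma_min at the last round."""
    frac = t / max(1, total_rounds - 1)
    return sigma0 + (sigma_min - sigma0) * frac

# Example: perturb the same update over three rounds with decaying noise.
update = np.ones(4)
for t in range(3):
    noisy = perturb_update(update, sigma_schedule(t, 3))
```

Any actual amplitude-varying mechanism would have to choose the schedule so the cumulative privacy loss over all rounds still meets the target DP budget; the schedule above is purely illustrative.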