Does Federated Learning Really Need Backpropagation? (arXiv:2301.12195v2 [cs.LG] UPDATED)
cs.CR updates on arXiv.org
Federated learning (FL) is a general principle for decentralized clients to
train a server model collectively without sharing local data. FL is a promising
framework with practical applications, but its standard training paradigm
requires the clients to backpropagate through the model to compute gradients.
Since these clients are typically edge devices and not fully trusted, executing
backpropagation on them incurs computational and storage overhead and exposes
the full model to white-box attacks. In light of this, we develop backpropagation-free
federated learning, dubbed BAFFLE, …
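
The truncated abstract stops before describing how BAFFLE actually computes updates without backpropagation. A common way to trade gradient computation for inference-only passes is zeroth-order (finite-difference) gradient estimation, sketched below under that assumption; the quadratic loss, the data, and every function name here are illustrative stand-ins, not the paper's actual algorithm.

```python
import numpy as np

def loss(theta, X, y):
    # Illustrative local loss: least squares on the client's private data.
    return np.mean((X @ theta - y) ** 2)

def zeroth_order_grad(theta, X, y, num_dirs=32, eps=1e-3, rng=None):
    """Estimate the gradient of `loss` using only forward evaluations.

    Each random direction u contributes a central finite difference
    (loss(theta + eps*u) - loss(theta - eps*u)) / (2*eps) * u, so the
    client never runs a backward pass -- only inference.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    grad = np.zeros_like(theta)
    for _ in range(num_dirs):
        u = rng.standard_normal(theta.shape)
        delta = loss(theta + eps * u, X, y) - loss(theta - eps * u, X, y)
        grad += (delta / (2 * eps)) * u
    return grad / num_dirs

# Toy client update loop: backpropagation-free local training on one client.
rng = np.random.default_rng(42)
X = rng.standard_normal((64, 10))   # private client features
true_theta = rng.standard_normal(10)
y = X @ true_theta                  # private client labels
theta = np.zeros(10)                # model parameters received from the server
for _ in range(200):
    theta -= 0.05 * zeroth_order_grad(theta, X, y, rng=rng)
print("recovery error:", np.linalg.norm(theta - true_theta))
```

Because the client only runs forward evaluations, it never needs to store intermediate activations for a backward pass, which is the computational and storage overhead the abstract refers to.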