May 29, 2023, 1:10 a.m. | Haozhe Feng, Tianyu Pang, Chao Du, Wei Chen, Shuicheng Yan, Min Lin

cs.CR updates on arXiv.org

Federated learning (FL) is a general principle for decentralized clients to
train a server model collectively without sharing local data. FL is a promising
framework with practical applications, but its standard training paradigm
requires the clients to backpropagate through the model to compute gradients.
Since these clients are typically edge devices and not fully trusted, executing
backpropagation on them incurs computational and storage overhead and exposes
the model to white-box vulnerability. In light of this, we develop
backpropagation-free federated learning, dubbed BAFFLE, …
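The abstract contrasts standard backpropagation-based training with a forward-pass-only alternative. As a rough illustration of the general idea behind backpropagation-free training, the sketch below estimates a gradient from loss evaluations alone via zeroth-order (finite-difference) estimation; this is a generic technique, not necessarily the exact scheme BAFFLE uses, and the function name `zeroth_order_grad` is illustrative.

```python
import numpy as np

def zeroth_order_grad(loss_fn, w, num_perturbations=500, sigma=1e-3, rng=None):
    """Estimate grad of loss_fn at w using only forward evaluations
    (central differences of the loss), i.e. no backpropagation."""
    rng = np.random.default_rng(0) if rng is None else rng
    grad = np.zeros_like(w)
    for _ in range(num_perturbations):
        u = rng.standard_normal(w.shape)  # random perturbation direction
        # directional derivative of the loss along u, from two forward passes
        delta = (loss_fn(w + sigma * u) - loss_fn(w - sigma * u)) / (2 * sigma)
        grad += delta * u
    return grad / num_perturbations

# toy usage: for loss(w) = sum(w^2), the true gradient is 2*w
loss = lambda w: float(np.sum(w ** 2))
w = np.array([1.0, -2.0, 0.5])
g = zeroth_order_grad(loss, w)
```

Because each estimate needs only forward passes, a client never has to store activations or run a backward graph, which is the source of the overhead and white-box exposure the abstract mentions.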

