Oct. 7, 2022, 1:20 a.m. | Jiawei Shao, Yuchang Sun, Songze Li, Jun Zhang

cs.CR updates on arXiv.org

Federated learning (FL) strives to enable collaborative training of machine
learning models without centrally collecting clients' private data. Unlike
centralized training, the local datasets across clients in FL are
non-independent and identically distributed (non-IID). In addition,
data-owning clients may drop out of the training process arbitrarily. These
characteristics can significantly degrade training performance. This paper
proposes a Dropout-Resilient Secure Federated Learning (DReS-FL) framework
based on Lagrange coded computing (LCC) to tackle both the non-IID and dropout
problems. …
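Lagrange coded computing, the primitive the abstract names, encodes private data chunks (plus random masks for secrecy) as evaluations of a Lagrange interpolating polynomial over a finite field; any sufficiently large subset of evaluations reconstructs the polynomial, which is what gives dropout resilience. The sketch below illustrates that encode/decode mechanics only; the field size, evaluation points, and parameters are illustrative assumptions, not the paper's construction.

```python
import random

# Minimal sketch of LCC-style encoding/decoding (hypothetical parameters,
# not DReS-FL's actual implementation).
p = 2**13 - 1  # a small prime; arithmetic is over GF(p)

def lagrange_eval(xs, ys, target):
    """Evaluate at `target` the unique polynomial through (xs, ys) in GF(p)."""
    acc = 0
    for i, y in enumerate(ys):
        num, den = 1, 1
        for j, x in enumerate(xs):
            if j != i:
                num = num * (target - x) % p
                den = den * (xs[i] - x) % p
        acc = (acc + y * num * pow(den, -1, p)) % p  # pow(.,-1,p): mod inverse
    return acc

# K=2 data chunks plus T=1 random mask define a degree-2 polynomial f with
# f(beta_i) = chunk_i; client i receives the share f(alpha_i).
secrets = [123, 456]
masks = [random.randrange(p)]            # hides the secrets from any 1 share
betas = [1, 2, 3]                        # encoding points
alphas = [10, 11, 12, 13, 14]            # one evaluation point per client
shares = [lagrange_eval(betas, secrets + masks, a) for a in alphas]

# Any K+T = 3 surviving shares reconstruct f, so 2 of 5 clients may drop out.
survivors = [0, 2, 4]
xs = [alphas[i] for i in survivors]
ys = [shares[i] for i in survivors]
recovered = [lagrange_eval(xs, ys, b) for b in betas[:2]]
assert recovered == secrets
```

The same interpolation routine serves both encoding (evaluating f at the clients' points) and decoding (re-evaluating f at the data points from surviving shares), which is why a single helper suffices here.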

Tags: clients, data sharing, federated learning, non-IID, secret sharing
