Web: http://arxiv.org/abs/2201.01989

Jan. 7, 2022, 2:20 a.m. | Minghui Xu, Zongrui Zou, Ye Cheng, Qin Hu, Dongxiao Yu, Xiuzhen Cheng

cs.CR updates on arXiv.org arxiv.org

Decentralized learning involves training machine learning models over remote
mobile devices, edge servers, or cloud servers while keeping data localized.
Many studies have shown the feasibility of preserving privacy, enhancing
training performance, or introducing Byzantine resilience, but none of them
considers all three simultaneously. We therefore face the following
problem: \textit{how can we efficiently coordinate the decentralized learning
process while simultaneously maintaining learning security and data privacy?}
To address this issue, in this paper we propose SPDL, …
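The abstract does not describe SPDL's internal mechanism, but the "Byzantine resilience" it mentions is commonly achieved through robust aggregation of model updates. As a hedged illustration only (this is not the SPDL protocol, and the function name is our own), coordinate-wise median aggregation shows why robustness matters when some nodes misbehave:

```python
# Illustrative sketch, NOT the paper's SPDL method: aggregate model updates
# with a coordinate-wise median, which tolerates a minority of Byzantine
# (arbitrarily corrupted) updates far better than plain averaging.
from statistics import median

def coordinate_median(updates):
    """Aggregate updates (equal-length lists of floats) by taking the
    median of each coordinate across all contributing nodes."""
    return [median(coords) for coords in zip(*updates)]

# Three honest nodes report similar gradients; one Byzantine node sends junk.
honest = [[0.9, -1.1], [1.0, -1.0], [1.1, -0.9]]
byzantine = [[100.0, 100.0]]
agg = coordinate_median(honest + byzantine)
# The median keeps the aggregate close to the honest consensus, whereas a
# simple mean would be dragged toward the Byzantine outlier.
```

A plain mean of the same four updates would land near 25.75 in the first coordinate, while the median stays near 1.0, which is the intuition behind median-style robust aggregation rules in decentralized learning.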

Tags: blockchain, learning, privacy
