Jan. 7, 2022, 2:20 a.m. | Minghui Xu, Zongrui Zou, Ye Cheng, Qin Hu, Dongxiao Yu, Xiuzhen Cheng

cs.CR updates on arXiv.org arxiv.org

Decentralized learning involves training machine learning models over remote
mobile devices, edge servers, or cloud servers while keeping data localized.
Although many studies have shown the feasibility of preserving privacy, enhancing training performance, or introducing Byzantine resilience, none of them considers all three simultaneously. We therefore face the following problem: how can we efficiently coordinate the decentralized learning process while simultaneously maintaining learning security and data privacy? To address this issue, in this paper we propose SPDL, …
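To illustrate the Byzantine-resilience ingredient mentioned above (this is a generic sketch, not SPDL's actual protocol), a common building block is robust aggregation of model updates, e.g. a coordinate-wise median, which bounds the influence of fewer than half arbitrarily malicious workers:

```python
import numpy as np

def coordinate_wise_median(updates):
    """Aggregate worker updates by taking the median of each coordinate.

    With n workers, of which f are Byzantine, the coordinate-wise median
    tolerates f < n/2 arbitrary updates: the outliers cannot move the
    aggregate outside the range spanned by the honest values.
    """
    return np.median(np.stack(updates), axis=0)

# Honest workers report gradients near the true value [1.0, -2.0];
# one Byzantine worker submits an arbitrary adversarial update.
honest = [np.array([1.0, -2.0]), np.array([1.1, -1.9]), np.array([0.9, -2.1])]
byzantine = [np.array([1e6, -1e6])]

agg = coordinate_wise_median(honest + byzantine)
print(agg)  # stays close to the honest cluster despite the outlier
```

A plain averaging rule would instead be dragged arbitrarily far by a single malicious update, which is why robust aggregation rules of this kind appear throughout the Byzantine-resilient learning literature.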

