Split Learning without Local Weight Sharing to Enhance Client-side Data Privacy. (arXiv:2212.00250v1 [cs.CR])
Dec. 2, 2022, 2:10 a.m. | Ngoc Duy Pham, Tran Khoa Phan, Alsharif Abuadbba, Doan Nguyen, Naveen Chilamkurti
cs.CR updates on arXiv.org
Split learning (SL) aims to protect user data privacy by splitting deep
models between client and server and keeping private data local. SL has been
demonstrated to achieve accuracy similar to that of a centralized learning model. In
SL with multiple clients, the local training weights are shared between clients
for local model aggregation. This paper investigates the potential for data
leakage due to local weight sharing among the clients in SL by performing model
inversion attacks. To mitigate the identified leakage issue, we …
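At a high level, the client-server split described in the abstract can be sketched as follows. This is a minimal illustrative sketch only: the two-layer linear model, the class names, and all layer sizes are assumptions for demonstration, not the paper's actual architecture or protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

class Client:
    """Holds the private data and the first model segment locally."""
    def __init__(self, n_features, n_hidden):
        self.W1 = rng.standard_normal((n_features, n_hidden)) * 0.1
        # Raw private data never leaves the client.
        self.data = rng.standard_normal((4, n_features))

    def forward(self):
        # Only these cut-layer ("smashed") activations are sent to the
        # server, not the raw inputs.
        return self.data @ self.W1

class Server:
    """Holds the remaining model segment."""
    def __init__(self, n_hidden, n_out):
        self.W2 = rng.standard_normal((n_hidden, n_out)) * 0.1

    def forward(self, smashed):
        # The server completes the forward pass from the cut layer.
        return smashed @ self.W2

client = Client(n_features=8, n_hidden=5)
server = Server(n_hidden=5, n_out=2)

activations = client.forward()   # this is what crosses the wire
logits = server.forward(activations)
```

The leakage studied in the paper arises one step beyond this sketch: when multiple clients additionally exchange their local weights (here, each client's `W1`) for aggregation, a model inversion attack can target those shared weights.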
Jobs in InfoSec / Cybersecurity
SOC 2 Manager, Audit and Certification
@ Deloitte | US and CA Multiple Locations
Security Officer Hospital - Major Hospital Account - Full-Time - Healthcare Security
@ Allied Universal | Anaheim, CA, United States
Product Security Lead
@ Lely | Maassluis, Netherlands
Summer Associate, IT Information Security (Temporary)
@ Vir Biotechnology, Inc. | San Francisco, California, United States
Director, Governance, Risk and Compliance - Corporate
@ Ryan Specialty | Chicago, IL, US, 60606
Cybersecurity Governance, Risk, and Compliance Engineer
@ Emerson | Shakopee, MN, United States