Improving Privacy-Preserving Vertical Federated Learning by Efficient Communication with ADMM
April 9, 2024, 4:11 a.m. | Chulin Xie, Pin-Yu Chen, Qinbin Li, Arash Nourian, Ce Zhang, Bo Li
cs.CR updates on arXiv.org arxiv.org
Abstract: Federated learning (FL) enables distributed, resource-constrained devices to jointly train shared models while keeping the training data local for privacy. Vertical FL (VFL), in which each client holds a partial set of features, has attracted intensive research effort recently. We identify the main challenge facing existing VFL frameworks: the server must exchange gradients with the clients at every training step, incurring high communication cost and rapidly consuming the privacy budget. To …
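To make the communication bottleneck concrete, the following is a minimal sketch (not the paper's ADMM method) of one vertical-FL training loop for a linear model: each client holds a disjoint slice of the feature columns, the server aggregates the clients' partial scores, and the per-sample gradient is broadcast back to every client at every step. All variable names and the two-client split are illustrative assumptions.

```python
import numpy as np

# Hypothetical two-client vertical partition: client k holds only its
# own feature columns X_parts[k] and local weights w_parts[k].
rng = np.random.default_rng(0)
n_samples, dims = 8, [3, 2]
X_parts = [rng.normal(size=(n_samples, d)) for d in dims]
y = rng.integers(0, 2, size=n_samples).astype(float)
w_parts = [np.zeros(d) for d in dims]
lr = 0.1

for step in range(100):
    # Each client uploads only its partial score, never raw features.
    partial = [X @ w for X, w in zip(X_parts, w_parts)]
    logits = np.sum(partial, axis=0)          # server-side aggregation
    pred = 1.0 / (1.0 + np.exp(-logits))      # sigmoid prediction
    grad_logits = (pred - y) / n_samples      # server -> clients, EVERY step
    # Each client updates its local weights from the broadcast gradient.
    for k, X in enumerate(X_parts):
        w_parts[k] -= lr * (X.T @ grad_logits)

loss = -np.mean(y * np.log(pred + 1e-12) + (1 - y) * np.log(1 - pred + 1e-12))
print(round(loss, 4))
```

The per-step `grad_logits` broadcast is exactly the communication (and, under differential privacy, the per-step budget expenditure) that the abstract identifies as the cost to be reduced.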