Dec. 6, 2022, 2:10 a.m. | Samiul Alam, Luyang Liu, Ming Yan, Mi Zhang

cs.CR updates on arXiv.org arxiv.org

Most cross-device federated learning (FL) studies focus on the
model-homogeneous setting where the global server model and local client models
are identical. However, such a constraint not only excludes low-end clients
that could otherwise make unique contributions to model training, but also
prevents clients from training large models due to on-device resource
bottlenecks. In
this work, we propose FedRolex, a partial training (PT)-based approach that
enables model-heterogeneous FL and can train a global server model larger than
the largest client model. …
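As a rough illustration of the partial-training idea, a minimal sketch of sub-model extraction is shown below, assuming a rolling-window scheme in which each round a client receives a contiguous (wrap-around) slice of each server weight matrix sized to its capacity. The function name, the per-layer slicing, and the window-advance rule are all simplifying assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def extract_submodel(server_weights, client_capacity, round_idx):
    """Slice each server weight matrix with a rolling window
    (hypothetical simplification of partial-training extraction).

    client_capacity: fraction of each layer's units the client can hold.
    round_idx: current round; the window start advances with it so that,
    over rounds, every part of the server model gets trained.
    """
    sub = {}
    for name, w in server_weights.items():
        n = w.shape[0]
        k = max(1, int(n * client_capacity))   # units this client can hold
        start = round_idx % n                  # window rolls forward each round
        idx = [(start + i) % n for i in range(k)]  # wrap-around slice
        sub[name] = w[idx]
    return sub

# Toy server model: one layer with 8 output units.
server = {"layer1": np.arange(8, dtype=float).reshape(8, 1)}

# A client that can hold half the layer, at round 3,
# receives units 3, 4, 5, 6 of layer1.
sub = extract_submodel(server, 0.5, 3)
```

After local training, the server would aggregate each client's updated slice back into the corresponding rows of the global model; because the window position differs across rounds, all rows are eventually updated even by capacity-limited clients.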

federated learning
