Feb. 5, 2024, 8:10 p.m. | Chanho Park, Namyoon Lee

cs.CR updates on arXiv.org

Distributed learning is an effective approach to accelerating model training with multiple workers. However, substantial communication delays arise between the workers and the parameter server because of the massive cost of communicating gradients. SignSGD with majority voting (signSGD-MV) is a simple yet effective optimizer that reduces this communication cost through one-bit quantization, but its convergence rate degrades considerably as the number of adversarial workers grows. In this paper, we show that the convergence rate is invariant to the number of adversarial workers, provided that …
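To make the mechanism concrete, below is a minimal NumPy sketch of a single signSGD-MV update under a sign-flipping attack. The function name signsgd_mv_step, the attack model, and all parameter names are illustrative assumptions, not the paper's implementation.

import numpy as np

def signsgd_mv_step(params, worker_grads, lr, adversarial_ids=()):
    """One signSGD-MV update (illustrative sketch): each worker sends
    sign(gradient), i.e. one bit per coordinate; adversarial workers
    flip their signs; the server applies an element-wise majority vote."""
    votes = []
    for i, g in enumerate(worker_grads):
        s = np.sign(g)            # one-bit quantization of the local gradient
        if i in adversarial_ids:
            s = -s                # assumed sign-flipping adversary
        votes.append(s)
    # Majority vote per coordinate (ties yield 0, i.e. no update there)
    majority = np.sign(np.sum(votes, axis=0))
    return params - lr * majority

# Toy usage: 7 workers with noisy quadratic gradients, 2 of them adversarial
rng = np.random.default_rng(0)
x = np.array([5.0, -3.0])
for _ in range(100):
    grads = [2 * x + rng.normal(size=2) for _ in range(7)]
    x = signsgd_mv_step(x, grads, lr=0.05, adversarial_ids={5, 6})
print(x)  # should approach the origin despite the sign-flipping workers

The sketch shows the intuition behind the robustness claim: as long as the honest workers form a majority whose signs agree, the element-wise vote recovers the correct update direction.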

accelerate, adversarial, adversarial attacks, attacks, communication, cs.cr, cs.lg, decoding, defense, distributed, eess.sp, federated, model training, parameter server, sign, simple, training, voting, workers
