April 28, 2022, 1:20 a.m. | Zhenhuan Yang, Shu Hu, Yunwen Lei, Kush R. Varshney, Siwei Lyu, Yiming Ying

cs.CR updates on arXiv.org arxiv.org

Stochastic gradient descent ascent (SGDA) and its variants have been the
workhorse for solving minimax problems. However, in contrast to the
well-studied stochastic gradient descent (SGD) with differential privacy (DP)
constraints, there is little work on understanding the generalization (utility)
of SGDA with DP constraints. In this paper, we use the algorithmic stability
approach to establish the generalization (utility) of DP-SGDA in different
settings. In particular, for the convex-concave setting, we prove that
DP-SGDA can achieve an optimal utility …
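To make the setting concrete, here is a minimal sketch of a generic DP-SGDA update on a toy convex-concave problem: clip the stochastic gradients, add Gaussian noise for privacy, then descend on the min variable and ascend on the max variable. The clipping threshold, noise scale, step sizes, and the toy objective are illustrative assumptions, not the exact algorithm or parameters analyzed in the paper.

```python
import numpy as np

def clip(g, clip_norm):
    """Rescale gradient g so its l2 norm is at most clip_norm."""
    norm = np.linalg.norm(g)
    return g * min(1.0, clip_norm / (norm + 1e-12))

def dp_sgda(grad_w, grad_v, w0, v0, lr_w=0.05, lr_v=0.05,
            clip_norm=1.0, noise_std=0.1, steps=1000, seed=0):
    """Generic DP-SGDA loop (illustrative only): at each step, clip the
    stochastic gradients, add Gaussian noise for differential privacy,
    then take a descent step in w (the min variable) and an ascent step
    in v (the max variable)."""
    rng = np.random.default_rng(seed)
    w, v = np.array(w0, dtype=float), np.array(v0, dtype=float)
    for _ in range(steps):
        gw = clip(grad_w(w, v, rng), clip_norm) + noise_std * rng.standard_normal(w.shape)
        gv = clip(grad_v(w, v, rng), clip_norm) + noise_std * rng.standard_normal(v.shape)
        w = w - lr_w * gw   # gradient descent on the min player
        v = v + lr_v * gv   # gradient ascent on the max player
    return w, v

# Toy convex-concave objective f(w, v) = 0.5*||w||^2 + w.v - 0.5*||v||^2,
# whose saddle point is (0, 0); the "stochastic" gradients add sampling noise.
gw = lambda w, v, rng: w + v + 0.01 * rng.standard_normal(w.shape)
gv = lambda w, v, rng: w - v + 0.01 * rng.standard_normal(v.shape)
w_star, v_star = dp_sgda(gw, gv, w0=[1.0, -1.0], v0=[0.5, 0.5])
print(w_star, v_star)  # both should end up near the saddle point (0, 0)
```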

