Feb. 5, 2024, 8:10 p.m. | Robert A. Bridges, Vandy J. Tombs, Christopher B. Stanley

cs.CR updates on arXiv.org arxiv.org

The state of the art and de facto standard for differentially private machine learning (ML) is differentially private stochastic gradient descent (DPSGD). Yet, the method is inherently wasteful: by adding noise to every gradient, it diminishes the overall privacy guarantee with every gradient step. Despite 15 years of fruitful research advancing composition theorems, sub-sampling methods, and implementation techniques, adequate accuracy and privacy are often unattainable with current private ML methods. Meanwhile, the Exponential Mechanism (ExpM), designed for private optimization, has …
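For context, the DPSGD mechanism the abstract critiques can be sketched in a few lines: each per-example gradient is clipped to a norm bound, the clipped gradients are averaged, and calibrated Gaussian noise is added before the update. This is a minimal NumPy illustration of that idea, not the paper's implementation; the function name and hyperparameters are illustrative.

```python
import numpy as np

def dpsgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
               noise_mult=1.1, rng=None):
    """One illustrative DP-SGD step: clip, average, add Gaussian noise.

    per_example_grads: list of gradient arrays, one per example.
    noise_mult: noise multiplier (sigma); noise std = noise_mult * clip_norm / n.
    """
    rng = rng or np.random.default_rng(0)
    # Clip each per-example gradient to L2 norm <= clip_norm.
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    mean_grad = np.mean(clipped, axis=0)
    # Gaussian noise calibrated to the clipping bound -- this is the step
    # that spends privacy budget on every iteration.
    noise = rng.normal(0.0, noise_mult * clip_norm / len(per_example_grads),
                       size=np.asarray(params).shape)
    return params - lr * (mean_grad + noise)
```

Because noise is injected at every step, the cumulative privacy loss grows with the number of iterations, which is the waste the abstract refers to.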
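The Exponential Mechanism mentioned above selects an output with probability proportional to exp(epsilon * utility / (2 * sensitivity)), so it privatizes a single selection rather than every gradient. A minimal discrete sketch, assuming a finite candidate set and a user-supplied utility function (names are illustrative, not the paper's API):

```python
import numpy as np

def exponential_mechanism(candidates, utility, epsilon, sensitivity, rng=None):
    """Sample a candidate with probability proportional to
    exp(epsilon * utility(c) / (2 * sensitivity))."""
    rng = rng or np.random.default_rng(0)
    scores = np.array([utility(c) for c in candidates], dtype=float)
    logits = epsilon * scores / (2.0 * sensitivity)
    logits -= logits.max()          # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]
```

For large epsilon the mechanism concentrates on the highest-utility candidate; for small epsilon it approaches uniform sampling, trading accuracy for privacy in a single draw.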

