Feb. 5, 2024, 8:10 p.m. | Robert A. Bridges Vandy J. Tombs Christopher B. Stanley

cs.CR updates on arXiv.org arxiv.org

The state of the art and de facto standard for differentially private machine learning (ML) is differentially private stochastic gradient descent (DPSGD). Yet the method is inherently wasteful: by adding noise to every gradient, it spends privacy budget at every gradient step. Despite 15 years of fruitful research advancing composition theorems, sub-sampling methods, and implementation techniques, adequate accuracy and privacy are often unattainable with current private ML methods. Meanwhile, the Exponential Mechanism (ExpM), designed for private optimization, has …
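To make the per-gradient noise cost concrete, here is a minimal sketch of one DP-SGD step for logistic regression: clip each per-example gradient, sum, and add Gaussian noise. All hyperparameter values (`clip`, `noise_mult`, the data) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dpsgd_step(w, X, y, lr=0.1, clip=1.0, noise_mult=1.1, rng=None):
    """One DP-SGD step for logistic regression (illustrative sketch).

    Per-example gradients are clipped to L2 norm `clip`, summed, and
    perturbed with Gaussian noise of scale noise_mult * clip. Every
    step consumes privacy budget, which is the waste the text notes.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(y)
    preds = 1.0 / (1.0 + np.exp(-X @ w))            # sigmoid predictions
    per_ex = (preds - y)[:, None] * X               # per-example gradients
    norms = np.linalg.norm(per_ex, axis=1, keepdims=True)
    per_ex *= np.minimum(1.0, clip / np.maximum(norms, 1e-12))  # clip
    noisy_sum = per_ex.sum(axis=0) + rng.normal(0.0, noise_mult * clip,
                                                size=w.shape)
    return w - lr * noisy_sum / n

# Toy run on synthetic data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = (X[:, 0] > 0).astype(float)
w = np.zeros(3)
for _ in range(50):
    w = dpsgd_step(w, X, y, rng=rng)
```

Each call draws fresh noise, so a full training run composes many noisy releases; the paper's motivation is that this accumulated cost often leaves no acceptable accuracy/privacy trade-off.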

