Individualized PATE: Differentially Private Machine Learning with Individual Privacy Guarantees. (arXiv:2202.10517v4 [cs.LG] UPDATED)
Nov. 9, 2022, 2:20 a.m. | Franziska Boenisch, Christopher Mühl, Roy Rinberg, Jannis Ihrig, Adam Dziedzic
cs.CR updates on arXiv.org arxiv.org
Applying machine learning (ML) to sensitive domains requires protecting the privacy of the underlying training data through formal frameworks such as differential privacy (DP). Yet the privacy of the training data usually comes at the cost of the resulting ML models' utility. One reason for this is that DP uses a single uniform privacy budget epsilon for all training data points, which must be set to the strictest privacy requirement encountered among all data holders. In practice, different data holders have …
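The cost of a uniform budget can be made concrete with the Laplace mechanism: the noise scale for a query is sensitivity/epsilon, so one holder with a strict (small) epsilon forces large noise onto everyone. A minimal sketch, with hypothetical per-holder budgets (the names, values, and counting-query setting are illustrative assumptions, not from the paper):

```python
import math
import random

def laplace_sample(scale):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# Hypothetical per-holder privacy requirements (smaller epsilon = stricter).
epsilons = {"alice": 0.5, "bob": 2.0, "carol": 8.0}

sensitivity = 1.0  # L1 sensitivity of a counting query

# Standard DP: one uniform budget, forced down to the strictest requirement.
uniform_eps = min(epsilons.values())
uniform_scale = sensitivity / uniform_eps  # noise level everyone pays for

# Individualized view: each holder's data could tolerate its own noise level.
per_holder_scale = {name: sensitivity / eps for name, eps in epsilons.items()}

print(uniform_scale)     # 2.0 -- dictated by alice's strict budget alone
print(per_holder_scale)  # bob and carol could accept far less noise
```

Here bob and carol could tolerate noise scales of 0.5 and 0.125, yet under a uniform budget all answers are perturbed at scale 2.0, which is the utility gap that individualized guarantees aim to close.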