Aug. 3, 2023, 1:10 a.m. | Adel Javanmard, Vahab Mirrokni, Jean Pouget-Abadie

cs.CR updates on arXiv.org arxiv.org

Estimating causal effects from randomized experiments is only feasible if
participants agree to reveal their potentially sensitive responses. Among the
many ways of ensuring privacy, label differential privacy is a widely used
measure of an algorithm's privacy guarantee, one that may encourage
participants to share responses without running the risk of de-anonymization.
Many differentially private mechanisms inject noise into the original dataset
to achieve this privacy guarantee, which increases the variance of most
statistical estimators and makes the precise measurement of …
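The variance cost described above can be illustrated with randomized response, a standard label-DP mechanism for binary outcomes (a minimal sketch, not necessarily the mechanism studied in the paper; the `epsilon` value and binary-label setup are illustrative assumptions):

```python
import math
import random

def randomized_response(label: int, epsilon: float) -> int:
    """Report the true binary label with probability e^eps / (e^eps + 1),
    otherwise report the flipped label. Satisfies epsilon-label-DP."""
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return label if random.random() < p_keep else 1 - label

def debias_mean(reported_mean: float, epsilon: float) -> float:
    """Invert the expected flipping to get an unbiased estimate of the
    true label mean. The 1/(2p-1) factor is what inflates the variance."""
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return (reported_mean - (1.0 - p_keep)) / (2.0 * p_keep - 1.0)

# Demo: the debiased estimate recovers the true mean on average,
# but the injected noise makes any single estimate noisier.
random.seed(0)
epsilon = 1.0
labels = [1] * 30_000 + [0] * 70_000      # true mean = 0.3
reported = [randomized_response(y, epsilon) for y in labels]
estimate = debias_mean(sum(reported) / len(reported), epsilon)
```

With `epsilon = 1.0`, each label is kept with probability roughly 0.73, and the debiasing step scales the sampling noise up by about 1/(2·0.73−1) ≈ 2.2, which is the variance-inflation effect the abstract refers to.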

