June 14, 2024, 4:19 a.m. | Ossi Räisä, Stratis Markou, Matthew Ashman, Wessel P. Bruinsma, Marlon Tobaben, Antti Honkela, Richard E. Turner

cs.CR updates on arXiv.org

arXiv:2406.08569v1 Announce Type: cross
Abstract: Many high-stakes applications require machine learning models that protect user privacy and provide well-calibrated, accurate predictions. While Differential Privacy (DP) is the gold standard for protecting user privacy, standard DP mechanisms typically impair performance significantly. One approach to mitigating this issue is to pre-train models on simulated data before DP learning on the private data. In this work, we go a step further, using simulated data to train a meta-learning model that combines the Convolutional Conditional …
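
The general pattern the abstract refers to, pre-training on simulated data without privacy constraints and then switching to differentially private optimization on the private data, can be sketched with plain DP-SGD (per-example gradient clipping plus Gaussian noise). This is a minimal illustration, not the paper's meta-learning method; the model, the data loaders, and the hyperparameters (clip_norm, noise_multiplier, learning rate) are assumptions for the example.

import torch
import torch.nn as nn

def pretrain(model, simulated_loader, epochs=1, lr=1e-3):
    # Standard (non-private) pre-training on simulated data.
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in simulated_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

def dp_finetune(model, private_loader, epochs=1, lr=1e-3,
                clip_norm=1.0, noise_multiplier=1.0):
    # DP-SGD fine-tuning on private data: clip each example's gradient to
    # clip_norm, sum the clipped gradients, add Gaussian noise with standard
    # deviation noise_multiplier * clip_norm, and take an averaged step.
    loss_fn = nn.CrossEntropyLoss(reduction="none")
    params = [p for p in model.parameters() if p.requires_grad]
    for _ in range(epochs):
        for x, y in private_loader:
            per_example_losses = loss_fn(model(x), y)
            summed = [torch.zeros_like(p) for p in params]
            for loss_i in per_example_losses:
                grads = torch.autograd.grad(loss_i, params, retain_graph=True)
                norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
                scale = torch.clamp(clip_norm / (norm + 1e-6), max=1.0)
                for s, g in zip(summed, grads):
                    s += g * scale
            with torch.no_grad():
                for p, s in zip(params, summed):
                    noise = torch.randn_like(s) * noise_multiplier * clip_norm
                    p -= lr * (s + noise) / x.shape[0]

A real deployment would also need Poisson subsampling and (epsilon, delta) accounting, which DP libraries such as Opacus provide; the sketch above only shows the mechanics of the pre-train-then-DP-fine-tune pipeline.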
