May 26, 2023, 1:18 a.m. | Zinan Lin, Sivakanth Gopi, Janardhan Kulkarni, Harsha Nori, Sergey Yekhanin

cs.CR updates on

Generating differentially private (DP) synthetic data that closely resembles
the original private data without leaking sensitive user information is a
scalable way to mitigate privacy concerns in the current data-driven world. In
contrast to current practices that train customized models for this task, we
aim to generate DP Synthetic Data via APIs (DPSDA), where we treat foundation
models as black boxes and only utilize their inference APIs. Such API-based,
training-free approaches are easier to deploy as exemplified by the recent
surge …
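To make the API-only idea concrete, here is a minimal, hedged sketch of what a DPSDA-style loop could look like. The `blackbox_generate` and `blackbox_vary` functions are hypothetical stand-ins for a foundation model's inference API, and the DP nearest-neighbor voting step is a simplified illustration, not the paper's exact algorithm or privacy accounting.

```python
import numpy as np

rng = np.random.default_rng(0)

def blackbox_generate(n, dim=8):
    # Hypothetical stand-in for a foundation model's generation API.
    return rng.normal(size=(n, dim))

def blackbox_vary(samples, scale=0.3):
    # Hypothetical stand-in for an API call that returns variations of samples.
    return samples + rng.normal(scale=scale, size=samples.shape)

def dp_nearest_neighbor_histogram(private_data, candidates, sigma):
    # Each private record votes for its nearest synthetic candidate;
    # Gaussian noise on the vote counts provides differential privacy
    # (privacy parameters omitted in this sketch).
    dists = np.linalg.norm(
        private_data[:, None, :] - candidates[None, :, :], axis=-1
    )
    votes = np.bincount(
        dists.argmin(axis=1), minlength=len(candidates)
    ).astype(float)
    votes += rng.normal(scale=sigma, size=votes.shape)
    return np.clip(votes, 0.0, None)

def dpsda_loop(private_data, n_synthetic=16, iterations=3, sigma=1.0):
    # Training-free loop: generate candidates via the API, score them with
    # a DP histogram, resample the popular ones, and ask the API to vary them.
    candidates = blackbox_generate(n_synthetic, private_data.shape[1])
    for _ in range(iterations):
        hist = dp_nearest_neighbor_histogram(private_data, candidates, sigma)
        total = hist.sum()
        probs = (hist / total if total > 0
                 else np.full(len(candidates), 1.0 / len(candidates)))
        idx = rng.choice(len(candidates), size=n_synthetic, p=probs)
        candidates = blackbox_vary(candidates[idx])
    return candidates

private = rng.normal(loc=2.0, size=(100, 8))
synthetic = dpsda_loop(private)
print(synthetic.shape)  # (16, 8)
```

Note that only the synthetic candidates ever touch the model API; the private data is used solely inside the noisy voting step, which is what makes this style of approach deployable without any model training.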

