A Game-Theoretic Analysis of Auditing Differentially Private Algorithms with Epistemically Disparate Herd
April 26, 2024, 4:10 a.m. | Ya-Ting Yang, Tao Zhang, Quanyan Zhu
cs.CR updates on arXiv.org arxiv.org
Abstract: Privacy-preserving AI algorithms are widely adopted in various domains, but the lack of transparency might pose accountability issues. While auditing algorithms can address this issue, machine-based audit approaches are often costly and time-consuming. Herd audit, on the other hand, offers an alternative solution by harnessing collective intelligence. Nevertheless, the presence of epistemic disparity among auditors, resulting in varying levels of expertise and access to knowledge, may impact audit performance. An effective herd audit will establish …