April 26, 2024, 4:10 a.m. | Ya-Ting Yang, Tao Zhang, Quanyan Zhu

cs.CR updates on arXiv.org

arXiv:2404.16195v1 Announce Type: new
Abstract: Privacy-preserving AI algorithms are widely adopted in various domains, but the lack of transparency might pose accountability issues. While auditing algorithms can address this issue, machine-based audit approaches are often costly and time-consuming. Herd audit, on the other hand, offers an alternative solution by harnessing collective intelligence. Nevertheless, the presence of epistemic disparity among auditors, resulting in varying levels of expertise and access to knowledge, may impact audit performance. An effective herd audit will establish …

