March 8, 2024, 5:11 a.m. | Keyi Ju, Xiaoqi Qin, Hui Zhong, Xinyue Zhang, Miao Pan, Baoling Liu

cs.CR updates on arXiv.org (arxiv.org)

arXiv:2312.11126v2 Announce Type: replace-cross
Abstract: Quantum computing is revolutionizing how complex problems are solved and vast datasets are handled, and it shows great potential to accelerate machine learning. However, data leakage in quantum machine learning (QML) may present privacy risks. Although differential privacy (DP), which protects privacy through the injection of artificial noise, is a well-established approach, its application in the QML domain remains under-explored. In this paper, we propose to harness inherent quantum noise to protect data privacy in …
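For reference, the "injection of artificial noise" that classical DP relies on is typically realized with a mechanism such as the Laplace mechanism. The minimal sketch below illustrates that standard classical construction only; it is not the paper's quantum-noise approach, and the function name, sensitivity, and epsilon values are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value with epsilon-DP by adding Laplace noise
    with scale = sensitivity / epsilon (standard classical DP, shown for context)."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Illustrative usage: privatize a count query (sensitivity 1) with epsilon = 0.5.
noisy_count = laplace_mechanism(true_value=42.0, sensitivity=1.0, epsilon=0.5)
print(noisy_count)
```

The paper's premise, as stated in the abstract, is that noise already inherent in quantum hardware might play the role of this artificially injected noise for QML workloads.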

