April 19, 2023, 1:10 a.m. | Sina Sajadmanesh, Daniel Gatica-Perez

cs.CR updates on arXiv.org arxiv.org

Graph Neural Networks (GNNs) have become a popular tool for learning on
graphs, but their widespread use raises privacy concerns as graph data can
contain personal or sensitive information. Differentially private GNN models
have been recently proposed to preserve privacy while still allowing for
effective learning over graph-structured datasets. However, achieving an ideal
balance between accuracy and privacy in GNNs remains challenging due to the
intrinsic structural connectivity of graphs. In this paper, we propose a new
differentially private GNN …
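The abstract does not detail the paper's mechanism, but the standard building block behind most differentially private GNNs is adding calibrated noise to neighborhood aggregation. Below is a minimal, generic sketch (not the authors' method) using NumPy: per-node features are L2-clipped to bound sensitivity, summed over neighbors, and perturbed with Gaussian noise scaled by the classic analytic bound for the Gaussian mechanism. The function names and the toy ring graph are illustrative assumptions.

```python
import numpy as np

def gaussian_sigma(sensitivity, epsilon, delta):
    # Classic analytic Gaussian-mechanism bound:
    # sigma >= sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

def dp_aggregate(features, adjacency, epsilon, delta, clip_norm=1.0):
    """Privately aggregate neighbor features: clip each row to L2 norm
    <= clip_norm (bounding sensitivity), sum over neighbors, then add
    calibrated Gaussian noise. Illustrative sketch only."""
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    clipped = features * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    agg = adjacency @ clipped  # neighborhood sum
    sigma = gaussian_sigma(clip_norm, epsilon, delta)
    return agg + np.random.normal(0.0, sigma, size=agg.shape)

# Toy usage: 4 nodes with 3-dim features on a ring graph.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
noisy = dp_aggregate(X, A, epsilon=1.0, delta=1e-5)
```

Note that each aggregation consumes privacy budget, which is exactly why deep message passing over connected graphs makes the accuracy/privacy trade-off the abstract describes so hard: every extra hop touches more of a node's neighborhood.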
