Degree-Preserving Randomized Response for Graph Neural Networks under Local Differential Privacy. (arXiv:2202.10209v2 [cs.CR] UPDATED)
Oct. 10, 2022, 1:20 a.m. | Seira Hidano, Takao Murakami
cs.CR updates on arXiv.org
Differentially private GNNs (Graph Neural Networks) have recently been
studied to provide high accuracy on various graph-data tasks while strongly
protecting user privacy. In particular, a recent study proposes an algorithm to
protect each user's feature vector in an attributed graph with LDP (Local
Differential Privacy), a strong privacy notion that requires no trusted third party.
However, this algorithm protects neither edges (friendships) in a social graph
nor user privacy in unattributed graphs. How to strongly protect …
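To make the edge-privacy setting concrete, here is a minimal sketch of the classic (Warner's) randomized response primitive applied to a user's adjacency vector, the standard LDP building block that degree-preserving variants such as the one in this paper refine. The function names and the epsilon value are illustrative, not taken from the paper.

```python
import math
import random

def randomized_response_bit(bit: int, epsilon: float, rng: random.Random) -> int:
    """Warner's randomized response: keep the bit with probability
    e^eps / (1 + e^eps), flip it otherwise. Each bit satisfies eps-LDP."""
    p_keep = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if rng.random() < p_keep else 1 - bit

def perturb_neighbor_vector(neighbors: list[int], epsilon: float, seed: int = 0) -> list[int]:
    """Apply randomized response independently to each entry of a user's
    0/1 neighbor (adjacency) vector before it leaves the user's device,
    so the server never sees the true edge set."""
    rng = random.Random(seed)
    return [randomized_response_bit(b, epsilon, rng) for b in neighbors]

# Example: a user with 6 potential neighbors and true edges to users 1 and 4.
noisy = perturb_neighbor_vector([0, 1, 0, 0, 1, 0], epsilon=1.0, seed=42)
print(noisy)
```

Note that plain randomized response like this inflates the apparent degree of sparse-graph users (many 0-bits flip to 1), which is precisely the distortion a degree-preserving mechanism aims to avoid.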