SECO: Secure Inference With Model Splitting Across Multi-Server Hierarchy
April 26, 2024, 4:11 a.m. | Shuangyi Chen, Ashish Khisti
cs.CR updates on arXiv.org
Abstract: In the context of prediction-as-a-service, concerns about the privacy of the data and the model have been raised and addressed via secure inference protocols. These protocols are built using one or more cryptographic tools designed under a variety of security assumptions.
In this paper, we introduce SECO, a secure inference protocol that enables a user holding an input data vector and multiple server nodes deployed with a split neural network model …
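To make the split-model setting concrete, here is a minimal plaintext sketch (not SECO's cryptographic protocol) of inference over a split neural network: each server node holds one slice of a small MLP and forwards its activation to the next node in the hierarchy. The `ServerNode` class and the 4→8→3 layer sizes are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch (NOT SECO itself): plaintext split inference,
# where each "server" holds one model slice and forwards its
# activation to the next node. SECO additionally protects the input
# and the model slices with cryptographic tools; none of that is shown here.
import numpy as np

rng = np.random.default_rng(0)


class ServerNode:
    """Hypothetical server holding one slice of a split model."""

    def __init__(self, weight, bias, activation=None):
        self.weight = weight
        self.bias = bias
        self.activation = activation

    def forward(self, x):
        z = x @ self.weight + self.bias
        return self.activation(z) if self.activation else z


# Split a 4 -> 8 -> 3 MLP across two server nodes.
relu = lambda z: np.maximum(z, 0.0)
server1 = ServerNode(rng.normal(size=(4, 8)), np.zeros(8), relu)
server2 = ServerNode(rng.normal(size=(8, 3)), np.zeros(3))

x = rng.normal(size=(1, 4))  # user's input data vector
h = server1.forward(x)       # first hop: server 1 computes its slice
y = server2.forward(h)       # second hop: server 2 returns prediction scores
print(y.shape)               # (1, 3)
```

In the secure version the intermediate activation `h` would never be exposed in plaintext; hiding it (and the weights) from the participating servers is exactly what protocols like SECO are designed to achieve.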