MPCFormer: fast, performant and private Transformer inference with MPC. (arXiv:2211.01452v1 [cs.LG])
Nov. 4, 2022, 1:20 a.m. | Dacheng Li, Rulin Shao, Hongyi Wang, Han Guo, Eric P. Xing, Hao Zhang
cs.CR updates on arXiv.org arxiv.org
Enabling private inference is crucial for many cloud inference services that are based on Transformer models. However, existing private inference solutions for Transformers can increase the inference latency by more than 60x or significantly compromise the quality of inference results. In this paper, we design the framework MPCFormer using secure multi-party computation (MPC) and Knowledge Distillation (KD). It can be used in tandem with many specifically designed MPC-friendly approximations and trained Transformer models. MPCFormer significantly speeds up Transformer model inference …
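To illustrate what an "MPC-friendly approximation" typically means, here is a minimal sketch, assuming the common approach of replacing the exponential inside softmax with a cheap polynomial. The function name `quad_softmax` and the shift constant `c` are illustrative choices, not taken from the paper; the point is that squaring and summing use only additions and multiplications, which are far cheaper than `exp` in secret-shared arithmetic.

```python
import numpy as np

def softmax(x):
    # Standard softmax. Under MPC, exp() must be emulated with
    # expensive protocols, which dominates inference latency.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def quad_softmax(x, c=5.0):
    # Hypothetical MPC-friendly variant: swap exp(x) for the
    # quadratic (x + c)^2, keeping only multiplications plus a
    # single division for normalization.
    q = (x + c) ** 2
    return q / q.sum(axis=-1, keepdims=True)

scores = np.array([1.0, 2.0, 3.0])
print(softmax(scores))       # exact attention weights
print(quad_softmax(scores))  # cheaper approximation, same ordering
```

Such a swap changes the model's function, which is why the paper pairs the approximations with Knowledge Distillation: the approximated model is trained to mimic the original Transformer and recover the lost accuracy.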