MPCViT: Searching for Accurate and Efficient MPC-Friendly Vision Transformer with Heterogeneous Attention. (arXiv:2211.13955v2 [cs.CR] UPDATED)
cs.CR updates on arXiv.org
Secure multi-party computation (MPC) enables computation directly on
encrypted data and protects both data and model privacy in deep learning
inference. However, existing neural network architectures, including Vision
Transformers (ViTs), are not designed or optimized for MPC and incur
significant latency overhead. We observe that Softmax accounts for the major
latency bottleneck due to its high communication complexity, but can be
selectively replaced or linearized without compromising model accuracy. Hence,
in this paper, we propose an MPC-friendly ViT, dubbed MPCViT, …
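The abstract's core observation — that Softmax dominates MPC latency and can be swapped for a cheaper operation — can be illustrated with a minimal sketch. The snippet below contrasts standard Softmax attention with a hypothetical ReLU-normalized variant; the exact replacement functions MPCViT uses are not given in this excerpt, so the `relu_attention` formulation here is an illustrative assumption, not the paper's method. The intuition is that exponentiation and division require many interactive rounds under MPC, while ReLU and addition are comparatively cheap.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard scaled dot-product attention. Under MPC, the exp and
    # normalizing division in Softmax are communication-heavy.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ V

def relu_attention(Q, K, V, eps=1e-6):
    # Hypothetical MPC-friendlier variant (an assumption for
    # illustration): replace exp with ReLU, keeping a cheap
    # sum-normalization so attention weights stay non-negative.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    w = np.maximum(scores, 0.0)
    w = w / (w.sum(axis=-1, keepdims=True) + eps)
    return w @ V
```

Both variants produce outputs of the same shape, so such a replacement can be applied selectively per attention head, which matches the "heterogeneous attention" idea in the title.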