Feb. 15, 2024, 5:10 a.m. | Prajwal Panzade, Daniel Takabi, Zhipeng Cai

cs.CR updates on arXiv.org

arXiv:2402.09059v1 Announce Type: cross
Abstract: In today's machine learning landscape, fine-tuning pretrained transformer models has become an essential technique, particularly when access to task-aligned training data is limited. However, challenges arise when data sharing is restricted by stringent privacy regulations or by users' reluctance to disclose personal information. Earlier work on secure multiparty computation (SMC) and fully homomorphic encryption (FHE) for privacy-preserving machine learning (PPML) has focused more on privacy-preserving inference than on privacy-preserving training. In response, we …
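The abstract is cut off before the authors describe their approach, so nothing below reflects their method. As a point of reference for what FHE-based PPML computation involves, here is a minimal sketch of encrypted arithmetic using the TenSEAL library and the CKKS scheme; the library choice and parameters are assumptions for illustration only.

    # Sketch: an encrypted linear layer under CKKS with TenSEAL.
    # Illustrates FHE arithmetic in general, not this paper's method.
    import tenseal as ts

    # CKKS context with commonly used (assumed) parameters.
    context = ts.context(
        ts.SCHEME_TYPE.CKKS,
        poly_modulus_degree=8192,
        coeff_mod_bit_sizes=[60, 40, 40, 60],
    )
    context.global_scale = 2 ** 40
    context.generate_galois_keys()

    # Client encrypts private features; the server never sees plaintext.
    features = [0.1, 0.2, 0.3, 0.4]
    enc_features = ts.ckks_vector(context, features)

    # Server applies a plaintext linear layer directly on the ciphertext.
    weights = [0.5, -0.1, 0.3, 0.2]
    bias = 0.05
    enc_output = enc_features.dot(weights) + bias

    # Only the secret-key holder can decrypt the result.
    print(enc_output.decrypt())  # approximately [0.25]

The same pattern of computing on ciphertexts is what makes privacy-preserving inference (and, with more machinery, training) possible without the data owner ever sharing raw data.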

