Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers. (arXiv:2309.06526v1 [cs.LG])
cs.CR updates on arXiv.org
For machine learning with tabular data, Table Transformer (TabTransformer) is
a state-of-the-art neural network model, while Differential Privacy (DP) is an
essential component to ensure data privacy. In this paper, we explore the
benefits of combining the two in a transfer-learning scenario --
differentially private pre-training and fine-tuning of
TabTransformers with a variety of parameter-efficient fine-tuning (PEFT)
methods, including Adapter, LoRA, and Prompt Tuning. Our extensive experiments
on the ACSIncome dataset show that these PEFT methods …
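Of the PEFT methods named above, LoRA is the simplest to illustrate: the pre-trained weight matrix is frozen, and only a low-rank update (the product of two small matrices) is trained, which shrinks the set of parameters touched by DP-SGD noise. The sketch below is a dependency-free, toy-scale illustration of that low-rank update, not the paper's actual TabTransformer or DP training setup; `lora_forward`, `scale`, and the matrix shapes are illustrative assumptions.

```python
# Toy sketch of a LoRA-style parameter-efficient update (illustrative only;
# the paper's setup uses TabTransformer layers trained with DP-SGD).

def matmul(A, B):
    # Plain nested-list matrix multiply: (m x k) @ (k x n) -> (m x n).
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def lora_forward(x, W, A, B, scale=1.0):
    """Compute y = x @ (W + scale * A @ B).

    W is the frozen pre-trained weight; only the low-rank factors
    A (d x r) and B (r x d) would receive (privatized) gradients.
    """
    delta = matmul(A, B)  # low-rank update, rank r
    W_eff = [[w + scale * d for w, d in zip(w_row, d_row)]
             for w_row, d_row in zip(W, delta)]
    return matmul(x, W_eff)

# Example: 2x2 frozen identity weight, rank-1 update A @ B.
x = [[1.0, 2.0]]
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen
A = [[1.0], [0.0]]             # trainable (d x r), r = 1
B = [[0.0, 1.0]]               # trainable (r x d)
y = lora_forward(x, W, A, B)   # -> [[1.0, 3.0]]
```

At realistic sizes the saving is substantial: a d x d weight has d^2 parameters, while the rank-r factors have only 2*d*r, and under DP only those 2*d*r parameters accumulate gradient noise.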