Sept. 14, 2023, 1:10 a.m. | Xilong Wang, Chia-Mu Yu, Pin-Yu Chen

cs.CR updates on arXiv.org

For machine learning with tabular data, the Table Transformer (TabTransformer) is
a state-of-the-art neural network model, while Differential Privacy (DP) is an
essential component for ensuring data privacy. In this paper, we explore the
benefits of combining these two aspects in the transfer learning scenario:
differentially private pre-training and fine-tuning of TabTransformers with a
variety of parameter-efficient fine-tuning (PEFT) methods, including Adapter,
LoRA, and Prompt Tuning. Our extensive experiments on the ACSIncome dataset
show that these PEFT methods …
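
To make the setup concrete, below is a minimal sketch (not the authors' code) of a LoRA-style adapter on a frozen linear layer in PyTorch. The class name, rank, and scaling factor are illustrative assumptions, and the DP-SGD wiring via Opacus in the trailing comment is one plausible library choice rather than the paper's stated toolchain.

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        # Frozen pre-trained linear layer plus a trainable low-rank update:
        # y = base(x) + (alpha / r) * x A^T B^T, with only A and B trained.
        def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad_(False)  # pre-trained weights stay fixed
            self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start
            self.scale = alpha / r

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

    # Hypothetical DP-SGD wiring (Opacus is an assumed library choice,
    # not named in the abstract):
    # from opacus import PrivacyEngine
    # model, optimizer, loader = PrivacyEngine().make_private(
    #     module=model, optimizer=optimizer, data_loader=loader,
    #     noise_multiplier=1.0, max_grad_norm=1.0)

Because only A and B receive gradients, the per-example gradient clipping and noise addition of DP-SGD touch far fewer parameters than full fine-tuning, which is the efficiency argument behind pairing PEFT with DP.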
