Dec. 15, 2023, 1:36 a.m.

IACR News www.iacr.org

ePrint Report: Regularized PolyKervNets: Optimizing Expressiveness and Efficiency for Private Inference in Deep Neural Networks

Toluwani Aremu


Private computation of nonlinear functions, such as Rectified Linear Units (ReLUs) and max-pooling operations, in deep neural networks (DNNs) incurs significant storage, bandwidth, and latency costs. To address these challenges, there has been growing interest in privacy-preserving techniques that substitute polynomial activation functions and kernelized convolutions for traditional ReLUs. However, these alternative approaches often suffer …

Tags: bandwidth, challenges, computation, efficiency, eprint report, functions, linear, networks, neural networks, operations, private, report, storage, terms
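As a rough illustration of the idea discussed in the abstract (not the paper's exact construction), a low-degree polynomial activation can stand in for ReLU so the whole network uses only additions and multiplications, which homomorphic-encryption and secret-sharing protocols handle far more cheaply than comparisons. The PyTorch-style sketch below is a minimal example under stated assumptions: the module names (PolyAct, SmallPolyNet), the learnable coefficients a, b, c, and the activation_penalty regularizer are illustrative inventions, not the PolyKervNets formulation.

```python
# Sketch: a learnable degree-2 polynomial activation as a ReLU substitute
# for private inference. Names and the penalty term are illustrative
# assumptions, not the paper's exact method.
import torch
import torch.nn as nn


class PolyAct(nn.Module):
    """y = a*x^2 + b*x + c, computable with only adds/mults (MPC/HE-friendly)."""
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.tensor(0.1))
        self.b = nn.Parameter(torch.tensor(0.5))
        self.c = nn.Parameter(torch.tensor(0.0))

    def forward(self, x):
        return self.a * x.pow(2) + self.b * x + self.c


class SmallPolyNet(nn.Module):
    """Tiny CNN for 3x32x32 inputs where every ReLU is replaced by PolyAct;
    pooling is average (linear) instead of max, so no comparisons remain."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), PolyAct(), nn.AvgPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), PolyAct(), nn.AvgPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


def activation_penalty(model, weight=1e-4):
    """Hypothetical regularizer keeping polynomial coefficients small so
    activations do not explode during training; one way to regularize a
    polynomial network, not necessarily the paper's approach."""
    penalty = sum(p.pow(2).sum() for m in model.modules()
                  if isinstance(m, PolyAct) for p in m.parameters())
    return weight * penalty
```

In a training loop, the task loss would simply be augmented with activation_penalty(model); that is one plausible reading of "regularized" in the title, offered here only as an assumption.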

Junior Cybersecurity Analyst - 3346195

@ TCG | 725 17th St NW, Washington, DC, USA

Cyber Intelligence, Senior Advisor

@ Peraton | Chantilly, VA, United States

Cybersecurity Consultant (M/F) - Innovative Tech

@ Devoteam | Marseille, France

Manager, Internal Audit (GIA Cyber)

@ Standard Bank Group | Johannesburg, South Africa

Staff DevSecOps Engineer

@ Raft | San Antonio, TX (Local Remote)

Domain Leader Cybersecurity

@ Alstom | Bengaluru, KA, IN