Secure Floating-Point Training
April 1, 2023, 1:12 p.m. | IACR News (www.iacr.org)
Deevashwer Rathee, Anwesh Bhattacharya, Divya Gupta, Rahul Sharma, Dawn Song
Secure 2-party computation (2PC) of floating-point arithmetic is improving in performance, and recent work runs deep learning algorithms with it while remaining as numerically precise as commonly used machine learning (ML) frameworks such as PyTorch. We find that existing 2PC libraries for floating point support generic computations but lack specialized support for ML training. Hence, their latency and communication costs for compound operations (e.g., dot products) …