GuardML: Efficient Privacy-Preserving Machine Learning Services Through Hybrid Homomorphic Encryption. (arXiv:2401.14840v1 [cs.LG])
cs.CR updates on arXiv.org
Machine Learning (ML) has emerged as one of data science's most
transformative and influential domains. However, the widespread adoption of ML
introduces privacy-related concerns owing to the increasing number of malicious
attacks targeting ML models. To address these concerns, Privacy-Preserving
Machine Learning (PPML) methods have been introduced to safeguard the privacy
and security of ML models. One such approach is the use of Homomorphic
Encryption (HE). However, the significant drawbacks and inefficiencies of
traditional HE render it impractical for highly …
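The homomorphic property the abstract refers to can be illustrated with a minimal, insecure sketch: textbook RSA is multiplicatively homomorphic, so multiplying two ciphertexts yields an encryption of the product of the plaintexts. This is not the hybrid scheme proposed in the paper, only a toy demonstration of why HE lets a server compute on data it cannot read; the tiny parameters below are for illustration and offer no security.

```python
# Toy multiplicative homomorphism via textbook RSA (illustrative only,
# NOT the paper's hybrid scheme and NOT secure -- tiny demo parameters).

p, q = 61, 53
n = p * q                # RSA modulus
e = 17                   # public exponent, coprime to phi
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

def enc(m: int) -> int:
    """Encrypt m under the public key (e, n)."""
    return pow(m, e, n)

def dec(c: int) -> int:
    """Decrypt c with the private key (d, n)."""
    return pow(c, d, n)

m1, m2 = 7, 12
c1, c2 = enc(m1), enc(m2)

# Multiplying ciphertexts corresponds to multiplying plaintexts:
product = dec((c1 * c2) % n)
assert product == (m1 * m2) % n
print(product)  # 84
```

Schemes used in PPML practice (e.g. Paillier for addition, or lattice-based schemes for richer circuits) support more operations, but the cost of evaluating deep computations homomorphically is precisely the inefficiency the abstract highlights.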