Near Optimal Private and Robust Linear Regression. (arXiv:2301.13273v1 [cs.LG])
cs.CR updates on arXiv.org arxiv.org
We study the canonical statistical estimation problem of linear regression
from $n$ i.i.d. examples under $(\varepsilon,\delta)$-differential privacy when
some response variables are adversarially corrupted. We propose a variant of
the popular differentially private stochastic gradient descent (DP-SGD)
algorithm with two innovations: a full-batch gradient descent to improve sample
complexity and a novel adaptive clipping to guarantee robustness. When there is
no adversarial corruption, this algorithm improves upon the existing
state-of-the-art approach and achieves near-optimal sample complexity. Under
label corruption, this …
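The abstract does not spell out the paper's adaptive clipping rule or its privacy accounting, so the following is only a generic sketch of the template it builds on: full-batch gradient descent for linear regression with per-example gradient clipping and Gaussian noise. The clipping threshold, noise multiplier, and step size below are placeholder hyperparameters, not values from the paper.

```python
import numpy as np

def dp_full_batch_gd(X, y, epochs=100, lr=0.1, clip=10.0,
                     noise_mult=1.0, seed=0):
    """Full-batch DP gradient descent for linear regression (sketch).

    Each per-example gradient is clipped to L2 norm `clip`, the clipped
    gradients are summed, and Gaussian noise with standard deviation
    `noise_mult * clip` is added before the update. This is the standard
    DP-GD template, NOT the paper's adaptive clipping scheme (the
    abstract does not specify that rule).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        residual = X @ w - y                    # shape (n,)
        grads = residual[:, None] * X           # per-example grads, (n, d)
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        # Clip each example's gradient to norm at most `clip`.
        grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
        # Noise scale is proportional to the clipping threshold
        # (the per-example sensitivity of the clipped sum).
        noisy_sum = grads.sum(axis=0) + rng.normal(0.0, noise_mult * clip, d)
        w -= lr * noisy_sum / n
    return w
```

Because every example's gradient enters the sum with norm at most `clip`, the Gaussian mechanism's sensitivity analysis applies directly; a full-batch pass (as opposed to minibatch DP-SGD) is what the abstract credits for the improved sample complexity.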