One-bit Flip is All You Need: When Bit-flip Attack Meets Model Training. (arXiv:2308.07934v1 [cs.CR])
cs.CR updates on arXiv.org
Deep neural networks (DNNs) are widely deployed on real-world devices.
Concerns regarding their security have gained great attention from researchers.
Recently, a new weight-modification attack called the bit-flip attack (BFA) was
proposed, which exploits memory fault-injection techniques such as Rowhammer to
attack quantized models in the deployment stage. With only a few bit flips, the
target model can be degraded to a random guesser or even implanted
with malicious functionalities. In this work, we seek …
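To illustrate why so few flips suffice, here is a minimal sketch (not the paper's method) of how flipping a single bit of a signed 8-bit quantized weight can change its value drastically; the weight value and bit position are hypothetical:

```python
# Sketch: the effect of one bit flip on an int8 quantized weight.
# Flipping the most significant (sign) bit of a two's-complement
# byte moves a small positive weight to a large negative one.

def flip_bit(value: int, bit: int) -> int:
    """Flip one bit of an 8-bit two's-complement integer."""
    raw = value & 0xFF                        # view as unsigned byte
    raw ^= 1 << bit                           # flip the chosen bit
    return raw - 256 if raw >= 128 else raw   # back to signed int8

w = 3                         # a small quantized weight (hypothetical)
w_attacked = flip_bit(w, 7)   # flip the sign/MSB bit
print(w, "->", w_attacked)    # 3 -> -125
```

A single such flip in a critical weight can shift an entire layer's activations, which is why fault-injection primitives like Rowhammer are enough to cripple a deployed quantized model.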