Aug. 7, 2023, 1:10 a.m. | Yisong Xiao, Aishan Liu, Tianyuan Zhang, Haotong Qin, Jinyang Guo, Xianglong Liu

cs.CR updates on arXiv.org

Quantization has emerged as an essential technique for deploying deep neural
networks (DNNs) on devices with limited resources. However, quantized models
exhibit vulnerabilities when exposed to various types of noise in real-world
applications. Despite the importance of evaluating the impact of quantization
on robustness, existing research on this topic is limited and often disregards
established principles of robustness evaluation, resulting in incomplete and
inconclusive findings. To address this gap, we thoroughly evaluated the
robustness of quantized models against various types of noise (adversarial attacks, …
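
To make the setup concrete, below is a minimal sketch, not the paper's benchmark, of what "evaluating the robustness of a quantized model" can look like in practice. It assumes PyTorch post-training dynamic quantization and a single-step FGSM attack; the `TinyNet` model, the stand-in random data, and the helper names are illustrative only. Adversarial examples are crafted on the float model and transferred to the int8 model, a common workaround since quantized ops do not support autograd directly.

```python
# Minimal robustness-of-quantization sketch (illustrative, not the paper's setup):
# quantize a small model with PyTorch post-training dynamic quantization,
# craft FGSM examples on the float model, and compare clean vs. adversarial
# accuracy of the float and int8 models.
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Hypothetical tiny classifier; the paper evaluates real vision models."""
    def __init__(self, in_dim=32, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, n_classes)
        )

    def forward(self, x):
        return self.net(x)

def fgsm(model, x, y, eps=0.05):
    # One-step FGSM perturbation, computed on the float model.
    x = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    return (x + eps * x.grad.sign()).detach()

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

float_model = TinyNet().eval()
# Post-training dynamic quantization of the Linear layers to int8.
quant_model = torch.quantization.quantize_dynamic(
    float_model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(128, 32)           # stand-in data; replace with a real dataset
y = torch.randint(0, 10, (128,))
x_adv = fgsm(float_model, x, y)    # attack crafted on the float model, transferred

for name, m in [("float", float_model), ("int8", quant_model)]:
    print(f"{name}: clean={accuracy(m, x, y):.3f}  adv={accuracy(m, x_adv, y):.3f}")
```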

