Jan. 5, 2024, 2:10 a.m. | Zhen Xiang, Zidi Xiong, Bo Li

cs.CR updates on arXiv.org

Backdoor attacks are a common threat to deep neural networks. During testing,
samples embedded with a backdoor trigger will be misclassified as an
adversarial target by a backdoored model, while samples without the backdoor
trigger will be correctly classified. In this paper, we present the first
certified backdoor detector (CBD), which is built on a novel, adjustable
conformal prediction scheme using our proposed local dominant probability
statistic. For any classifier under inspection, CBD provides 1) a detection
inference, 2) …
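The excerpt describes CBD as a conformal prediction scheme applied to a per-classifier statistic (the local dominant probability) to produce a detection inference with a certified guarantee. As a rough illustration of how a conformal detector of this kind can work, here is a minimal Python sketch; the `statistic_fn` placeholder, the calibration set of known-clean "shadow" classifiers, and the assumption that backdoored classifiers yield unusually large statistics are all illustrative choices, not details taken from the paper.

```python
import numpy as np


def conformal_pvalue(test_stat: float, calibration_stats: np.ndarray) -> float:
    """Conformal p-value: the (smoothed) fraction of calibration statistics
    that are at least as extreme as the test statistic."""
    n = len(calibration_stats)
    return (1 + np.sum(calibration_stats >= test_stat)) / (n + 1)


def detect_backdoor(statistic_fn, classifier, calibration_classifiers, alpha=0.05):
    """Flag `classifier` as backdoored if its statistic is anomalously large
    relative to statistics computed on known-clean calibration classifiers.

    `statistic_fn` is a hypothetical stand-in for a detection statistic such
    as CBD's local dominant probability; its actual definition is not given
    in this excerpt.
    """
    test_stat = statistic_fn(classifier)
    cal_stats = np.array([statistic_fn(c) for c in calibration_classifiers])
    p_value = conformal_pvalue(test_stat, cal_stats)
    # Reject (i.e., declare "backdoored") when the conformal p-value is small.
    return p_value <= alpha, p_value
```

Under the standard conformal assumption that a clean classifier's statistic is exchangeable with the calibration statistics, rejecting at level `alpha` keeps the false-positive rate at most `alpha`, which is the flavor of probabilistic guarantee the abstract refers to.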

