Physical-World Optical Adversarial Attacks on 3D Face Recognition. (arXiv:2205.13412v3 [cs.CV] UPDATED)
Nov. 15, 2022, 2:20 a.m. | Yanjie Li, Yiquan Li, Xuelong Dai, Songtao Guo, Bin Xiao
cs.CR updates on arXiv.org arxiv.org
2D face recognition has been proven insecure against physical adversarial
attacks. However, few studies have investigated the possibility of attacking
real-world 3D face recognition systems, and recently proposed 3D-printed
attacks cannot generate adversarial points in the air. In this paper, we attack
3D face recognition systems through elaborate optical noise, taking structured
light 3D scanners as the attack target. End-to-end attack algorithms are
designed to generate adversarial illumination for 3D faces through the
scanner's inherent projector or an additional one to produce …
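The abstract's core idea, that perturbing the projected light perturbs the reconstructed 3D geometry, can be illustrated with a toy sketch. This is not the paper's algorithm; it assumes standard 3-step phase-shifting profilometry (a common structured-light technique), and all function names are illustrative:

```python
import numpy as np

# Toy sketch: a structured-light scanner projects sinusoidal fringe
# patterns and recovers a phase map that encodes depth. Injecting an
# optical perturbation into the projected pattern shifts the recovered
# phase, and hence the reconstructed 3D points.

def recover_phase(images):
    """Standard 3-step phase-shift phase recovery."""
    i1, i2, i3 = images
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

def capture(phase_true, delta=0.0):
    """Simulate the camera images for three projected fringes,
    optionally with an additive adversarial phase offset `delta`."""
    shifts = [-2.0 * np.pi / 3.0, 0.0, 2.0 * np.pi / 3.0]
    return [0.5 + 0.5 * np.cos(phase_true + delta + s) for s in shifts]

# A small surface patch whose true phase encodes its depth profile.
phase_true = np.linspace(0.1, 1.2, 64)

clean = recover_phase(capture(phase_true))
attacked = recover_phase(capture(phase_true, delta=0.15))  # optical noise

# The recovered phase (and thus the depth estimate) shifts by exactly
# the injected amount -- the scanner "sees" a different 3D face.
print(np.allclose(attacked - clean, 0.15, atol=1e-6))
```

In a real attack the offset would be optimized end to end against the recognition model rather than set to a constant, but the sketch shows why projector-side noise translates directly into geometric error.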