TY - GEN
T1 - Accessorize in the Dark
T2 - 28th European Symposium on Research in Computer Security, ESORICS 2023
AU - Cohen, Amit
AU - Sharif, Mahmood
N1 - Publisher Copyright:
© 2024, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2024
Y1 - 2024
AB - Prior work showed that face-recognition systems ingesting RGB images captured via visible-light (VIS) cameras are susceptible to real-world evasion attacks. Face-recognition systems in near-infrared (NIR) are widely deployed for critical tasks (e.g., access control), and are hypothesized to be more secure due to the lower variability and dimensionality of NIR images compared to VIS ones. However, the actual robustness of NIR-based face recognition remains unknown. This work puts the hypothesis to the test by offering attacks well-suited for NIR-based face recognition and adapting them to facilitate physical realizability. The outcome of the attack is an adversarial accessory the adversary can wear to mislead NIR-based face-recognition systems. We tested the attack against six models, both defended and undefended, with varied numbers of subjects in the digital and physical domains. We found that face recognition in NIR is highly susceptible to real-world attacks. For example, ≥96.66% of physically realized attack attempts seeking arbitrary misclassification succeeded, including against defended models. Overall, our work highlights the need to defend NIR-based face recognition, especially when deployed in high-stakes domains.
UR - http://www.scopus.com/inward/record.url?scp=85184081708&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-51479-1_3
DO - 10.1007/978-3-031-51479-1_3
M3 - Conference contribution
AN - SCOPUS:85184081708
SN - 9783031514784
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 43
EP - 61
BT - Computer Security – ESORICS 2023 – 28th European Symposium on Research in Computer Security, Proceedings
A2 - Tsudik, Gene
A2 - Conti, Mauro
A2 - Liang, Kaitai
A2 - Smaragdakis, Georgios
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 25 September 2023 through 29 September 2023
ER -