

Poster

Revisiting Adversarial Patches for Designing Camera-Agnostic Attacks against Person Detection

Hui Wei · Zhixiang Wang · Kewei Zhang · Jiaqi Hou · Yuanwei Liu · Hao Tang · Zheng Wang


Abstract:

Physical adversarial attacks can deceive deep neural networks (DNNs), leading to erroneous predictions in real-world scenarios. To uncover potential security risks, attacking the safety-critical task of person detection has garnered significant attention. However, we observe that existing attack methods overlook the pivotal role of the camera in the physical adversarial attack workflow: capturing real-world scenes and converting them into digital images. This oversight makes these attacks unstable and hard to reproduce. In this work, we revisit patch-based attacks against person detectors and introduce a camera-agnostic physical adversarial attack to mitigate this limitation. Specifically, we construct a differentiable camera Image Signal Processing (ISP) proxy network to compensate for the physical-to-digital domain transition gap. Furthermore, the camera ISP proxy network serves as a defense module, forming an adversarial optimization framework with the attack module. The attack module optimizes adversarial perturbations to maximize attack effectiveness, while the defense module optimizes the conditional parameters of the camera ISP proxy network to minimize it. These modules engage in an adversarial game, enhancing cross-camera stability. Experimental results demonstrate that our proposed CAP (Camera-Agnostic Patch) attack effectively conceals persons from detectors across various imaging hardware, including two distinct cameras and four smartphones. The source code will be released upon paper acceptance.
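The alternating attack/defense optimization the abstract describes can be sketched as a toy min-max game. The sketch below is a hypothetical illustration, not the paper's implementation: a scalar detection score stands in for the person detector, a small parameter vector stands in for the ISP proxy's conditional parameters, and finite-difference gradients replace backpropagation. The attack step lowers detection confidence with respect to the patch; the defense step raises it with respect to the ISP parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def detection_score(patch, isp_params):
    # Toy stand-in for detector confidence on a patched scene rendered
    # through an ISP with the given conditional parameters. The real
    # method uses a person detector and a learned ISP proxy network.
    rendered = 2.0 + isp_params @ patch  # unpatched baseline ~ sigmoid(2.0)
    return sigmoid(rendered)

def grad(f, x, eps=1e-5):
    # Central finite-difference gradient (replaces autodiff in this toy).
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
patch = rng.normal(size=4) * 0.1      # adversarial perturbation (attack module)
isp = 1.0 + rng.normal(size=4) * 0.1  # conditional ISP parameters (defense module)

for step in range(200):
    # Attack step: minimize detection confidence w.r.t. the patch.
    patch -= 0.5 * grad(lambda p: detection_score(p, isp), patch)
    patch = np.clip(patch, -1.0, 1.0)  # keep the perturbation bounded
    # Defense step: maximize confidence w.r.t. the ISP parameters.
    isp += 0.05 * grad(lambda t: detection_score(patch, t), isp)
    isp = np.clip(isp, 0.5, 1.5)       # ISPs vary within a plausible range

final = detection_score(patch, isp)    # below the ~0.88 unpatched baseline
```

The defense step forces the patch to stay effective over the whole range of ISP parameter settings rather than overfitting to one camera, which is the intuition behind the cross-camera stability claim.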
