Adversarial Attacks on Cooperative Multi-agent Bandits. (arXiv:2311.01698v1 [cs.LG])
cs.CR updates on arXiv.org
Cooperative multi-agent multi-armed bandits (CMA2B) consider the
collaborative efforts of multiple agents in a shared multi-armed bandit game.
We study latent vulnerabilities exposed by this collaboration and consider
adversarial attacks on a few agents with the goal of influencing the decisions
of the rest. More specifically, we study adversarial attacks on CMA2B in both
homogeneous settings, where agents operate with the same arm set, and
heterogeneous settings, where agents have distinct arm sets. In the homogeneous
setting, we propose attack …
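The attack idea described above can be illustrated with a minimal sketch (not the paper's actual algorithm, which is truncated here): agents run UCB1 on pooled per-arm statistics, and an adversary corrupts only the rewards observed by one attacked agent, zeroing what it sees on the truly best arm. Because statistics are shared, the poisoned samples drag down every agent's estimate of that arm. All names (`cooperative_ucb_with_attack`, the deterministic reward model) are illustrative assumptions.

```python
import math

def cooperative_ucb_with_attack(mus, n_agents=2, horizon=2000, attacked=0):
    """Cooperative UCB1 sketch: all agents pool (pull count, reward sum) per arm.
    An adversary corrupts the rewards observed by the single attacked agent,
    zeroing every reward it sees on the truly best arm. Rewards are
    deterministic (arm i always pays mus[i]) to keep the sketch simple."""
    k = len(mus)
    best = max(range(k), key=lambda i: mus[i])    # arm the adversary suppresses
    counts = [0] * k                              # shared pull counts
    sums = [0.0] * k                              # shared reward sums
    corrupted = 0
    t = 0
    for _ in range(horizon):
        for agent in range(n_agents):
            t += 1
            if 0 in counts:                       # pull each arm once first
                arm = counts.index(0)
            else:                                 # UCB1 on pooled statistics
                arm = max(range(k), key=lambda i: sums[i] / counts[i]
                          + math.sqrt(2 * math.log(t) / counts[i]))
            reward = mus[arm]
            if agent == attacked and arm == best: # adversarial corruption
                reward = 0.0
                corrupted += 1
            counts[arm] += 1                      # update shared statistics
            sums[arm] += reward
    return counts, sums, corrupted, best
```

With `mus = [0.9, 0.5]` and two agents, one of them attacked, roughly half of the pooled samples on the best arm are zeroed, so its pooled mean falls toward 0.45, below the second arm's 0.5, and even the honest agent is steered toward the worse arm. This is the homogeneous setting (all agents share the same arm set); the heterogeneous case would give each agent its own arm list.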