# Entropy-based Adversarial Attacks

This repository contains code to perform entropy-based localized adversarial attacks, code to set up and run a study investigating the perceptibility of adversarial perturbations, and the results of one such (small-scale) study. The method is based on the publication *Adversarial attacks hidden in plain sight*, which also reports the study's results. If anything in this repository helps you, please cite the paper:

```bibtex
@Misc{Gopfert2019AdversarialAttacksHidden,
  author     = {Jan Philip Göpfert and André Artelt and Heiko Wersing and Barbara Hammer},
  title      = {Adversarial attacks hidden in plain sight},
  date       = {2019},
  eprint     = {1902.09286},
  eprinttype = {arXiv},
}
```
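The core idea of an entropy-based localized attack is to weight an adversarial perturbation by the local entropy of the image, so that changes concentrate in textured, high-entropy regions where they are harder to perceive. The sketch below illustrates that idea with NumPy only; it is not the code from this repository (function names `local_entropy` and `localize` are made up for illustration), and it uses simple non-overlapping windows rather than whatever windowing the actual implementation uses.

```python
import numpy as np

def local_entropy(img, window=8):
    """Approximate per-pixel Shannon entropy over non-overlapping windows.

    img: 2D grayscale array with integer values in [0, 255].
    Returns an entropy map upsampled (by repetition) to img's shape.
    """
    h, w = img.shape
    h_c, w_c = h // window, w // window
    ent = np.zeros((h_c, w_c))
    for i in range(h_c):
        for j in range(w_c):
            patch = img[i * window:(i + 1) * window,
                        j * window:(j + 1) * window]
            counts = np.bincount(patch.astype(int).ravel(), minlength=256)
            p = counts / counts.sum()
            p = p[p > 0]  # 0 * log 0 := 0
            ent[i, j] = -(p * np.log2(p)).sum()
    # repeat each window's entropy value to the original resolution
    return np.kron(ent, np.ones((window, window)))

def localize(perturbation, img, window=8):
    """Scale a perturbation by normalized local entropy, so it is
    suppressed in flat regions and kept in textured ones."""
    ent = local_entropy(img, window)
    weights = ent / (ent.max() + 1e-12)
    return perturbation * weights

# demo: an image whose left half is flat and whose right half is noisy
rng = np.random.default_rng(0)
img = np.zeros((16, 16))
img[:, 8:] = rng.integers(0, 256, size=(16, 8))
delta = np.ones((16, 16))          # stand-in for an adversarial perturbation
masked = localize(delta, img, window=8)
# flat left half has zero entropy, so the perturbation vanishes there;
# the noisy right half keeps (most of) its perturbation
```

A full attack along these lines would generate `delta` with a standard method (e.g. a gradient-based attack) and apply this weighting at each step; see the paper for the actual formulation.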