Based on the paper "Neural Compression Restoration Against Gradient-based Adversarial Attacks".
In this work, the proposed defense strategy is evaluated against black-box attacks; in particular, we consider the HopSkipJump and the Square attacks.
More details about the project can be found in the paper or in the presentation.
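For reference, the following is a minimal sketch, not taken from this repo, of how the two black-box attacks can be instantiated with ART against a PyTorch classifier. The model, input shape, number of classes, and attack hyperparameters are placeholder assumptions; the scripts in this repo may configure them differently.

```python
# Minimal sketch (assumptions: a 1000-class PyTorch classifier on 224x224 RGB inputs;
# an untrained ResNet-18 stands in for the real model).
import numpy as np
import torch
import torchvision
from art.estimators.classification import PyTorchClassifier
from art.attacks.evasion import HopSkipJump, SquareAttack

model = torchvision.models.resnet18().eval()
classifier = PyTorchClassifier(
    model=model,
    loss=torch.nn.CrossEntropyLoss(),
    input_shape=(3, 224, 224),
    nb_classes=1000,
    clip_values=(0.0, 1.0),
)

# HopSkipJump: decision-based black-box attack (only queries the predicted label)
hsj = HopSkipJump(classifier=classifier, targeted=False, max_iter=10, max_eval=1000)

# Square attack: score-based black-box attack using random square-shaped perturbations
square = SquareAttack(estimator=classifier, norm=np.inf, eps=8 / 255, max_iter=100)

x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # placeholder batch of images
x_adv_hsj = hsj.generate(x=x)
x_adv_square = square.generate(x=x)
```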
To get a local copy up and running follow these simple steps:
- Clone the repo:
  ```sh
  git clone https://github.com/LorenzoAgnolucci/Adversarial_attacks_defense.git
  ```
- Run `pip install -r requirements.txt` in the root folder of the repo to install the requirements
- Run `pip install -e adversarial-robustness-toolbox/` in the root folder to install the ART module with the custom files
- Download the dataset
- Change the path of the images and the parameters in `jpeg_gan_hop_skip_jump_pytorch.py` and `jpeg_gan_square_pytorch.py`
- Run `jpeg_gan_hop_skip_jump_pytorch.py` or `jpeg_gan_square_pytorch.py` to evaluate the defense strategy against the corresponding attack (a minimal sketch of this evaluation is shown after the list)
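The evaluation boils down to comparing the classifier's accuracy on adversarial examples with and without the defensive preprocessing. Below is a hedged, self-contained sketch of that flow, not the repo's code: `defend` is a hypothetical placeholder for the JPEG compression + GAN restoration pipeline of the paper, and the model, data, and attack settings are assumptions.

```python
# Hedged sketch of the evaluation flow (not the repo's code). `defend` is a hypothetical
# stand-in for the JPEG compression + GAN restoration defense described in the paper.
import numpy as np
import torch
import torchvision
from art.estimators.classification import PyTorchClassifier
from art.attacks.evasion import SquareAttack


def defend(x: np.ndarray) -> np.ndarray:
    # Placeholder: the real defense compresses the images with JPEG and restores them
    # with a GAN before classification; the identity is used here for illustration only.
    return x


def accuracy(clf: PyTorchClassifier, x: np.ndarray, y: np.ndarray) -> float:
    return float(np.mean(np.argmax(clf.predict(x), axis=1) == y))


model = torchvision.models.resnet18().eval()  # stand-in classifier
classifier = PyTorchClassifier(
    model=model,
    loss=torch.nn.CrossEntropyLoss(),
    input_shape=(3, 224, 224),
    nb_classes=1000,
    clip_values=(0.0, 1.0),
)

x_clean = np.random.rand(4, 3, 224, 224).astype(np.float32)  # placeholder images
y_true = np.argmax(classifier.predict(x_clean), axis=1)      # model labels as ground truth

attack = SquareAttack(estimator=classifier, norm=np.inf, eps=8 / 255, max_iter=100)
x_adv = attack.generate(x=x_clean)

print("Clean accuracy:           ", accuracy(classifier, x_clean, y_true))
print("Adversarial, no defense:  ", accuracy(classifier, x_adv, y_true))
print("Adversarial, with defense:", accuracy(classifier, defend(x_adv), y_true))
```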
Visual and Multimedia Recognition © Course held by Professor Alberto Del Bimbo - Computer Engineering Master Degree @University of Florence