Fawkes
Fawkes is a privacy protection system developed by researchers at SANDLab,
University of Chicago. For more information about the project, please refer to our
project webpage. Contact us at fawkes-team@googlegroups.com.
We published an academic paper summarizing our work, "Fawkes: Protecting Personal Privacy against Unauthorized Deep Learning Models", at USENIX Security 2020.
Copyright
This code is intended only for personal privacy protection or academic research.
Usage
$ fawkes
Options:
- -m, --mode: the tradeoff between privacy and perturbation size. Select from low, mid, high. The higher the mode, the more perturbation is added to the image, giving stronger protection.
- -d, --directory: the directory with images to run protection on.
- -g, --gpu: the GPU id when using a GPU for optimization.
- --batch-size: the number of images to run optimization on together. Change to >1 only if you have extremely powerful compute.
- --format: format of the output image (png or jpg).
Example
fawkes -d ./imgs --mode low
or python3 protection.py -d ./imgs --mode low
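If you want to drive the CLI from a script, a minimal Python wrapper is sketched below. It assumes the fawkes binary installed from PyPI is on your PATH; the flags mirror the options documented above.

import subprocess
from pathlib import Path

def protect_directory(img_dir, mode="low", fmt="png"):
    """Run the fawkes CLI over every image in img_dir."""
    if not Path(img_dir).is_dir():
        raise FileNotFoundError(f"no such directory: {img_dir}")
    # -d, --mode, and --format are the options documented above.
    subprocess.run(
        ["fawkes", "-d", str(img_dir), "--mode", mode, "--format", fmt],
        check=True,
    )

protect_directory("./imgs", mode="low")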
Tips
- The perturbation generation takes ~60 seconds per image on a CPU machine; it is much faster on a GPU machine. Use batch-size=1 on CPU and batch-size>1 on GPUs.
- Run on GPU. The current Fawkes package and binary do not support GPU. To use GPU, clone this repo, install the required packages in setup.py, and replace tensorflow with tensorflow-gpu. Then you can run Fawkes via python3 fawkes/protection.py [args] (a quick GPU-visibility check is sketched below).
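Before switching to the GPU path, it can help to confirm that TensorFlow actually sees a GPU. A minimal check, assuming a TensorFlow 2.x build (e.g. tensorflow-gpu as described above):

import tensorflow as tf

# List the GPUs visible to TensorFlow; an empty list means CPU-only.
gpus = tf.config.list_physical_devices("GPU")
if gpus:
    print(f"{len(gpus)} GPU(s) available:", [g.name for g in gpus])
else:
    print("No GPU visible; Fawkes will run on the (much slower) CPU path.")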
How do I know my images are secure?
We are actively working on this. Python scripts that can test the protection effectiveness will be ready shortly.
Quick Installation
Install from PyPI:
pip install fawkes
If you don't have root privileges, try installing into your user site-packages instead: pip install --user fawkes.
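To confirm the installation succeeded, you can query the installed package metadata. A small sanity check using only the standard library (Python 3.8+):

import importlib.metadata

try:
    print("fawkes", importlib.metadata.version("fawkes"), "is installed")
except importlib.metadata.PackageNotFoundError:
    print("fawkes not found; try: pip install --user fawkes")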
Academic Research Usage
For academic researchers, whether seeking to improve Fawkes or to explore potential vulnerabilities, please refer to the following guide to test Fawkes.
To protect a class in a dataset, first move that label's images to a separate location and run Fawkes on them. Please use the --debug option and set --batch-size to a reasonable number (e.g., 16 or 32). If the images are already cropped and aligned, also use the --no-align option; a sketch of this workflow is given below.
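A sketch of that workflow, assuming a dataset laid out as one folder per label. The paths and label name are hypothetical, and the exact flag spellings should be checked against protection.py:

import shutil
import subprocess
from pathlib import Path

DATASET = Path("./dataset")        # hypothetical dataset root, one folder per label
LABEL = "person_a"                 # hypothetical label to protect
WORKDIR = Path("./to_protect") / LABEL

# 1. Copy the target label's images to a separate location.
WORKDIR.mkdir(parents=True, exist_ok=True)
for img in (DATASET / LABEL).glob("*.jpg"):
    shutil.copy(img, WORKDIR / img.name)

# 2. Run Fawkes with --debug and a reasonable batch size; add --no-align
#    only if the images are already cropped and aligned.
subprocess.run(
    ["python3", "fawkes/protection.py",
     "-d", str(WORKDIR),
     "--batch-size", "16",
     "--debug",
     "--no-align"],
    check=True,
)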
Citation
@inproceedings{shan2020fawkes,
title={Fawkes: Protecting Personal Privacy against Unauthorized Deep Learning Models},
author={Shan, Shawn and Wenger, Emily and Zhang, Jiayun and Li, Huiying and Zheng, Haitao and Zhao, Ben Y},
booktitle={Proc. of {USENIX} Security},
year={2020}
}
About the tool
Github stars: 4488
Github forks: 443