Catalogue of Tools & Metrics for Trustworthy AI

These tools and metrics are designed to help AI actors develop and use trustworthy AI systems and applications that respect human rights and are fair, transparent, explainable, robust, secure and safe.

Robust Out-of-distribution Detection in Neural Networks

This is the code repository for the paper Robust Out-of-distribution Detection in Neural Networks. Some of the code is adapted from ODIN, Outlier Exposure, and the deep Mahalanobis detector.

Preliminaries

The code is tested under Ubuntu Linux 16.04.1 with Python 3.6 and requires several additional packages to be installed.
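
The required packages are not enumerated here; as a rough, assumed starting point (the package names below are not taken from the repository), a PyTorch-based setup along these lines should cover the scripts that follow:

pip install torch torchvision numpy scipy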

Downloading in-distribution Dataset

  • CIFAR: included in PyTorch.
  • GTSRB: we provide scripts to download it.
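
Since CIFAR ships with PyTorch, it can be fetched directly through torchvision. A minimal, illustrative sketch (the repository's own loaders, root path, and transforms may differ):

import torchvision
import torchvision.transforms as transforms

# Download the CIFAR-10 train and test splits to ./data (illustrative path).
transform = transforms.ToTensor()
train_set = torchvision.datasets.CIFAR10(root='./data', train=True, download=True, transform=transform)
test_set = torchvision.datasets.CIFAR10(root='./data', train=False, download=True, transform=transform)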

Downloading out-of-distribution Datasets

Overview of the Code

  • robust_ood_train.py: the script used to train the different models.
  • eval.py: the script used to evaluate the classification accuracy and robustness of models.
  • eval_ood_detection.py: the script used to evaluate the OOD detection performance of models.

Running Experiments

  • For the SVHN dataset, run select_svhn_data.py to select the test data.
  • For the GTSRB dataset, run prepare_data.sh to download and prepare the dataset.
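
For example (the exact invocations and any required arguments may differ; check each script):

python select_svhn_data.py
bash prepare_data.sh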

Example

For CIFAR-10 experiments, run the following commands in the CIFAR directory.

  • Train an ALOE model:

python robust_ood_train.py --name ALOE --adv --ood

  • Train an AOE model:

python robust_ood_train.py --name AOE --adv --adv-only-in --ood

  • Train an ADV model:

python robust_ood_train.py --name ADV --adv

  • Train an OE model:

python robust_ood_train.py --name OE --ood

  • Train an Original model:

python robust_ood_train.py --name Original

  • Evaluate the classification performance of the ALOE model:

python eval.py --name ALOE --adv

  • Evaluate the traditional OOD detection performance of MSP and ODIN using the ALOE model:

python eval_ood_detection.py --name ALOE --method msp_and_odin

  • Evaluate the robust OOD detection performance of MSP and ODIN using the ALOE model:

python eval_ood_detection.py --name ALOE --method msp_and_odin --adv
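
For reference, MSP scores an input by the maximum softmax probability of the classifier, while ODIN adds temperature scaling and a small input perturbation before taking that maximum. The PyTorch sketch below is a generic illustration of the two scores, not the implementation in eval_ood_detection.py; the temperature T and perturbation size eps are common defaults from the ODIN paper, not necessarily this repository's settings.

import torch
import torch.nn.functional as F

def msp_score(model, x):
    # MSP: maximum softmax probability over classes.
    with torch.no_grad():
        return F.softmax(model(x), dim=1).max(dim=1).values

def odin_score(model, x, T=1000.0, eps=0.0014):
    # ODIN: nudge the input in the direction that increases the
    # temperature-scaled max softmax, then score the perturbed input.
    x = x.clone().detach().requires_grad_(True)
    log_probs = F.log_softmax(model(x) / T, dim=1)
    log_probs.max(dim=1).values.sum().backward()
    x_pert = (x + eps * x.grad.sign()).detach()
    with torch.no_grad():
        return F.softmax(model(x_pert) / T, dim=1).max(dim=1).values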

  • Evaluate the traditional OOD detection performance of Mahalanobis using the Original model:

python eval_ood_detection.py --name Original --method mahalanobis

  • Evaluate the robust OOD detection performance of Mahalanobis using the Original model:

python eval_ood_detection.py --name Original --method mahalanobis --adv
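
For reference, the Mahalanobis detector fits class-conditional Gaussians with a shared covariance to intermediate features of the network and scores an input by its distance to the nearest class mean. A generic NumPy sketch (not the repository's implementation; the feature vector, class means, and shared covariance are assumed to be estimated beforehand from training data):

import numpy as np

def mahalanobis_score(feature, class_means, shared_cov):
    # Higher score = more in-distribution: negative squared Mahalanobis
    # distance to the nearest class-conditional Gaussian (shared covariance).
    precision = np.linalg.inv(shared_cov)
    dists = [float((feature - mu) @ precision @ (feature - mu)) for mu in class_means]
    return -min(dists)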

Citation

Please cite our work if you use the codebase:

@article{chen2020robust,
  title={Robust Out-of-distribution Detection in Neural Networks},
  author={Chen, Jiefeng and Wu, Xi and Liang, Yingyu and Jha, Somesh and others},
  journal={arXiv preprint arXiv:2003.09711},
  year={2020}
}

License

Please refer to the LICENSE.

About the tool


Github stars:

  • 55

Github forks:

  • 6


Use Cases

There are no use cases for this tool yet.

Would you like to submit a use case for this tool?

If you have used this tool, we would love to know more about your experience.


Disclaimer: The tools and metrics featured herein are solely those of the originating authors and are not vetted or endorsed by the OECD or its member countries. The Organisation cannot be held responsible for possible issues resulting from the posting of links to third parties' tools and metrics on this catalogue. More on the methodology can be found at https://oecd.ai/catalogue/faq.