Principled Detection of Out-of-Distribution Examples in Neural Networks

Overview

ODIN: Out-of-Distribution Detector for Neural Networks

This is a PyTorch implementation for detecting out-of-distribution examples in neural networks. The method is described in the paper Principled Detection of Out-of-Distribution Examples in Neural Networks by Shiyu Liang, Yixuan Li, and R. Srikant. On a DenseNet applied to CIFAR-10, the method reduces the false positive rate from the 34.7% baseline to 4.3% at a true positive rate of 95%.
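
For orientation, here is a minimal sketch of the method's two ingredients, temperature scaling and input perturbation, assuming a generic PyTorch classifier that returns logits; the function name, defaults, and structure are illustrative rather than the repository's actual API.

import torch
import torch.nn.functional as F

def odin_score(model, x, temperature=1000.0, magnitude=0.0014):
    # Sketch of the ODIN score: temperature-scaled softmax evaluated on a
    # slightly perturbed input. `model` is assumed to be in eval mode.
    x = x.detach().clone().requires_grad_(True)
    logits = model(x) / temperature
    # Gradient of the cross-entropy loss at the predicted class w.r.t. the
    # input; stepping against it increases the top softmax probability.
    loss = F.cross_entropy(logits, logits.argmax(dim=1))
    loss.backward()
    x_perturbed = x - magnitude * x.grad.sign()
    with torch.no_grad():
        probs = F.softmax(model(x_perturbed) / temperature, dim=1)
    # Higher score -> more likely in-distribution; threshold to detect OOD.
    return probs.max(dim=1).values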

Experimental Results

We used two neural network architectures, DenseNet-BC and Wide ResNet. The PyTorch implementation of DenseNet-BC is provided by Andreas Veit and Brandon Amos, and the PyTorch implementation of Wide ResNet by Sergey Zagoruyko. The experimental results are shown in the performance figure in the repository; the definition of each metric can be found in the paper.

Pre-trained Models

We provide four pre-trained neural networks: (1) two DenseNet-BC networks trained on CIFAR-10 and CIFAR-100, respectively; and (2) two Wide ResNet networks trained on CIFAR-10 and CIFAR-100, respectively. The test error rates (%) are:

Architecture   CIFAR-10   CIFAR-100
DenseNet-BC    4.81       22.37
Wide ResNet    3.71       19.86

Running the code

Dependencies

  • CUDA 8.0

  • PyTorch

  • Anaconda2 or 3

  • At least three GPUs

    Note: reproducing the DenseNet-BC results requires only one GPU, but reproducing the Wide ResNet results requires three GPUs. A single-GPU version for Wide ResNet will be released soon.

Downloading Out-of-Distribution Datasets

We provide download links for five out-of-distribution datasets.

Here is an example of downloading the Tiny-ImageNet (crop) dataset. In the root directory, run

mkdir data
cd data
wget https://www.dropbox.com/s/avgm2u562itwpkl/Imagenet.tar.gz
tar -xvzf Imagenet.tar.gz
cd ..

Downloading Neural Network Models

We provide download links for the four pre-trained models.

Here is an example of downloading the DenseNet-BC model trained on CIFAR-10. In the root directory, run

mkdir models
cd models
wget https://www.dropbox.com/s/wr4kjintq1tmorr/densenet10.pth.tar.gz
tar -xvzf densenet10.pth.tar.gz
cd ..
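
Once extracted, the checkpoint can be loaded in PyTorch. A minimal loading sketch, assuming the archive extracts to models/densenet10.pth and stores a fully serialized module rather than a bare state_dict (adjust if the contents differ):

import torch

# Assumption: the archive extracts to models/densenet10.pth and contains a
# fully serialized module; loading it requires the model class on the path.
model = torch.load("models/densenet10.pth", map_location="cpu")
model.eval()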

Running

Here is an example of reproducing the results of DenseNet-BC trained on CIFAR-10, with Tiny-ImageNet (crop) as the out-of-distribution dataset. The temperature is set to 1000 and the perturbation magnitude to 0.0014. In the root directory, run

cd code
# model: DenseNet-BC, in-distribution: CIFAR-10, out-distribution: TinyImageNet (crop)
# magnitude: 0.0014, temperature 1000, gpu: 0
python main.py --nn densenet10 --out_dataset Imagenet --magnitude 0.0014 --temperature 1000 --gpu 0

Note: please choose the arguments according to the tables below.

args

  • args.nn: the arguments of neural networks are shown as follows

    Neural Network Models              args.nn
    DenseNet-BC trained on CIFAR-10    densenet10
    DenseNet-BC trained on CIFAR-100   densenet100
    Wide ResNet trained on CIFAR-10    wideresnet10
    Wide ResNet trained on CIFAR-100   wideresnet100
  • args.out_dataset: the arguments of out-of-distribution datasets are shown as follows

    Out-of-Distribution Datasets   args.out_dataset
    Tiny-ImageNet (crop)           Imagenet
    Tiny-ImageNet (resize)         Imagenet_resize
    LSUN (crop)                    LSUN
    LSUN (resize)                  LSUN_resize
    iSUN                           iSUN
    Uniform random noise           Uniform
    Gaussian random noise          Gaussian
  • args.magnitude: the optimal noise magnitude for each model/dataset pair can be found below. In practice, the optimal noise magnitude is model-specific and needs to be tuned accordingly; a tuning sketch follows this list.

    Out-of-Distribution Datasets   densenet10   densenet100   wideresnet10   wideresnet100
    Tiny-ImageNet (crop)           0.0014       0.0014        0.0005         0.0028
    Tiny-ImageNet (resize)         0.0014       0.0028        0.0011         0.0028
    LSUN (crop)                    0            0.0028        0              0.0048
    LSUN (resize)                  0.0014       0.0028        0.0006         0.002
    iSUN                           0.0014       0.0028        0.0008         0.0028
    Uniform random noise           0.0014       0.0028        0.0014         0.0028
    Gaussian random noise          0.0014       0.0028        0.0014         0.0028
  • args.temperature: temperature is set to 1000 in all cases.

  • args.gpu: make sure to use the following GPU when running the code:

    Neural Network Models   args.gpu
    densenet10              0
    densenet100             0
    wideresnet10            1
    wideresnet100           2
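
Since the optimal noise magnitude is model-specific, one practical way to tune it is to sweep a grid of candidate values on small held-out in- and out-of-distribution sets and keep the value with the lowest FPR at 95% TPR. A hedged sketch, assuming a score_fn(x, magnitude) that returns ODIN scores as an array (e.g. the odin_score sketch in the Overview); the 21-point grid over [0, 0.004] is an assumption, not necessarily the repository's grid.

import numpy as np

def fpr_at_95_tpr(scores_in, scores_out):
    # Threshold that keeps 95% of in-distribution scores above it,
    # i.e. the operating point with 95% TPR on in-distribution data.
    threshold = np.percentile(scores_in, 5)
    return float(np.mean(scores_out >= threshold))

def tune_magnitude(score_fn, x_in, x_out, candidates=np.linspace(0.0, 0.004, 21)):
    # Keep the candidate magnitude with the lowest FPR at 95% TPR on the
    # held-out batches; x_in/x_out are illustrative validation inputs.
    return min(candidates,
               key=lambda eps: fpr_at_95_tpr(np.asarray(score_fn(x_in, eps)),
                                             np.asarray(score_fn(x_out, eps))))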

Outputs

Here is an example of the output.

Neural network architecture:          DenseNet-BC-100
In-distribution dataset:                     CIFAR-10
Out-of-distribution dataset:     Tiny-ImageNet (crop)

                          Baseline         Our Method
FPR at TPR 95%:              34.8%               4.3% 
Detection error:              9.9%               4.6%
AUROC:                       95.3%              99.1%
AUPR In:                     96.4%              99.2%
AUPR Out:                    93.8%              99.1%
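
For reference, here is a minimal sketch of how metrics like those above can be computed from detector scores, assuming NumPy arrays scores_in and scores_out in which higher scores indicate in-distribution inputs; this is illustrative, not the repository's evaluation code.

import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

def detection_metrics(scores_in, scores_out):
    scores = np.concatenate([scores_in, scores_out])
    labels = np.concatenate([np.ones(len(scores_in)), np.zeros(len(scores_out))])
    # FPR at TPR 95%: fraction of OOD inputs above the threshold that
    # retains 95% of in-distribution inputs.
    threshold = np.percentile(scores_in, 5)
    fpr95 = float(np.mean(scores_out >= threshold))
    # Detection error at that threshold, assuming equal base rates.
    det_err = 0.5 * (1.0 - float(np.mean(scores_in >= threshold))) + 0.5 * fpr95
    return {
        "FPR at TPR 95%": fpr95,
        "Detection error": det_err,
        "AUROC": roc_auc_score(labels, scores),
        "AUPR In": average_precision_score(labels, scores),        # in-distribution as positive
        "AUPR Out": average_precision_score(1 - labels, -scores),  # OOD as positive
    }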