Bayesian Neural Networks in PyTorch

Overview

We present a new scheme to compute the Monte Carlo estimator in Bayesian variational inference (VI) settings with almost no GPU memory cost, regardless of the number of samples. Our method is described in our UAI 2021 paper: "Graph Reparameterizations for Enabling 1000+ Monte Carlo Iterations in Bayesian Deep Neural Networks".

In addition, we provide an implementation framework to make your deterministic network Bayesian in PyTorch.

If you like our work, please give the repository a star. If you use our code in your research, please cite the paper above.

Bayesify your Neural Network

Three main files help you Bayesify your deterministic network:

  1. bayes_layers.py - contains Bayesian implementations of convolution (1d, 2d, 3d, transpose) and linear layers, with an approximate posterior from the location-scale family, i.e. one parameterized by mu and sigma. The definitions are general and independent of a specific distribution, as long as it has the two parameters mu and sigma. The layers use the forward method defined in vi_posteriors.py. One of the main arguments of the redefined classes is approx_post, which selects the posterior class from vi_posteriors.py. Specify this name exactly as the class is defined in vi_posteriors.py; for example, if vi_posteriors.py contains a class Gaus, then approx_post='Gaus'. A brief construction sketch follows.
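
A minimal construction sketch, assuming any remaining options (priors, number of Monte Carlo iterations, etc.) are passed as extra keyword arguments:

import bayes_layers as bl

# Bayesian linear layer with a Gaussian approximate posterior
fc = bl.Linear(784, 10, approx_post='Gaus')
# Bayesian 2d convolution with a Radial approximate posterior
conv = bl.Conv2d(3, 64, kernel_size=3, approx_post='Radial')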

  2. vi_posteriors.py - describes the forward method, including the KL term, for different approximate posterior distributions. The current implementation contains the following distributions:

  • Radial
  • Gaus

If you would like to implement your own distribution class, copy one of the defined classes in vi_posteriors.py and redefine the following functions: forward(obj, x, fun=""), get_kl(obj, n_mc_iter, device). A skeleton is sketched below.
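
A skeleton of such a class, following the two signatures listed above; the interface is assumed to mirror the provided Gaus and Radial classes, and the bodies are left for you to fill in:

class MyPosterior:
    @staticmethod
    def forward(obj, x, fun=""):
        # obj is the Bayesian layer holding the mu/sigma parameters;
        # sample weights from the posterior, apply the layer operation
        # indicated by fun to x, and return the output (plus the KL term
        # when KL computation is enabled)
        raise NotImplementedError

    @staticmethod
    def get_kl(obj, n_mc_iter, device):
        # return the KL divergence between the approximate posterior and
        # the prior, estimated with n_mc_iter Monte Carlo samples on device
        raise NotImplementedError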

It also contains a useful Utils class which provides:

  • loss functions:
    • get_loss_categorical
    • get_loss_normal
  • beta coefficients for the KL term: get_beta
  • a switch to turn computing the KL term on/off: set_compute_kl. This is useful during testing/evaluation, when the KL term does not need to be computed, and it speeds up computation (a usage sketch follows this list).
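
A hypothetical usage sketch; the exact signatures of get_beta and set_compute_kl may differ, so check vi_posteriors.py:

import vi_posteriors as vi

# assumed call: disable KL computation for evaluation, re-enable for training
vi.Utils.set_compute_kl(model, False)

# assumed call: KL weight (beta) for the current epoch
beta = vi.Utils.get_beta(epoch, n_epochs)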

Below is an example of how to Bayesify your own network. Note the forward method, which handles layers that are not Bayesian and therefore do not return a KL term, e.g. ReLU(x).

import torch.nn as nn
import bayes_layers as bl  # provides the Bayesian layer definitions
class YourBayesNet(nn.Module):
    def __init__(self, num_classes, in_channels, 
                 **bayes_args):
        super(YourBayesNet, self).__init__()
        self.conv1 = bl.Conv2d(in_channels, 64,
                               kernel_size=11, stride=4,
                               padding=5,
                               **bayes_args)
        # in_features must match the flattened output of the conv stack
        self.classifier = bl.Linear(1*1*128,
                                    num_classes,
                                    **bayes_args)
        self.layers = [self.conv1, nn.ReLU()]  # classifier is applied after flattening
        
    def forward(self, x):
        kl = 0
        for layer in self.layers:
            tmp = layer(x)
            if isinstance(tmp, tuple):
                x, kl_ = tmp
                kl += kl_
            else:
                x = tmp

        x = x.view(x.size(0), -1)
        logits, _kl = self.classifier(x)
        kl += _kl
        
        return logits, kl
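
A possible instantiation, assuming **bayes_args simply forwards posterior options such as approx_post to every Bayesian layer:

bayes_args = {"approx_post": "Gaus"}
model = YourBayesNet(num_classes=10, in_channels=3, **bayes_args)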

Then, later in the main file during training, you can either use one of the loss functions defined in Utils, as follows:

import vi_posteriors as vi  # Utils is defined here

output, kl = model(inputs)
kl = kl.mean()  # needed if several GPUs are used to split the minibatch

loss, _ = vi.Utils.get_loss_categorical(kl, output, targets, beta=beta) 
#loss, _ = vi.Utils.get_loss_normal(kl, output, targets, beta=beta) 
loss.backward()

or design your own, e.g.

loss = kl_coef*kl - loglikelihood
loss.backward()
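
For classification, the same loss can be assembled explicitly. This is only a sketch: kl_coef plays the role of beta, and output is assumed to hold the logits.

import torch.nn.functional as F

loglikelihood = -F.cross_entropy(output, targets, reduction='sum')
loss = kl_coef * kl - loglikelihood
loss.backward()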

  3. uncertainty_estimate.py - describes a set of functions to perform uncertainty estimation, e.g.
  • get_prediction_class - returns the most common predicted class across MC iterations (see the sketch after this list)
  • summary_class - creates a summary file with statistics
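
A hypothetical sketch of the underlying Monte Carlo procedure; the actual helpers in uncertainty_estimate.py may have different signatures:

import torch

preds = []
with torch.no_grad():
    for _ in range(n_mc_iter):  # each forward pass draws fresh weight samples
        logits, _ = model(inputs)
        preds.append(logits.argmax(dim=1))
preds = torch.stack(preds)  # shape: (n_mc_iter, batch_size)
final_class = preds.mode(dim=0).values  # most common class per input across iterations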

Currently implemented networks for different problems

Classification

The script bayesian_dnn_class/main.py is the main executable, and all standard DNN models are located in bayesian_dnn_class/models:

  • AlexNet
  • Fully Connected
  • DenseNet
  • ResNet
  • VGG