Graph Posterior Network: Bayesian Predictive Uncertainty for Node Classification (NeurIPS 2021)

Overview

Graph Posterior Network

This is the official code repository for the paper

Graph Posterior Network: Bayesian Predictive Uncertainty for Node Classification
Maximilian Stadler, Bertrand Charpentier, Simon Geisler, Daniel Zügner, Stephan Günnemann
Conference on Neural Information Processing Systems (NeurIPS) 2021.

[Paper] | [Video - coming soon]

Diagram

Installation

We recommend running this code with its dependencies in a conda environment. To begin, create a new conda environment with all the necessary dependencies, assuming that you are in the root directory of this project:

conda env create -f gpn_environment.yml python==3.8 --force

Since the code is packaged, you will also have to set up the code accordingly. Assuming that you are in the root directory of this project, run:

conda activate gpn
pip3 install -e .
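
To verify the installation, you can check that the package is importable (an optional sanity check, not a required step from the repository):

python3 -c "import gpn"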

Data

Since we rely on published datasets from the Torch-Geometric package, you don't have to download datasets manually. When you run experiments on supported datasets, they will be downloaded automatically and placed in the corresponding data directories (see the sketch after the following list). You can run experiments on the following datasets:

  • CoraML
  • CiteSeer
  • PubMed
  • AmazonPhotos
  • AmazonComputers
  • CoauthorCS
  • CoauthorPhysics
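
For illustration, these datasets correspond to standard Torch-Geometric dataset classes and can also be fetched directly, as in the following minimal sketch (this shows the underlying loaders only; it is not this repository's own data pipeline, and the exact dataset variants used here may differ):

from torch_geometric.datasets import Amazon, CitationFull, Coauthor, Planetoid

root = './data'  # data directory used for downloads
datasets = {
    'CoraML': CitationFull(root, name='Cora_ML'),
    'CiteSeer': Planetoid(root, name='CiteSeer'),
    'PubMed': Planetoid(root, name='PubMed'),
    'AmazonPhotos': Amazon(root, name='Photo'),
    'AmazonComputers': Amazon(root, name='Computers'),
    'CoauthorCS': Coauthor(root, name='CS'),
    'CoauthorPhysics': Coauthor(root, name='Physics'),
}
print({name: data[0] for name, data in datasets.items()})  # one graph per dataset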

Running Experiments

The experimental setup builds upon Sacred, with experiments configured in .yaml files. We provide configurations

  • for vanilla node classification
  • for leave-out-class experiments
  • for experiments with isolated node perturbations
  • for experiments with feature shifts
  • for experiments with edge shifts

with a default fraction of perturbed nodes of 10%. We provide them for the smaller datasets (i.e. all except ogbn-arxiv) for hidden dimensions H=10 and H=16.

The main experimental script is train_and_eval.py. Assuming that you are in the root directory of this project for all further commands, you can run experiments as follows.

Vanilla Node Classification

For vanilla node classification on the CoraML dataset with a hidden dimension of 16 or 10, respectively, run

python3 train_and_eval.py with configs/gpn/classification_gpn_16.yaml data.dataset=CoraML
python3 train_and_eval.py with configs/gpn/classification_gpn_10.yaml data.dataset=CoraML

If you have GPU devices available on your system, experiments will run on device 0 by default. If no CUDA devices can be found, the code will fall back to running on CPUs only. Runs produce assets by default. Also note that for running experiments on graphs under perturbations, you will have to run the corresponding vanilla classification experiment first.
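
For reference, the device fallback behaves like the following minimal sketch (illustrative only, not the repository's exact code):

import torch

# Use GPU 0 if CUDA is available, otherwise fall back to CPU.
device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')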

Options for Feature Shifts

We consider random features drawn from a unit Gaussian distribution (normal) and from a Bernoulli distribution (bernoulli_0.5). When using the configuration ood_features, you can change those settings (key ood_perturbation_type) on the command line together with the fraction of perturbed nodes (key ood_budget_per_graph), or in the corresponding configuration files, for example as

python3 train_and_eval.py with configs/gpn/ood_features_gpn_16.yaml data.dataset=CoraML data.ood_perturbation_type=normal data.ood_budget_per_graph=0.025
python3 train_and_eval.py with configs/gpn/ood_features_gpn_16.yaml data.dataset=CoraML data.ood_perturbation_type=bernoulli_0.5 data.ood_budget_per_graph=0.025
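
Conceptually, both feature shifts replace the features of a random subset of nodes, as in the following sketch (a minimal illustration with our own variable names, not the repository's implementation):

import torch

def perturb_features(x, budget=0.1, kind='normal'):
    # x: node feature matrix of shape [num_nodes, num_features]
    num_nodes, num_features = x.size()
    n_perturb = int(budget * num_nodes)           # fraction of perturbed nodes
    idx = torch.randperm(num_nodes)[:n_perturb]   # random subset of nodes
    x = x.clone()
    if kind == 'normal':                          # unit Gaussian features
        x[idx] = torch.randn(n_perturb, num_features)
    elif kind == 'bernoulli_0.5':                 # Bernoulli(0.5) features
        x[idx] = torch.bernoulli(torch.full((n_perturb, num_features), 0.5))
    return x, idx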

For experiments considering perturbations in an isolated fashion, the same applies, but without the fraction of perturbed nodes, e.g.

python3 train_and_eval.py with configs/gpn/ood_isolated_gpn_16.yaml data.dataset=CoraML data.ood_perturbation_type=normal
python3 train_and_eval.py with configs/gpn/ood_isolated_gpn_16.yaml data.dataset=CoraML data.ood_perturbation_type=bernoulli_0.5

Options for Edge Shifts

We consider random edge perturbations and the global, untargeted DICE attack. The attack is selected with the key ood_type, which can be set to either random_attack_dice or random_edge_perturbations. As above, these settings can be changed on the command line or in the corresponding configuration files. While the key ood_budget_per_graph refers to the fraction of perturbed nodes in the paragraph above, here it describes the fraction of perturbed edges.

python3 train_and_eval.py with configs/gpn/ood_features_gpn_16.yaml data.dataset=CoraML data.ood_type=random_attack_dice data.ood_budget_per_graph=0.025
python3 train_and_eval.py with configs/gpn/ood_features_gpn_16.yaml data.dataset=CoraML data.ood_type=random_edge_perturbations data.ood_budget_per_graph=0.025
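
For intuition, random edge perturbations replace a fraction of edges with random ones, as in the sketch below (illustrative only; the DICE attack additionally uses labels, deleting edges between nodes of the same class and inserting edges between nodes of different classes):

import torch

def random_edge_perturbation(edge_index, num_nodes, budget=0.1):
    # edge_index: edge list of shape [2, num_edges]
    num_edges = edge_index.size(1)
    n_perturb = int(budget * num_edges)           # fraction of perturbed edges
    keep = torch.randperm(num_edges)[n_perturb:]  # edges left untouched
    random_edges = torch.randint(0, num_nodes, (2, n_perturb))
    return torch.cat([edge_index[:, keep], random_edges], dim=1)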

Further Options

With the settings above, you can reproduce our experimental results. If you want to change architectural settings, simply modify the corresponding keys in the configuration files; most of them are self-explanatory.
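
Since Sacred parses dotted keys on the command line, such keys can also be overridden there, e.g. (the key model.dim_hidden is a hypothetical example; check the configuration files for the actual key names):

python3 train_and_eval.py with configs/gpn/classification_gpn_16.yaml data.dataset=CoraML model.dim_hidden=16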

Structure

If you want to take a detailed look at our code, the following gives a brief overview of the repository structure.

  • configs: directory for model configurations
  • data: directory for datasets
  • gpn: source code
    • gpn.data: code related to loading datasets and creating ID and OOD datasets
    • gpn.distributions: code related to custom distributions similar to torch.distributions
    • gpn.experiments: main routines for running experiments, i.e. loading configs, setting up datasets and models, training, and evaluation
    • gpn.layers: custom layers
    • gpn.models: implementation of reference models and Graph Posterior Network (+ablated models)
    • gpn.nn: training related utilities like losses, metrics, or training engines
    • gpn.utils: general utility code
  • saved_experiments: directory for saved models
  • train_and_eval.py: main script for training & evaluation
  • gpn_qualitative_evaluation.ipynb: jupyter notebook which evaluates the results from Graph Posterior Network in a qualitative fashion

Note that we provide implementations of most of the reference models we used. Our main Graph Posterior Network model can be found in gpn.models.gpn_base.py. Ablated models can be found in a similar fashion, i.e. PostNet in gpn.models.gpn_postnet.py, PostNet+diffusion in gpn.models.gpn_postnet_diff.py, and the model diffusing log-beta scores in gpn.models.gpn_log_beta.py.

We provide all basic configurations for reference models in configs/reference. Note that some models depend on others: running classification_gcn_dropout.yaml or classification_gcn_energy.yaml requires training the underlying GCN first by running classification_gcn.yaml; running classification_gcn_ensemble.yaml requires training 10 GCNs first with init_no in 1...10; and running classification_sgcn.yaml (GKDE-GCN) requires training the teacher GCN by running classification_gcn.yaml and computing the kernel values by running classification_gdk.yaml first.
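
For example, for the GKDE-GCN the resulting training order would be as follows (the exact file names should be checked against configs/reference):

python3 train_and_eval.py with configs/reference/classification_gcn.yaml data.dataset=CoraML
python3 train_and_eval.py with configs/reference/classification_gdk.yaml data.dataset=CoraML
python3 train_and_eval.py with configs/reference/classification_sgcn.yaml data.dataset=CoraML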

Cite

Please cite our paper if you use the model or this code in your own work.

@incollection{graph-postnet,
  title     = {Graph Posterior Network: Bayesian Predictive Uncertainty for Node Classification},
  author    = {Stadler, Maximilian and Charpentier, Bertrand and Geisler, Simon and Z{\"u}gner, Daniel and G{\"u}nnemann, Stephan},
  booktitle = {Advances in Neural Information Processing Systems},
  volume    = {34},
  publisher = {Curran Associates, Inc.},
  year      = {2021}
}