Experiments for the Neural Flows paper

Overview

Neural Flows: Efficient Alternative to Neural ODEs [arxiv]

TL;DR: We directly model the solutions of neural ODEs with neural flows, which is much faster and achieves better results on time series applications because it avoids expensive numerical solvers.


Marin Biloš, Johanna Sommer, Syama Sundar Rangapuram, Tim Januschowski, Stephan Günnemann

Abstract: Neural ordinary differential equations describe how values change in time. This is the reason why they gained importance in modeling sequential data, especially when the observations are made at irregular intervals. In this paper we propose an alternative by directly modeling the solution curves - the flow of an ODE - with a neural network. This immediately eliminates the need for expensive numerical solvers while still maintaining the modeling capability of neural ODEs. We propose several flow architectures suitable for different applications by establishing precise conditions on when a function defines a valid flow. Apart from computational efficiency, we also provide empirical evidence of favorable generalization performance via applications in time series modeling, forecasting, and density estimation.

This repository acts as supplementary material and implements the models and experiments described in the main paper. The model definitions rely on the stribor package for normalizing and neural flows. The baselines use the torchdiffeq package for differentiable ODE solvers.
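The baselines integrate the ODE dynamics with a numerical solver at every evaluation, while neural flows compute the solution directly. As a point of reference, the following is a minimal sketch (not taken from this repository) of how a baseline evaluates an initial value problem with torchdiffeq; the ODEFunc module here is a hypothetical dynamics network chosen only for illustration.

import torch
import torch.nn as nn
from torchdiffeq import odeint  # differentiable ODE solvers used by the baselines

class ODEFunc(nn.Module):
    # Hypothetical dynamics network: maps (t, x) to dx/dt.
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, x):
        return self.net(x)

dim = 4
func = ODEFunc(dim)
x0 = torch.randn(3, dim)      # Initial conditions at t=0
t = torch.linspace(0, 1, 10)  # Time points at which the IVP is evaluated

# The solver integrates the dynamics step by step, calling func many times ...
xt = odeint(func, x0, t, method='dopri5')  # torch.Size([10, 3, 4])

# ... whereas a neural flow (see Models below) returns the solution curve
# directly with a single forward pass, with no numerical integration.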

Installation

Install the local package nfe (which will also install all the dependencies):

pip install -e .

Download data

Download and preprocess the real-world data and generate the synthetic data (or run the commands in download_all.sh manually):

. scripts/download_all.sh

Many experiments will automatically download the data if it is not already present, so this step is optional.

Note: MIMIC-III and MIMIC-IV have to be downloaded manually. Use the notebooks in data_preproc to preprocess the data.

After downloading everything, your directory tree should look like this:

├── nfe
│   ├── experiments
│   │   ├── base_experiment.py
│   │   ├── data
│   │   │   ├── activity
│   │   │   ├── hopper
│   │   │   ├── mimic3
│   │   │   ├── mimic4
│   │   │   ├── physionet
│   │   │   ├── stpp
│   │   │   ├── synth
│   │   │   └── tpp
│   │   ├── gru_ode_bayes
│   │   ├── latent_ode
│   │   ├── stpp
│   │   ├── synthetic
│   │   └── tpp
│   ├── models
│   └── train.py
├── scripts
│   ├── download_all.sh
│   └── run_all.sh
└── setup.py

Models

Models are located in nfe/models. This directory contains the implementations of CouplingFlow and ResNetFlow, as well as the ODE models and the continuous (ODE- or flow-based) GRU and LSTM layers. Usage examples for CouplingFlow and GRUFlow follow, with a hedged ResNetFlow sketch after them.

Example: Coupling flow

import torch
from nfe import CouplingFlow

dim = 4
model = CouplingFlow(
    dim,
    n_layers=2, # Number of flow layers
    hidden_dims=[32, 32], # Hidden layers in single flow
    time_net='TimeLinear', # Time embedding network
)

t = torch.rand(3, 10, 1) # Time points at which IVP is evaluated
x0 = torch.randn(3, 1, dim) # Initial conditions at t=0

xt = model(x0, t) # IVP solutions at t given x0
xt.shape # torch.Size([3, 10, 4])

Example: GRU flow

import torch
from nfe import GRUFlow

dim = 4
model = GRUFlow(
    dim,
    n_layers=2, # Number of flow layers
    hidden_dims=[32, 32], # Hidden layers in single flow
    time_net='TimeTanh', # Time embedding network
)

t = torch.rand(3, 10, 1) # Time points at which IVP is evaluated
x = torch.randn(3, 10, dim) # Initial conditions, RNN inputs

xt = model(x, t) # IVP solutions at t_i given x_{1:i}
xt.shape # torch.Size([3, 10, 4])
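
Example: ResNet flow (sketch)

nfe/models also implements ResNetFlow. Its exact constructor arguments are not shown in this README; the sketch below assumes it follows the same interface as CouplingFlow above, which is an unverified assumption.

import torch
from nfe import ResNetFlow

dim = 4
model = ResNetFlow(        # Assumed to mirror the CouplingFlow interface above
    dim,
    n_layers=2,            # Number of flow layers
    hidden_dims=[32, 32],  # Hidden layers in single flow
    time_net='TimeTanh',   # Time embedding network
)

t = torch.rand(3, 10, 1)     # Time points at which IVP is evaluated
x0 = torch.randn(3, 1, dim)  # Initial conditions at t=0

xt = model(x0, t)  # IVP solutions at t given x0
xt.shape           # Expected: torch.Size([3, 10, 4])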

Experiments

Run all experiments with . scripts/run_all.sh, or run the individual commands below manually.

Synthetic

Example:

python -m nfe.train --experiment synthetic --data [ellipse|sawtooth|sink|square|triangle] --model [ode|flow] --flow-model [coupling|resnet] --solver [rk4|dopri5]

Smoothing

Example:

python -m nfe.train --experiment latent_ode --data [activity|hopper|physionet] --classify [0|1] --model [ode|flow] --flow-model [coupling|resnet]

Reference:

  • Yulia Rubanova, Ricky T. Q. Chen, David Duvenaud. "Latent ODEs for Irregularly-Sampled Time Series" (2019) [paper]. We adapted the code from here.

Filtering

Request access to MIMIC-III and MIMIC-IV, and download the data locally. Use the notebooks in data_preproc to preprocess it.

Example:

python -m nfe.train --experiment gru_ode_bayes --data [mimic3|mimic4] --model [ode|flow] --odenet gru --flow-model [gru|resnet]

Reference:

  • Edward De Brouwer, Jaak Simm, Adam Arany, Yves Moreau. "GRU-ODE-Bayes: Continuous modeling of sporadically-observed time series" (2019) [paper]. We adapted the code from here.

Temporal point process

Example:

python -m nfe.train --experiment tpp --data [poisson|renewal|hawkes1|hawkes2|mooc|reddit|wiki] --model [rnn|ode|flow] --flow-model [coupling|resnet] --decoder [continuous|mixture] --rnn [gru|lstm] --marks [0|1]
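
For instance, one concrete run of the template above (these option choices are only illustrative, not a recommended configuration):

python -m nfe.train --experiment tpp --data hawkes1 --model flow --flow-model coupling --decoder continuous --rnn gru --marks 0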

Reference:

  • Junteng Jia, Austin R. Benson. "Neural Jump Stochastic Differential Equations" (2019) [paper]. We adapted the code from here.

Spatio-temporal

Example:

python -m nfe.train --experiment stpp --data [bike|covid|earthquake] --model [ode|flow] --density-model [independent|attention]

Reference:

  • Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel. "Neural Spatio-Temporal Point Processes" (2021) [paper]. We adapted the code from here.

Citation

@article{bilos2021neuralflows,
  title={{N}eural Flows: {E}fficient Alternative to Neural {ODE}s},
  author={Bilo{\v{s}}, Marin and Sommer, Johanna and Rangapuram, Syama Sundar and Januschowski, Tim and G{\"u}nnemann, Stephan},
  journal={Advances in Neural Information Processing Systems},
  year={2021}
}