Denoising Normalizing Flow

Overview

Denoising Normalizing Flow

Christian Horvat and Jean-Pascal Pfister 2021

License: MIT

We combine Normalizing Flows (NFs) and Denoising Auto-Encoders (DAEs) by introducing the Denoising Normalizing Flow (DNF), a generative model able to

  1. approximate the data-generating density p(x),
  2. generate new samples from p(x),
  3. infer low-dimensional latent variables.

Since a classical NF degenerates on data that lives on a low-dimensional manifold embedded in a high-dimensional space, the DNF inflates the manifold-valued data with noise and learns a denoising mapping, similar to a DAE.
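
As a rough sketch of the inflation-deflation idea (the precise statement and its conditions on the noise are given in the paper): data on a d-dimensional manifold embedded in R^D is perturbed with Gaussian noise, the inflated density p_sigma is learned with a standard NF, and for small sigma the manifold density is recovered by a simple rescaling.

```latex
% Inflation: perturb a manifold sample x with Gaussian noise
\tilde{x} = x + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, \sigma^{2} I_D)

% Deflation (informal): for small \sigma, with the noise acting (approximately) in the
% directions normal to the d-dimensional manifold \mathcal{M} embedded in \mathbb{R}^D,
p_{\sigma}(x) \;\approx\; p(x)\,(2\pi\sigma^{2})^{-\frac{D-d}{2}}
\quad\Longrightarrow\quad
p(x) \;\approx\; (2\pi\sigma^{2})^{\frac{D-d}{2}}\, p_{\sigma}(x),
\qquad x \in \mathcal{M}.
```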

Related Work

The DNF is closely related to the Manifold Flow introduced by Johann Brehmer and Kyle Cranmer. Our code is built directly on their implementation, with the following additions:

  1. The data can be inflated with Gaussian noise (a minimal sketch of this step follows the list).
  2. We include the DNF as a new mode for the ℳ-flow.
  3. New datasets were added: a thin spiral, a von Mises distribution on a circle, and a mixture of von Mises distributions on a sphere.
  4. A new folder, experiments/plots, was added for generating the figures from the paper.
  5. A new folder, experiments/benchmarks, was added for benchmarking the DNF.
  6. The evaluate.py script was extended and now includes the grid evaluation for the thin spiral and the gan2d image manifold, the latent interpolations, the density estimation for the PAE, the latent density estimation on the thin spiral, and the KS statistics for the circle and sphere experiments.
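
The inflation step referenced in item 1 is conceptually simple: each training point is perturbed with Gaussian noise before being passed to the flow. A minimal PyTorch sketch of this idea follows; the function name and the default sigma are illustrative, not the repository's actual API.

```python
import torch

def inflate(x: torch.Tensor, sigma: float = 0.1) -> torch.Tensor:
    """Perturb manifold-valued data with isotropic Gaussian noise of scale sigma."""
    return x + sigma * torch.randn_like(x)

# Example: inflate a batch of 2-D points sampled near a 1-D manifold (e.g. the thin spiral).
x = torch.randn(128, 2)
x_noisy = inflate(x, sigma=0.05)
```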

The theoretical foundation of the DNF was developed in Density estimation on low-dimensional manifolds: an inflation-deflation approach.

Data sets

We trained the DNF and ℳ-flow on the following datasets:

| Data set | Data dimension | Manifold dimension | Arguments to train.py and evaluate.py |
| --- | --- | --- | --- |
| Thin spiral | 2 | 1 | --dataset thin_spiral |
| 2-D StyleGAN image manifold | 64 x 64 x 3 | 2 | --dataset gan2d |
| 64-D StyleGAN image manifold | 64 x 64 x 3 | 64 | --dataset gan64d |
| CelebA-HQ | 64 x 64 x 3 | ? | --dataset celeba |

To use the model for your own data, you need to create a simulator (see experiments/datasets) and add it to experiments/datasets/__init__.py. If you run into problems, please don't hesitate to contact us.
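
To make this concrete, here is a hypothetical sketch of what such a simulator could look like. The class and method names below are placeholders chosen for illustration; mirror the actual base-class interface defined in experiments/datasets before adding your own.

```python
import numpy as np

class CircleSimulator:
    """Toy example: a von Mises density on the unit circle (d = 1) embedded in R^2."""

    def data_dim(self):
        # Ambient dimension D (method name assumed for illustration).
        return 2

    def latent_dim(self):
        # Manifold dimension d (method name assumed for illustration).
        return 1

    def sample(self, n):
        # Draw n points on the manifold (method name assumed for illustration).
        theta = np.random.vonmises(mu=0.0, kappa=4.0, size=n)
        return np.stack([np.cos(theta), np.sin(theta)], axis=1)
```

The new simulator would then be registered in experiments/datasets/__init__.py so that it can be selected via --dataset, analogously to the datasets in the table above.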

Benchmarks

We benchmark the DNF against the ℳ-flow, the Probabilistic Auto-Encoder (PAE), and the InfoMax Variational Autoencoder. For that, we rely on the original implementations of these models and modify them where appropriate; see experiments/benchmarks/vae and experiments/benchmarks/pae for more details.

Training & Evaluation

The configurations for the models and hyperparameter settings used in the paper can be found in experiments/configs.

Acknowledgements

We thank Johann Brehmer and Kyle Cranmer for publishing their implementation of the Manifold Flow. For the experiments with the Probabilistic Auto-Encoder (V. Böhm and U. Seljak) and the InfoMax Variational Autoencoder (A. L. Rezaabad and S. Vishwanath), we used the official implementations of these models. We thank these authors for making their code available.
