A Python library for Deep Probabilistic Modeling

Overview

Abstract

DeeProb-kit is a Python library that implements deep probabilistic models, such as various kinds of Sum-Product Networks, Normalizing Flows, and their combinations, for probabilistic inference. Some models are implemented in PyTorch for fast training and inference on GPUs.

Features

  • Inference algorithms for SPNs. [1, 4]
  • Learning algorithms for SPN structure. [1, 2, 3, 4]
  • Chow-Liu Trees (CLT) as SPN leaves (a conceptual sketch follows this list). [11, 12]
  • Batch Expectation-Maximization (EM) for SPNs with arbitrary leaves. [13, 14]
  • Structural marginalization and pruning algorithms for SPNs.
  • High-order moments computation for SPNs.
  • JSON I/O operations for SPNs and CLTs. [4]
  • Plotting operations based on NetworkX for SPNs and CLTs. [4]
  • Randomized and Tensorized SPNs (RAT-SPNs) using PyTorch. [5]
  • Masked Autoregressive Flows (MAFs) using PyTorch. [6]
  • Real Non-Volume-Preserving (RealNVP) and Non-linear Independent Component Estimation (NICE) flows. [7, 8]
  • Deep Generalized Convolutional SPNs (DGC-SPNs) using PyTorch. [10]
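The Chow-Liu algorithm mentioned above fits a tree-shaped distribution by connecting the most strongly dependent pairs of variables. As a rough, self-contained illustration (not DeeProb-kit's implementation; function and variable names are made up), the sketch below estimates pairwise mutual information on binary data and extracts a maximum spanning tree with NumPy and SciPy:

```python
# Conceptual sketch of the Chow-Liu algorithm for binary data (illustration only).
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_edges(data, alpha=0.01):
    """Return the edges of a Chow-Liu tree for binary data of shape (N, D)."""
    n_samples, n_features = data.shape
    mi = np.zeros((n_features, n_features))
    for i in range(n_features):
        for j in range(i + 1, n_features):
            # Smoothed joint and marginal distributions of features i and j
            joint = np.histogram2d(data[:, i], data[:, j], bins=2, range=[[0, 1], [0, 1]])[0] + alpha
            joint /= joint.sum()
            pi, pj = joint.sum(axis=1), joint.sum(axis=0)
            mi[i, j] = np.sum(joint * np.log(joint / np.outer(pi, pj)))
    # A maximum spanning tree over MI is a minimum spanning tree over -MI
    mst = minimum_spanning_tree(csr_matrix(-mi))
    return list(zip(*mst.nonzero()))

# Example: 1000 samples of 5 binary features, where feature 1 depends on feature 0
rng = np.random.default_rng(42)
x = rng.integers(0, 2, size=(1000, 5))
x[:, 1] = x[:, 0] ^ (rng.random(1000) < 0.1)
print(chow_liu_edges(x))
```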

The implemented models are summarized in the following table. The Input Dimensionality column shows the supported data dimensionality for each model, and the Supervised column indicates which models can be used for supervised learning tasks in addition to density estimation. A minimal sketch of the coupling-layer idea behind the flow models follows the table.

| Model | Description | Input Dimensionality | Supervised |
|---|---|---|---|
| Binary-CLT | Binary Chow-Liu Tree (CLT) | D | |
| SPN | Vanilla Sum-Product Network, using LearnSPN | D | |
| RAT-SPN | Randomized and Tensorized Sum-Product Network | D | |
| DGC-SPN | Deep Generalized Convolutional Sum-Product Network | (1, D, D); (3, D, D) | |
| MAF | Masked Autoregressive Flow | D | |
| NICE | Non-linear Independent Components Estimation Flow | (1, H, W); (3, H, W) | |
| RealNVP | Real-valued Non-Volume-Preserving Flow | (1, H, W); (3, H, W) | |
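The flow models in the table (NICE, RealNVP, MAF) are all built from invertible transformations with tractable Jacobians. The snippet below is a minimal didactic sketch of a RealNVP-style affine coupling layer in plain PyTorch; it is not DeeProb-kit's implementation, and the class and parameter names are made up (NICE uses an additive variant without the scale term).

```python
# Minimal affine coupling layer (the building block of RealNVP), for illustration only.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, n_features, n_hidden=64):
        super().__init__()
        self.half = n_features // 2
        # Small network predicting log-scale and translation for the second half
        self.net = nn.Sequential(
            nn.Linear(self.half, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, 2 * (n_features - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)              # keep scales bounded for stability
        y2 = x2 * torch.exp(log_s) + t         # affine transform of the second half
        log_det = log_s.sum(dim=1)             # log-determinant of the Jacobian
        return torch.cat([x1, y2], dim=1), log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.half], y[:, self.half:]
        log_s, t = self.net(y1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)
        x2 = (y2 - t) * torch.exp(-log_s)
        return torch.cat([y1, x2], dim=1)

# A forward pass maps data to latent space and returns the log-det term for the likelihood
layer = AffineCoupling(n_features=6)
z, log_det = layer(torch.randn(8, 6))
```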

Installation & Documentation

The library can be installed either from the PyPI repository or from source:

# Install from the PyPI repository
pip install deeprob-kit
# Install from the `main` git branch
pip install -e git+https://github.com/deeprob-org/deeprob-kit.git@main#egg=deeprob-kit
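A quick way to verify the installation is to query the installed distribution's metadata with the standard library (Python 3.8+):

```python
# Check that the deeprob-kit distribution is installed and print its version
from importlib.metadata import version

print(version("deeprob-kit"))
```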

The documentation is generated automatically with Sphinx (using the Read-the-Docs theme) and is hosted on GitHub Pages at deeprob-kit.

Datasets and Experiments

A collection of 29 binary datasets, most of which are used in the probabilistic circuits literature, can be found at UCLA-StarAI-Binary-Datasets.

Moreover, a collection of 5 continuous datasets, commonly used in works on normalizing flows, can be found at MAF-Continuous-Datasets.

After downloading them, the datasets must be stored in the experiments/datasets directory in order to run the experiments (and the unit tests). The experiment scripts are available in the experiments directory and can be launched from the command line by specifying the dataset and the hyper-parameters.
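The binary datasets are plain comma-separated files of 0/1 values, so a quick sanity check needs only NumPy; the file name nltcs.train.data and its exact location below are assumptions about how the downloaded files are arranged:

```python
# Inspect one of the downloaded binary datasets (file name and path are assumed)
import numpy as np

train = np.loadtxt("experiments/datasets/nltcs.train.data", delimiter=",", dtype=np.int8)
print(train.shape)   # (number of samples, number of binary variables)
print(train[:5])     # first five rows of 0/1 values
```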

Code Examples

A collection of code examples can be found in the examples directory. The examples are not intended to produce state-of-the-art results; they only showcase the library.

The following table describes each example and assigns it a complexity rating from one to three stars. The Complexity column roughly reflects how many features of the library are used, as well as the expected time required to run the script.

| Example | Description | Complexity |
|---|---|---|
| naive_model.py | Learn, evaluate and print statistics about a naive factorized model. | |
| spn_plot.py | Instantiate, prune, marginalize and plot some SPNs. | |
| clt_plot.py | Learn a Binary CLT and plot it. | |
| spn_moments.py | Instantiate an SPN and compute moment statistics about its random variables. | |
| sklearn_interface.py | Learn and evaluate an SPN using the scikit-learn interface. | |
| spn_custom_leaf.py | Learn, evaluate and serialize an SPN with a user-defined leaf distribution. | |
| clt_to_spn.py | Learn a Binary CLT, convert it to a structured decomposable SPN and plot it. | |
| spn_clt_em.py | Instantiate an SPN with Binary CLTs, apply the EM algorithm and sample some data. | |
| clt_queries.py | Learn a Binary CLT, plot it, run some queries and sample some data. | |
| ratspn_mnist.py | Train and evaluate a RAT-SPN on MNIST. | |
| dgcspn_olivetti.py | Train, evaluate and complete some images with a DGC-SPN on Olivetti-Faces. | |
| dgcspn_mnist.py | Train and evaluate a DGC-SPN on MNIST. | |
| nvp1d_moons.py | Train and evaluate a 1D RealNVP on the Moons dataset (a data-generation snippet follows this table). | |
| maf_cifar10.py | Train and evaluate a MAF on CIFAR10. | |
| nvp2d_mnist.py | Train and evaluate a 2D RealNVP on MNIST. | |
| nvp2d_cifar10.py | Train and evaluate a 2D RealNVP on CIFAR10. | |
| spn_latent_mnist.py | Train and evaluate an SPN on MNIST using the features extracted by an autoencoder. | |
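As a small aside, the Moons data used by nvp1d_moons.py can be generated with scikit-learn; the sample size and noise level below are arbitrary choices, and the actual script may configure the data differently:

```python
# Generate the two-dimensional Moons toy dataset; parameters here are arbitrary
from sklearn.datasets import make_moons

x, _ = make_moons(n_samples=2000, noise=0.1, random_state=0)
print(x.shape)  # (2000, 2) continuous samples, suitable for a flat-vector (1D) RealNVP
```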

Related Repositories

References

1. Peharz et al. On Theoretical Properties of Sum-Product Networks. AISTATS (2015).

2. Poon and Domingos. Sum-Product Networks: A New Deep Architecture. UAI (2011).

3. Molina, Vergari et al. Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains. AAAI (2018).

4. Molina, Vergari et al. SPFlow: An Easy and Extensible Library for Deep Probabilistic Learning using Sum-Product Networks. CoRR (2019).

5. Peharz et al. Probabilistic Deep Learning using Random Sum-Product Networks. UAI (2020).

6. Papamakarios et al. Masked Autoregressive Flow for Density Estimation. NeurIPS (2017).

7. Dinh et al. Density Estimation using Real NVP. ICLR (2017).

8. Dinh et al. NICE: Non-linear Independent Components Estimation. ICLR (2015).

9. Papamakarios, Nalisnick et al. Normalizing Flows for Probabilistic Modeling and Inference. JMLR (2021).

10. Van de Wolfshaar and Pronobis. Deep Generalized Convolutional Sum-Product Networks for Probabilistic Image Representations. PGM (2020).

11. Rahman et al. Cutset Networks: A Simple, Tractable, and Scalable Approach for Improving the Accuracy of Chow-Liu Trees. ECML-PKDD (2014).

12. Di Mauro, Gala et al. Random Probabilistic Circuits. UAI (2021).

13. Desana and Schnörr. Learning Arbitrary Sum-Product Network Leaves with Expectation-Maximization. CoRR (2016).

14. Peharz et al. Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits. ICML (2020).
