Gelato

Bayesian dessert for Lasagne

Overview

Recent results in Bayesian statistics have shown that Bayesian methods are among the best ways to build robust neural networks: they account for uncertainty and curb overfitting while keeping good predictive performance. Gelato helps you apply Bayesian inference to neural networks. The library relies heavily on Theano, Lasagne, and PyMC3.

Installation

  • from GitHub (assumes a bleeding-edge PyMC3 is installed)
    # pip install git+git://github.com/pymc-devs/pymc3.git
    pip install git+https://github.com/ferrine/gelato.git
  • from source
    git clone https://github.com/ferrine/gelato
    pip install -r gelato/requirements.txt
    pip install -e gelato

Usage

Gelato wraps all of Lasagne generically at once, so to use it you only need to change the import statements for layers. To construct a network you must be inside a pm.Model context; a minimal sketch follows.
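A minimal sketch of the intended workflow, assuming gelato.layers mirrors lasagne.layers (InputLayer, DenseLayer, get_output) and that priors are given via spec classes such as NormalSpec, as in the snippets quoted in the comments below; exact import paths and defaults may differ between versions:

    import numpy as np
    import pymc3 as pm
    from gelato.layers import InputLayer, DenseLayer, get_output
    from gelato import NormalSpec  # assumed import path for the spec classes

    # toy regression data (hypothetical)
    x = np.random.randn(100, 10).astype('float32')
    y = np.random.randn(100, 1).astype('float32')

    with pm.Model() as model:
        # layers built inside the model context register their weights
        # as PyMC3 random variables
        inp = InputLayer((100, 10))
        out = DenseLayer(inp, 1, W=NormalSpec(sd=0.1))
        prediction = get_output(out, {inp: x})
        pm.Normal('y', mu=prediction, sd=1., observed=y)
        approx = pm.fit(10000)  # ADVI, as in the example notebook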

Warning

  • lasagne.layers.noise is not supported
  • lasagne.layers.normalization is not supported (Theano problems with default updates)
  • functions from lasagne.layers are hidden in gelato because they operate on Lasagne classes; exceptions are made for lasagne.layers.helpers. A generic solution may come in a future release.

Examples

For a comprehensive example of using Gelato, refer to this notebook.

Life Hack

Any spec class can be used standalone, so feel free to use them anywhere you need a prior over a tensor; see the sketch below.
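A small sketch of standalone use, adapted from the snippet in the "Integrate opvi" comment below; the spec names (NormalSpec, UniformSpec), the arithmetic on specs, and the call signature spec(shape, name=...) are taken from that snippet, while the import path is an assumption:

    import pymc3 as pm
    from gelato import NormalSpec, UniformSpec  # assumed import path

    U = NormalSpec()
    V = UniformSpec()
    V = V / V.norm(2)  # specs support arithmetic, so priors can be composed
    W = U * V

    with pm.Model() as model:
        # calling a spec inside a model context materializes it as a tensor
        # backed by PyMC3 random variables
        weight = W((3, 2), name='weight_normalization')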

References

Charles Blundell et al.: "Weight Uncertainty in Neural Networks", arXiv preprint arXiv:1505.05424.

Comments
  • Exception in example NB

    I'm up-to-date on pymc3 and gelato.

    ---------------------------------------------------------------------------
    AttributeError                            Traceback (most recent call last)
    /Users/twiecki/anaconda/lib/python3.6/site-packages/theano/gof/op.py in __call__(self, *inputs, **kwargs)
        624                 try:
    --> 625                     storage_map[ins] = [self._get_test_value(ins)]
        626                     compute_map[ins] = [True]
    
    /Users/twiecki/anaconda/lib/python3.6/site-packages/theano/gof/op.py in _get_test_value(cls, v)
        580         detailed_err_msg = utils.get_variable_trace_string(v)
    --> 581         raise AttributeError('%s has no test value %s' % (v, detailed_err_msg))
        582 
    
    AttributeError: Softmax.0 has no test value  
    Backtrace when that variable is created:
    
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/ipykernel/zmqshell.py", line 533, in run_cell
        return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2717, in run_cell
        interactivity=interactivity, compiler=compiler, result=result)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2821, in run_ast_nodes
        if self.run_code(code, result):
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2881, in run_code
        exec(code_obj, self.user_global_ns, self.user_ns)
      File "<ipython-input-18-7dd01309b711>", line 37, in <module>
        prediction = gelato.layers.get_output(network)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/lasagne/layers/helper.py", line 190, in get_output
        all_outputs[layer] = layer.get_output_for(layer_inputs, **kwargs)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/lasagne/layers/dense.py", line 124, in get_output_for
        return self.nonlinearity(activation)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/lasagne/nonlinearities.py", line 44, in softmax
        return theano.tensor.nnet.softmax(x)
    
    
    During handling of the above exception, another exception occurred:
    
    ValueError                                Traceback (most recent call last)
    <ipython-input-18-7dd01309b711> in <module>()
         44                    prediction,
         45                    observed=target_var,
    ---> 46                    total_size=total_size)
    
    /Users/twiecki/working/projects/pymc/pymc3/distributions/distribution.py in __new__(cls, name, *args, **kwargs)
         35                 raise TypeError("observed needs to be data but got: {}".format(type(data)))
         36             total_size = kwargs.pop('total_size', None)
    ---> 37             dist = cls.dist(*args, **kwargs)
         38             return model.Var(name, dist, data, total_size)
         39         else:
    
    /Users/twiecki/working/projects/pymc/pymc3/distributions/distribution.py in dist(cls, *args, **kwargs)
         46     def dist(cls, *args, **kwargs):
         47         dist = object.__new__(cls)
    ---> 48         dist.__init__(*args, **kwargs)
         49         return dist
         50 
    
    /Users/twiecki/working/projects/pymc/pymc3/distributions/discrete.py in __init__(self, p, *args, **kwargs)
        429         super(Categorical, self).__init__(*args, **kwargs)
        430         try:
    --> 431             self.k = tt.shape(p)[-1].tag.test_value
        432         except AttributeError:
        433             self.k = tt.shape(p)[-1]
    
    /Users/twiecki/anaconda/lib/python3.6/site-packages/theano/gof/op.py in __call__(self, *inputs, **kwargs)
        637                         raise ValueError(
        638                             'Cannot compute test value: input %i (%s) of Op %s missing default value. %s' %
    --> 639                             (i, ins, node, detailed_err_msg))
        640                     elif config.compute_test_value == 'ignore':
        641                         # silently skip test
    
    ValueError: Cannot compute test value: input 0 (Softmax.0) of Op Shape(Softmax.0) missing default value.  
    Backtrace when that variable is created:
    
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/ipykernel/zmqshell.py", line 533, in run_cell
        return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2717, in run_cell
        interactivity=interactivity, compiler=compiler, result=result)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2821, in run_ast_nodes
        if self.run_code(code, result):
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2881, in run_code
        exec(code_obj, self.user_global_ns, self.user_ns)
      File "<ipython-input-18-7dd01309b711>", line 37, in <module>
        prediction = gelato.layers.get_output(network)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/lasagne/layers/helper.py", line 190, in get_output
        all_outputs[layer] = layer.get_output_for(layer_inputs, **kwargs)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/lasagne/layers/dense.py", line 124, in get_output_for
        return self.nonlinearity(activation)
      File "/Users/twiecki/anaconda/lib/python3.6/site-packages/lasagne/nonlinearities.py", line 44, in softmax
        return theano.tensor.nnet.softmax(x)
    
    opened by twiecki 12
  • Integrate opvi

    I'm currently integrating recent PyMC3 changes into gelato. There are a lot of changes; everyone is welcome to join the discussion.

    Here are the most remarkable features:

    • no more with-model context required when using gelato layers
    from gelato.layers import *
    import pymc3 as pm
    # get data somehow
    inp = InputLayer(shape)
    out = DenseLayer(inp, 1, W=NormalSpec(sd=LognormalSpec(sd=.1)))
    out = DenseLayer(out, 1, W=NormalSpec(sd=LognormalSpec(sd=.1)))
    with out.root:
        pm.Normal('y', mu=get_output(out, {inp:x}),
                  observed=y)
        approx = pm.fit(10000)
    
    • Flexible specs: you can do almost anything with them. How to handle cases where different shapes are wanted is still an open question.
    from gelato import *
    import theano.tensor as tt
    import pymc3 as pm
    func = as_spec_op(tt.nlinalg.matrix_power)
    expr0 = func(NormalSpec() * LaplaceSpec(), 2)
    expr1 = expr0 / 100 - NormalSpec()
    with Model() as model:
        var = expr1((10, 10))
        assert var.tag.test_value.shape == (10, 10)
        assert len(model.free_RVs) == 3
        fit(100)
    U = NormalSpec()
    V = UniformSpec()
    V = V / V.norm(2)
    W = U*V
    with pm.Model() as model:
        result = W((3, 2), name='weight_normalization')
    
    opened by ferrine 2
  • Fix example

    Refers to #7. I've updated the example using the new pm.Minibatch API. Everything ran fine with the following theanorc:

    [global]
    device=cpu
    floatX=float32
    mode=FAST_RUN
    optimizer_including=cudnn
    
    [lib]
    cnmem=0.95
    
    [nvcc]
    fastmath=True
    flags = -I/usr/local/cuda-8.0-cudnnv5.1/include -L/usr/local/cuda-8.0-cudnnv5.1/lib64
    
    [blas]
    ldflag = -L/usr/lib/openblas-base -Lusr/local/cuda-8.0-cudnnv5.1/lib64 -lopenblas
    
    [DebugMode]
    check_finite=1
    
    [cuda]
    root=/usr/local/cuda-8.0-cudnnv5.1/
    

    pip freeze output

    alabaster==0.7.10
    algopy==0.5.3
    Babel==2.4.0
    bleach==2.0.0
    CommonMark==0.5.4
    cycler==0.10.0
    Cython==0.25.2
    decorator==4.0.11
    docutils==0.13.1
    entrypoints==0.2.2
    -e git+https://github.com/ferrine/[email protected]#egg=gelato
    h5py==2.7.0
    html5lib==0.999999999
    imagesize==0.7.1
    ipykernel==4.6.1
    ipython==6.0.0
    ipython-genutils==0.2.0
    ipywidgets==6.0.0
    Jinja2==2.9.6
    joblib==0.11
    jsonschema==2.6.0
    jupyter==1.0.0
    jupyter-client==5.0.1
    jupyter-console==5.1.0
    jupyter-core==4.3.0
    Keras==2.0.4
    Lasagne==0.2.dev1
    Mako==1.0.6
    MarkupSafe==1.0
    matplotlib==2.0.0
    mistune==0.7.4
    more-itertools==3.1.0
    nbconvert==5.1.1
    nbformat==4.3.0
    nbsphinx==0.2.13
    nose==1.3.7
    notebook==5.0.0
    numdifftools==0.9.20
    numpy==1.13.0
    pandas==0.20.1
    pandocfilters==1.4.1
    patsy==0.4.1
    pexpect==4.2.1
    pickleshare==0.7.4
    prompt-toolkit==1.0.14
    ptyprocess==0.5.1
    Pygments==2.2.0
    pygpu==0.6.5
    -e git+https://github.com/ferrine/[email protected]#egg=pymc3
    pymongo==3.4.0
    pyparsing==2.2.0
    python-dateutil==2.6.0
    pytz==2017.2
    PyYAML==3.12
    pyzmq==16.0.2
    qtconsole==4.3.0
    recommonmark==0.4.0
    requests==2.13.0
    scikit-learn==0.18.1
    scipy==0.19.1
    seaborn==0.7.1
    simplegeneric==0.8.1
    six==1.10.0
    sklearn==0.0
    snowballstemmer==1.2.1
    Sphinx==1.5.5
    terminado==0.6
    testpath==0.3
    Theano==0.10.0.dev1
    tornado==4.5.1
    tqdm==4.11.2
    traitlets==4.3.2
    wcwidth==0.1.7
    webencodings==0.5.1
    widgetsnbextension==2.0.0
    xmltodict==0.11.0
    
    opened by ferrine 0
  • Not compatible with latest version of pymc3

    When I attempt to import gelato, it fails with the following error message:

    ---> 19 class LayerModelMeta(pm.model.InitContextMeta):
         20     """Magic comes here
         21     """
    
    AttributeError: module 'pymc3.model' has no attribute 'InitContextMeta'
    

    I believe that InitContextMeta no longer exists in pymc3; it's been merged with ContextMeta.

    I don't know if there are plans to update this repository anytime soon, although it does seem like a useful tool, so it would be great if it worked with the latest pymc3.
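
    A minimal, untested sketch of a compatibility shim one might try before importing gelato, assuming the only missing symbol is InitContextMeta (other API changes may still break the import):

    import pymc3 as pm

    # In newer PyMC3 releases InitContextMeta was merged into ContextMeta,
    # so alias the old name before gelato's import-time class definition runs.
    if not hasattr(pm.model, 'InitContextMeta'):
        pm.model.InitContextMeta = pm.model.ContextMeta

    import gelato  # may still fail if other APIs have changed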

    opened by quevivasbien 2
Releases: v0.1.0

Owner: Maxim Kochurov (Researcher @ NTechLab; MSU/Skoltech; Core Dev @ PyMC3, Geoopt)