DeepSynth

General-purpose program synthesiser.

This is the repository for the code of the paper "Scaling Neural Program Synthesis with Distribution-based Search".

Authors: Anonymous

Abstract

We consider the problem of automatically constructing computer programs from input-output examples. We investigate how to augment probabilistic and neural program synthesis methods with new search algorithms, proposing a framework called distribution-based search. Within this framework, we introduce two new search algorithms: HEAP SEARCH, an enumerative method, and SQRT SAMPLING, a probabilistic method. We prove certain optimality guarantees for both methods, show how they integrate with probabilistic and neural techniques, and demonstrate how they can operate at scale across parallel compute environments. Collectively these findings offer theoretical and applied studies of search algorithms for program synthesis that integrate with recent developments in machine-learned program synthesizers.

Usage

Installation

# clone this repository
git clone https://github.com/nathanael-fijalkow/DeepSynth.git

# create your new env
conda create -n deep_synth "python>=3.7"
# activate it
conda activate deep_synth
# install pip
yes | conda install pip
# install this package and the dependencies
pip install torch cython tqdm numpy matplotlib
pip install scipy  # required by unit_tests_algorithms.py
pip install git+https://github.com/MaxHalford/vose
# For flashfill dataset
pip install sexpdata
# If you want to do the parallel experiments
pip install ray

# You are good to go :)
# To test your installation you can run the following tests:
python unit_tests_algorithms.py
python unit_tests_programs.py
python unit_tests_predictions.py
# Only if you installed ray
python unit_test_parallel.py

File structure

./
        Algorithms/      # the search algorithms + parallel pipeline
        DSL/             # DSL: dreamcoder, deepcoder, flashfill
        list_dataset/    # DreamCoder dataset in pickle format
        Predictions/     # all files related to the ANN for prediction of the grammars 

Reproducing the experiments

All of the files mentioned in this section are located in the root folder and follow the pattern run_*_experiments*.py.

Here is a short summary of each experiment:

  • run_random_PCFG_search.py produces a list of all programs generated within X seconds of search time by each algorithm.
  • run_random_PCFG_search_parallel.py runs the same experiment but with the grammar_splitter and multiple CPUs.
  • run_experiments_*.py tries to find solutions using an ANN to predict the grammar and, for each algorithm, logs the search data for the corresponding dataset. A parallel suffix indicates that the algorithms are run in parallel. The semantics experiments in the paper used a trained model that can be obtained using produce_network.py or directly from the repository. The results can be plotted using plot_results_semantics.py.

Note that for the DreamCoder experiment in our paper, we did not use the cached evaluation of HeapSearch; this can be reproduced by setting use_heap_search_cached_eval to False in run_experiment.py.

Quick guide to using an ANN to predict a grammar

This guide is heavily inspired by the file model_loader.py.

First we create a prediction model:
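
# NOTE: the names used below (DSL, Arrow, List, INT, FixedSizeEncoding,
# RNNEmbedding, RulesPredictor, semantics, primitive_types, ...) are assumed
# to be imported as in model_loader.py; see that file for the exact imports.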

############################
##### Hyperparameters ######
############################

max_program_depth = 4

size_max = 10  # maximum number of elements in a list (input or output)
nb_inputs_max = 2  # maximum number of inputs in an IO
lexicon = list(range(30))  # all elements of a list must be from lexicon
# only useful for VariableSizeEncoding
encoding_output_dimension = 30  # fixing the dimension

embedding_output_dimension = 10
# only useful for RNNEmbedding
number_layers_RNN = 1

size_hidden = 64

############################
######### PCFG #############
############################

deepcoder = DSL(semantics, primitive_types)
type_request = Arrow(List(INT), List(INT))
deepcoder_cfg = deepcoder.DSL_to_CFG(
    type_request, max_program_depth=max_program_depth)
deepcoder_pcfg = deepcoder_cfg.CFG_to_Uniform_PCFG()

############################
###### IO ENCODING #########
############################

# IO = [[I1, ...,Ik], O]
# I1, ..., Ik, O are lists
# IOs = [IO1, IO2, ..., IOn]
# task = (IOs, program)
# tasks = [task1, task2, ..., taskp]

#### Specification: #####
# IOEncoder.output_dimension: size of the encoding of one IO
# IOEncoder.lexicon_size: size of the lexicon
# IOEncoder.encode_IO: outputs a tensor of dimension IOEncoder.output_dimension
# IOEncoder.encode_IOs: inputs a list of IO of size n
# and outputs a tensor of dimension n * IOEncoder.output_dimension

IOEncoder = FixedSizeEncoding(
    nb_inputs_max=nb_inputs_max,
    lexicon=lexicon,
    size_max=size_max,
)


# IOEncoder = VariableSizeEncoding(
#     nb_inputs_max = nb_inputs_max,
#     lexicon = lexicon,
#     output_dimension = encoding_output_dimension,
#     )

############################
######### EMBEDDING ########
############################

# IOEmbedder = SimpleEmbedding(
#     IOEncoder=IOEncoder,
#     output_dimension=embedding_output_dimension,
#     size_hidden=size_hidden,
# )
 
IOEmbedder = RNNEmbedding(
    IOEncoder=IOEncoder,
    output_dimension=embedding_output_dimension,
    size_hidden=size_hidden,
    number_layers_RNN=number_layers_RNN,
)

#### Specification: #####
# IOEmbedder.output_dimension: size of the output of the embedder
# IOEmbedder.forward_IOs: inputs a list of IOs
# and outputs the embedding of the encoding of the IOs
# which is a tensor of dimension
# (IOEmbedder.input_dimension, IOEmbedder.output_dimension)
# IOEmbedder.forward: same but with a batch of IOs

############################
######### MODEL ############
############################

model = RulesPredictor(
    cfg=deepcoder_cfg,
    IOEncoder=IOEncoder,
    IOEmbedder=IOEmbedder,
    size_hidden=size_hidden,
)

# model = LocalRulesPredictor(
#     cfg = deepcoder_cfg,
#     IOEncoder = IOEncoder,
#     IOEmbedder = IOEmbedder,
#     # size_hidden = size_hidden,
#     )

Now we can produce the grammars:

dsl = DSL(semantics, primitive_types)
batched_grammars = model(batched_examples)
if isinstance(model, RulesPredictor):
    batched_grammars = model.reconstruct_grammars(batched_grammars)

Quick guide to training a neural network

Just copy the model initialisation used in your experiment into the file produce_network.py, or use one of the provided ones that correspond to our experiments. You can change the hyperparameters, then run the script. A .weights file should appear in the root folder. This will train a neural network on randomly generated programs, as described in Appendix F of the paper.
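
For orientation, here is a minimal, generic PyTorch training-loop sketch of the kind of supervised training that produce_network.py performs; the model, data, and loss below are dummy stand-ins, not the repository's actual predictor or dataset.

import torch
import torch.nn as nn

# Dummy stand-ins: in DeepSynth the model would be the grammar predictor
# (e.g. RulesPredictor) and the batches would be encoded IO examples paired
# with the rules of the programs that generated them.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    inputs = torch.randn(32, 20)            # dummy encoded IO examples
    targets = torch.randint(0, 10, (32,))   # dummy target rules
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")

# The trained weights are then saved to a .weights file.
torch.save(model.state_dict(), "dummy_model.weights")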

Quick guide to using a search algorithm for a grammar

There are already functions for that in run_experiment.py, namely run_algorithm and run_algorithm_parallel. The former runs the specified algorithm in a single thread, while the latter runs it in parallel with a grammar splitter. To produce an is_correct function, you can use make_program_checker in experiment_helper.py.
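
For intuition, here is a small, self-contained sketch of best-first enumeration over a toy weighted grammar using a heap. It is only a conceptual illustration of the idea behind HEAP SEARCH, not the repository's implementation (the actual algorithms live in the Algorithms/ folder and operate on the PCFG objects built above).

import heapq
from itertools import count

# A toy weighted grammar: each nonterminal maps to a list of
# (probability, right-hand side) pairs; terminals are plain strings.
toy_pcfg = {
    "E": [
        (0.5, ["1"]),
        (0.3, ["x"]),
        (0.2, ["(", "E", "+", "E", ")"]),
    ],
}

def enumerate_best_first(pcfg, start="E", how_many=10):
    """Yield complete sentences in decreasing order of probability."""
    tie = count()  # tie-breaker so the heap never compares lists
    heap = [(-1.0, next(tie), [start])]  # entries: (-probability, tie, sentence)
    produced = 0
    while heap and produced < how_many:
        neg_prob, _, sentence = heapq.heappop(heap)
        # Find the leftmost nonterminal, if any.
        idx = next((i for i, s in enumerate(sentence) if s in pcfg), None)
        if idx is None:  # the sentence is complete: output it
            produced += 1
            yield -neg_prob, " ".join(sentence)
            continue
        # Expand it with every rule; the priority of a partial sentence is the
        # product of the rule probabilities used so far, which upper-bounds the
        # probability of any of its completions.
        for rule_prob, rhs in pcfg[sentence[idx]]:
            new_sentence = sentence[:idx] + rhs + sentence[idx + 1:]
            heapq.heappush(heap, (neg_prob * rule_prob, next(tie), new_sentence))

for prob, program in enumerate_best_first(toy_pcfg, how_many=5):
    print(f"{prob:.3f}  {program}")

Running this prints the five most likely programs of the toy grammar, from most to least probable.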

How to download the DeepCoder dataset?

First, download the archive from the DeepCoder repository (https://storage.googleapis.com/deepcoder/dataset.tar.gz) into a folder deepcoder_dataset at the root of DeepSynth. Then you simply need to:

gunzip dataset.tar.gz
tar -xf dataset.tar

You should see a few JSON files.
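
If you want to check from Python that the extraction worked, the snippet below only assumes the files were unpacked somewhere under the deepcoder_dataset folder; it makes no assumption about their internal format.

import glob

# List the extracted JSON files and peek at the beginning of one of them.
paths = sorted(glob.glob("deepcoder_dataset/**/*.json", recursive=True))
print(f"{len(paths)} JSON files found")
if paths:
    with open(paths[0]) as f:
        print(paths[0], "starts with:", f.read(200))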

Comments
  • Questions about the installation instructions.

    Hi Nathanaël,

    I started to review your JOSS submission and have some questions about the installation part in the README.

    Quote the version specification

    conda create -n deep_synth python>=3.7 
    

    should be changed to the following, otherwise, it's not accepted by some shells such as zsh.

    conda create -n deep_synth "python>=3.7"
    

    How to install PyTorch

    I would recommend providing the compatible PyTorch version requirements and some potential commands to install the compatible versions (such as different CUDA/CPU versions). Since conda env is already created, one can also install PyTorch via conda.

    > pip install torch cython tqdm numpy matplotlib
    
    ERROR: Could not find a version that satisfies the requirement torch (from versions: none)
    ERROR: No matching distribution found for torch
    

    Missing pip package

    pip install scipy  # required by unit_tests_algorithms.py
    

    Correct the script names

    python unit_test_algorithms.py
    python unit_test_programs.py
    python unit_test_algorithms.py
    python unit_test_predictions.py
    # Only if you installed ray
    python unit_test_parallel.py
    

    The script name should be corrected.

    python unit_tests_algorithms.py
    python unit_tests_programs.py
    python unit_tests_algorithms.py
    python unit_tests_predictions.py
    

    Missing file for unit_test_parallel.py.

    Fail to run the tests

    > python unit_tests_algorithms.py
    Traceback (most recent call last):
      File "/myapps/research/synthesis/DeepSynth/unit_tests_algorithms.py", line 11, in <module>
        from dsl import DSL
      File "/myapps/research/synthesis/DeepSynth/dsl.py", line 6, in <module>
        from cfg import CFG
      File "/myapps/research/synthesis/DeepSynth/cfg.py", line 4, in <module>
        from pcfg_logprob import LogProbPCFG
      File "/myapps/research/synthesis/DeepSynth/pcfg_logprob.py", line 7, in <module>
        import vose
      File "/home/aplusplus/anaconda3/envs/deep_synth/lib/python3.9/site-packages/vose/__init__.py", line 1, in <module>
        from .sampler import Sampler
      File "vose/sampler.pyx", line 1, in init vose.sampler
    ValueError: numpy.ufunc size changed, may indicate binary incompatibility. Expected 232 from C header, got 216 from PyObject
    

    A specific package version may be needed.

    Best, Shengwei

    opened by njuaplusplus 5
Releases
  • joss-release (Oct 13, 2022)

    What's Changed

    • More documentation and addition of a guide to using the software.
    • Install requirements by @bzz in https://github.com/nathanael-fijalkow/DeepSynth/pull/3
Owner
Nathanaël Fijalkow
Computer science researcher