Implementation of the paper "Generating Symbolic Reasoning Problems with Transformer GANs"

Generating Symbolic Reasoning Problems with Transformer GANs

This is the implementation of the paper Generating Symbolic Reasoning Problems with Transformer GANs.

Constructing training data for symbolic reasoning domains is challenging: on the one hand, existing instances are typically hand-crafted and too few to train on directly; on the other hand, synthetically generated instances are often hard to evaluate in terms of their meaningfulness.

We provide a GAN and a Wasserstein GAN equipped with Transformer encoders to generate sensible and challenging training data for symbolic reasoning domains. Even without autoregression, the GAN models produce syntactically correct problem instances. The generated data can be used as a substitute for real training data; in particular, it can be generated from a real dataset that is too small to be trained on directly.

For example, the models produced syntactically correct mathematical expressions as well as correct Linear-time Temporal Logic (LTL) formulas of the kind used in verification (see the example figures in the paper).

Installation

The code is shipped as a Python package that can be installed by executing

pip install -e .

in the impl directory (where setup.py is located). Python version 3.6 or higher is required. Additional dependencies such as tensorflow will be installed automatically. To generate datasets or solve instances immediately after generation, the LTL satisfiability checking tool aalta is required as a binary. It can be obtained from bitbucket (earliest commit in that repository). After compiling, make sure that the aalta binary resides in the bin folder.
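
If you want to verify the setup before generating data, the following minimal sketch (assuming the repository-root bin layout described above) checks that the aalta binary is present and executable:

import os
import shutil

# Prefer an aalta found on PATH, otherwise fall back to the bin/ folder
# described above (path relative to the repository root).
aalta = shutil.which("aalta") or os.path.join("bin", "aalta")
if not (os.path.isfile(aalta) and os.access(aalta, os.X_OK)):
    raise FileNotFoundError("aalta binary not found or not executable; see Installation")
print("using aalta at", aalta)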

Datasets

A zip file containing our original datasets can be downloaded from here. Unpack its contents to the datasets directory.

Dataset generation

Alternatively, datasets can be generated from scratch. The following procedure describes how to construct a dataset similar to the main base dataset (LTLbase):

First, generate a raw dataset by

python -m tgan_sr.data_generation.generator -od datasets/LTLbase --splits all_raw:1 --timeout 2 -nv 10 -ne 1600000 -ts 50 --log-each-x-percent 1 --frac-unsat None

(possibly rename the output directory so as not to overwrite the supplied dataset). Then enter the newly created directory.
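
If you prefer to script the generation step, here is a minimal Python sketch that wraps the command above with subprocess. The flag values are copied from that command; the output directory name is a placeholder, and the exact semantics of the individual flags are best checked via the module's --help:

import subprocess

def generate_raw(out_dir: str, num_examples: int = 1_600_000) -> None:
    # Same invocation as the documented command, with output directory and
    # example count exposed as parameters.
    subprocess.run(
        [
            "python", "-m", "tgan_sr.data_generation.generator",
            "-od", out_dir,
            "--splits", "all_raw:1",
            "--timeout", "2",
            "-nv", "10",
            "-ne", str(num_examples),
            "-ts", "50",
            "--log-each-x-percent", "1",
            "--frac-unsat", "None",
        ],
        check=True,
    )

generate_raw("datasets/LTLbase_custom", num_examples=100_000)  # placeholder directory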

Optional: Visualize the dataset (like Figures 5 and 6 in the paper)

python -m tgan_sr.utils.analyze_dataset all_raw.txt formula,sat

To filter the dataset for duplicates and balance classes per size

python -m tgan_sr.utils.update_dataset all_raw.txt unique - | python -m tgan_sr.utils.update_dataset - balance_per_size all_balanced.txt
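
Conceptually, the two filters remove duplicate formulas and then, for every formula size, keep equally many instances of each class. The rough sketch below illustrates this on in-memory (formula, label) pairs; it is not the actual tgan_sr.utils.update_dataset implementation, and the notion of "size" (here simply formula length) is an assumption:

import random
from collections import defaultdict

def unique(pairs):
    # Keep only the first occurrence of each formula.
    seen, out = set(), []
    for formula, label in pairs:
        if formula not in seen:
            seen.add(formula)
            out.append((formula, label))
    return out

def balance_per_size(pairs, size=len, seed=0):
    # Group by formula size, then downsample every class to the size of the
    # smallest class within that group.
    rng = random.Random(seed)
    by_size = defaultdict(lambda: defaultdict(list))
    for formula, label in pairs:
        by_size[size(formula)][label].append((formula, label))
    out = []
    for classes in by_size.values():
        n = min(len(items) for items in classes.values())
        for items in classes.values():
            out.extend(rng.sample(items, n))
    return out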

Optional: Calculate relaxed satisfiability

python -m tgan_sr.utils.update_dataset all_balanced.txt relaxed_sat all_balanced_rs.txt

Optional: Visualize the dataset (like Figures 7 and 8 in the paper)

python -m tgan_sr.utils.analyze_dataset all_balanced_rs.txt formula,sat+relaxed

Split the data into training, validation, and test sets

python -m tgan_sr.utils.update_dataset all_balanced_rs.txt shuffle+split=train:8,val:1,test:1
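
The split command above uses an 8:1:1 ratio. Purely for illustration, a conceptual sketch of shuffle+split on a list of dataset lines (the real tool additionally handles file I/O and naming):

import random

def shuffle_split(lines, ratios=(("train", 8), ("val", 1), ("test", 1)), seed=0):
    rng = random.Random(seed)
    lines = list(lines)
    rng.shuffle(lines)
    total = sum(weight for _, weight in ratios)
    splits, start = {}, 0
    for i, (name, weight) in enumerate(ratios):
        # The last split absorbs any rounding leftovers.
        end = len(lines) if i == len(ratios) - 1 else start + len(lines) * weight // total
        splits[name] = lines[start:end]
        start = end
    return splits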

Experiments (training)

The folder configs contains JSON files for each type of experiment in the paper. Settings for different hyperparameters can be easily adjusted.

A model can be trained like this:

python -m tgan_sr.train.gan --run-name NAME --params-file configs/CONFIG.json

During training, relevant metrics will be logged to train_custom in the run's directory and can be viewed with tensorboard afterwards.
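
Hyperparameters are changed by editing the JSON file (or a copy of it). A minimal sketch, assuming the configs are flat JSON dictionaries, that sets for example the noise parameter "gan_sigma_real" mentioned in the list below and saves the result under a new name:

import json

# Config file name taken from the list below; the new value is only an example.
with open("configs/wgan_gp10_nl6-4_nc2_bs1024.json") as f:
    config = json.load(f)
config["gan_sigma_real"] = 0.1
with open("configs/wgan_gp10_nl6-4_nc2_bs1024_sigma0.1.json", "w") as f:
    json.dump(config, f, indent=4)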

A list of all configurations and corresponding JSON files:

  • Standard WGAN: wgan_gp10_nl6-4_nc2_bs1024.json
  • Standard GAN: gan_nl6-4_nc2_bs1024.json
  • Different σ for added noise: add the parameter "gan_sigma_real" to the chosen config and assign the desired value (cf. the config-editing sketch above)
  • WGAN on 10K-sized base dataset: n10k_wgan_gp10_nl6-4_nc2_bs512.json
  • Sample data from the trained WGAN: sample_n10k_wgan_gp10_nl6-4_nc2_bs512.json (ensure the "load_from" field matches your trained run name)
  • Classifier on default dataset: class_nl4_bs1024.json
  • Classifier on generated dataset: class_Generated_nl4_bs1024.json
  • WGAN with included classifier: wgan+class_nl6-3s1_nc2_bs1024.json
  • WGAN with absolute uncertainty objective: wgan+class+uncert-abs_nl6-3s1_nc2_bs1024.json (ensure the "load_from" field matches your pre-trained run name; see the workflow sketch after this list)
  • WGAN with entropy uncertainty objective: wgan+class+uncert-entr_nl6-3s1_nc2_bs1024.json (ensure the "load_from" field matches your pre-trained run name)
  • Sample data from the trained WGAN with entropy uncertainty objective: sample_wgan+class+uncert-entr_nl6-3s1_nc2_bs1024.json (ensure the "load_from" field matches your trained run name)
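
The uncertainty-objective configurations build on a pre-trained WGAN-with-classifier run, so a full experiment is a small pipeline: train with the combined config, point "load_from" of the uncertainty config at that run, then train again. A sketch of this workflow (run names are placeholders; commands and file names are the ones listed above):

import json
import subprocess

PRETRAIN_RUN = "wgan_class_base"        # placeholder run name
UNCERT_RUN = "wgan_class_uncert_entr"   # placeholder run name

# 1. Train the WGAN with included classifier.
subprocess.run(
    ["python", "-m", "tgan_sr.train.gan", "--run-name", PRETRAIN_RUN,
     "--params-file", "configs/wgan+class_nl6-3s1_nc2_bs1024.json"],
    check=True,
)

# 2. Point the uncertainty-objective config at the pre-trained run.
cfg_path = "configs/wgan+class+uncert-entr_nl6-3s1_nc2_bs1024.json"
with open(cfg_path) as f:
    config = json.load(f)
config["load_from"] = PRETRAIN_RUN
with open(cfg_path, "w") as f:
    json.dump(config, f, indent=4)

# 3. Train with the entropy uncertainty objective.
subprocess.run(
    ["python", "-m", "tgan_sr.train.gan", "--run-name", UNCERT_RUN,
     "--params-file", cfg_path],
    check=True,
)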

Evaluation

To test a trained classifier on an arbitrary dataset (validation):

python -m tgan_sr.train.gan --run-name NAME --test --ds-name DATASET_NAME

The model will be automatically loaded from the latest checkpoint in the run's directory.
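
For instance, to compare several trained classifiers on the same validation data, the documented flags can be scripted as follows (run names are placeholders, and using the base dataset name as DATASET_NAME is an assumption):

import subprocess

runs = ["class_real_data", "class_generated_data"]  # placeholder run names
for run in runs:
    subprocess.run(
        ["python", "-m", "tgan_sr.train.gan",
         "--run-name", run, "--test", "--ds-name", "LTLbase"],
        check=True,
    )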

How to Cite

@article{TGAN-SR,
    title = {Generating Symbolic Reasoning Problems with Transformer GANs},
    author = {Kreber, Jens U and Hahn, Christopher},
    journal = {arXiv preprint},
    year = {2021}
}
Owner

Reactive Systems Group, Saarland University