Analysis of rationale selection in neural rationale models

Overview

Neural Rationale Interpretability Analysis

We analyze the neural rationale models proposed by Lei et al. (2016) and Bastings et al. (2019), as implemented in the repository accompanying Interpretable Neural Predictions with Differentiable Binary Variables (Bastings et al., 2019). We copied their original repository and build upon it with a data perturbation analysis. Specifically, we implement a procedure to perturb sentences of the Stanford Sentiment Treebank (SST) dataset and analyze the behavior of the models on the original and perturbed test sets.

Instructions

Installation

You need to have Python 3.6 or higher installed. First clone this repository.

Install all required Python packages using:

pip install -r requirements.txt

Finally, download the data:

cd interpretable_predictions
./download_data_sst.sh

This will download the SST data (including filtered word embeddings).

The perturbed data and the model behavior on it are saved in data/sst/data_info.pickle, results/sst/latent_30pct/data_results.pickle, and results/sst/bernoulli_sparsity01505/data_results.pickle. To analyze these directly, skip to the Plotting and Analysis section. To reproduce these results from scratch, continue as below.
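As a quick sanity check, the saved pickles can be loaded with Python's pickle module. This is a minimal sketch; the layout of data_info.pickle is documented in the Perturbed Data Format section below, while the internal layout of data_results.pickle is an assumption here, so inspect it before relying on specific keys.

import pickle

# Load the perturbed data (a dictionary keyed by sentence index;
# see "Perturbed Data Format" below for its structure).
with open("data/sst/data_info.pickle", "rb") as f:
    data_info = pickle.load(f)

# Load the model behavior on the original and perturbed sentences.
# Its exact structure is not documented here, so inspect it first.
with open("results/sst/latent_30pct/data_results.pickle", "rb") as f:
    data_results = pickle.load(f)

print(len(data_info))      # number of test sentences (2210)
print(type(data_results))  # inspect before assuming a layout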

Training on Stanford Sentiment Treebank (SST)

To train the latent (CR) rationale model to select 30% of text:

python -m latent_rationale.sst.train \
  --model latent --selection 0.3 --save_path results/sst/latent_30pct

To train the Bernoulli REINFORCE (PG) model with L0 penalty weight 0.01505:

python -m latent_rationale.sst.train \
  --model rl --sparsity 0.01505 --save_path results/sst/bernoulli_sparsity01505

Data Perturbation

To perform the data perturbation, run:

python -m latent_rationale.sst.perturb

This will save the data in data/sst/data_info.pickle.
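Conceptually, each perturbation is a single-token substitution: a perturbed instance copies the original sentence and replaces exactly one token. The sketch below illustrates that idea only; the function name and the example replacement are hypothetical, not the repository's implementation.

def perturb_sentence(tokens, index, replacement):
    # Copy the original token list and substitute a single token,
    # leaving every other token (and the label) unchanged.
    perturbed = list(tokens)
    perturbed[index] = replacement
    return perturbed

original = ["a", "genuinely", "moving", "film"]
print(perturb_sentence(original, 1, "surprisingly"))
# ['a', 'surprisingly', 'moving', 'film']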

Prediction and Rationale Selection

To run the latent model and get the rationale selection and prediction, run:

python -m latent_rationale.sst.predict_perturbed --ckpt results/sst/latent_30pct/

For the Bernoulli model, run:

python -m latent_rationale.sst.predict_perturbed --ckpt results/sst/bernoulli_sparsity01505/

These will save the rationale and prediction information in results/sst/latent_30pct/data_results.pickle and results/sst/bernoulli_sparsity01505/data_results.pickle for the two models, respectively.

Plotting and Analysis

To reconstruct the plots for the CR model, run:

python -m latent_rationale.sst.plots --ckpt results/sst/latent_30pct/

To run the part-of-speech (POS) analysis for the CR model, run:

python -m latent_rationale.sst.pos_analysis --ckpt results/sst/latent_30pct/

Perturbed Data Format

The perturbed data is stored as a dictionary whose keys are sentence indices ranging from 0 to 2209, since the standard SST test split contains 2210 sentences. Each value is a dictionary with two fields: original, containing the original SST test instance, and perturbed, a list of perturbed instances, each of which is a copy of the original instance with one token substituted by a replacement. This dictionary is saved in data/sst/data_info.pickle.
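Given that layout, iterating over the pickle looks roughly like the sketch below. Accessing the original and perturbed fields as dictionary keys is an assumption about how the instances are stored; adjust if they are objects or namedtuples.

import pickle

with open("data/sst/data_info.pickle", "rb") as f:
    data_info = pickle.load(f)

for idx, entry in data_info.items():
    original = entry["original"]        # the original SST test instance
    perturbed_list = entry["perturbed"] # one instance per substituted token
    # Each perturbed instance differs from the original in exactly one token.
    print(idx, len(perturbed_list))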

Owner: Yiming Zheng