Joint Gaussian Graphical Model Estimation: A Survey

Overview

This repository implements and compares methods for jointly estimating multiple Gaussian graphical models, with simulation scripts for ROC curves, out-of-sample likelihood, and model selection over the methods listed below.

Test Models

  1. Fused graphical lasso (FGL) [1]
  2. Group graphical lasso (GGL) [1]
  3. Graphical lasso (GL) [1]
  4. Doubly joint spike-and-slab graphical lasso [2]
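For reference, methods 1-3 all maximize the penalized log-likelihood of [1], differing only in the penalty P({Θ}). The display below restates the objective and the FGL/GGL penalties (notation follows the paper; it is not code from this repository):

\max_{\{\Theta^{(k)}\}} \; \sum_{k=1}^{K} n_k \Big[ \log\det\Theta^{(k)} - \operatorname{tr}\big(S^{(k)}\Theta^{(k)}\big) \Big] - P(\{\Theta\})

\text{FGL:}\quad P(\{\Theta\}) = \lambda_1 \sum_{k}\sum_{i \neq j} \big|\theta^{(k)}_{ij}\big| + \lambda_2 \sum_{k<k'}\sum_{i,j} \big|\theta^{(k)}_{ij} - \theta^{(k')}_{ij}\big|

\text{GGL:}\quad P(\{\Theta\}) = \lambda_1 \sum_{k}\sum_{i \neq j} \big|\theta^{(k)}_{ij}\big| + \lambda_2 \sum_{i \neq j} \Big( \sum_{k} \big(\theta^{(k)}_{ij}\big)^2 \Big)^{1/2}

Setting \lambda_2 = 0 reduces both penalties to the ordinary graphical lasso applied separately to each class.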

Installation

  1. Create the Anaconda environment:
conda env create -f environment.yml
conda activate r_env2  # activate the environment
  2. Install the required R packages:
Rscript install_packages.R

Run Examples

Jupyter notebook

Several examples of data generation processes, as well as sample code, can be found in the folder ./examples/jupyter_notebook.
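A minimal R sketch of fitting a fused graphical lasso on two toy classes is given below. It assumes the CRAN packages JGL and MASS are available (they are expected to be installed by install_packages.R); the simulated data are purely illustrative and are not produced by the repository's DGP scripts.

# Minimal sketch: fused graphical lasso on two toy classes (illustrative only).
library(JGL)    # assumed CRAN dependency
library(MASS)   # for mvrnorm

set.seed(1)
p <- 20; n <- 100
Theta <- diag(p); Theta[1, 2] <- Theta[2, 1] <- 0.4   # toy precision matrix
Sigma <- solve(Theta)
Y <- list(mvrnorm(n, rep(0, p), Sigma),               # class 1 data (n x p)
          mvrnorm(n, rep(0, p), Sigma))               # class 2 data (n x p)

fit <- JGL(Y, penalty = "fused", lambda1 = 0.1, lambda2 = 0.05,
           return.whole.theta = TRUE)
str(fit$theta)   # list of estimated precision matrices, one per class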

Plot ROC curve

Sample commands for data generation process 1 (DGP1) are shown below. The instructions for running DGP2_roc.r are the same.

cd examples/roc
### Generate simulated data; the results will be stored in ./data
Rscript DGP1_roc.r DG [DATA DIMENSION]

### Select one of the regularization methods FGL/GGL/GL; the results will be stored in ./results
Rscript DGP1_roc.r [ACTION: FGL/GGL/GL] [DATA DIMENSION]

### Visualization
Rscript DGP1_roc_visualization.r
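As a rough illustration of what the ROC scripts measure, the sketch below computes one (FPR, TPR) point by comparing the support of an estimated precision matrix with the true one. The function name roc_point is hypothetical; the actual DGP*_roc.r scripts may organize this differently.

# Illustrative only: one (FPR, TPR) point from an estimated vs. true precision matrix.
roc_point <- function(theta_hat, theta_true, tol = 1e-5) {
  est   <- abs(theta_hat[upper.tri(theta_hat)])   > tol   # estimated edge set
  truth <- abs(theta_true[upper.tri(theta_true)]) > tol   # true edge set
  c(tpr = sum(est & truth)  / max(sum(truth), 1),
    fpr = sum(est & !truth) / max(sum(!truth), 1))
}
# Sweeping lambda1 over a grid and collecting these points traces out the ROC curve.
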
Other examples

Please check the structure tree below for more details.

Structure

├── examples
│   ├── jupyter_notebook
|   |   ├── simple_example_block.ipynb
|   |   ├── simple_example_scalefree.ipynb
|   |   ├── simple_example_ssjgl.ipynb
│   │   └── simple_example.ipynb
│   │
│   ├── roc # run & visualize ROC curve
|   |   ├── DGP1_roc_visualization.r # visualization
|   |   ├── DGP1_roc.r # ROC curve on scalefree network, common structures share the same inverse covariance matrix (data generation process 1)
|   |   ├── DGP2_roc_visualization.r # visualization
|   |   ├── DGP2_roc.r # ROC curve on scalefree network, common structures have different inverse covariance matrices (data generation process 2)
|   |   ├── simple_roc_vis.r # visualization
|   |   └── simple_roc.r # ROC curve on random network
|   | 
|   ├── joint_demo.r # beautiful result on random network (Erdos-Renyi graph)            
│   ├── loss_graphsize_npAIC.r # fix p, vary n
│   ├── loss_smallgraphsize.r # fix n, vary p
│   ├── oos_scalefree.r # out-of-sample likelihood on scalefree network
│   ├── oos.r # out-of-sample likelihood on random network
|   ├── scalefree_AIC.r # model selection on scalefree network using AIC, tune the truncation value
|   ├── scalefree_BIC.r # model selection on scalefree network using BIC, tune the truncation value
|   ├── simple_example_ar.r # example on AR network: model selection, FNR, FPR, Frobenius loss, entropy loss
|   └── simple_example_scalefree.r # example on scalefree network: model selection, FNR, FPR, Frobenius loss, entropy loss
|                          
├── R # source files
|   ├── admm.iters.R
|   ├── display.R
|   ├── eval.R
|   ├── gen_data.R
|   ├── gete.R
|   ├── JGL.R
|   ├── metrics.R
|   └── SSJGL.R
|   
├── environment.yml
├── install_packages.R
├── README.md
└── .gitignore

References

[1] Danaher, P., Wang, P., & Witten, D. M. (2014). The joint graphical lasso for inverse covariance estimation across multiple classes. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 76(2), 373-397.

[2] Li, Z. R., McCormick, T. H., & Clark, S. J. (2019). Bayesian joint spike-and-slab graphical lasso. International Conference on Machine Learning.

Owner

Koyejo Lab @ UIUC