PYGON

A Graph Neural Network Tool for Recovering Dense Sub-graphs in Random Dense Graphs.

Installation

This code requires the graph-measures package to be installed and run. A ready-to-use copy of this package is included (in the "graph_calculations" directory), but it is also possible to remove its contents, download the graph-measures repository and follow the instructions below to run this code. The conda environment for this project (step 2 in the instructions) is required either way.
A detailed explanation of the graph-measures package appears in the manual of the graph-measures repository. Here we give short instructions:

  1. Download the graph-measures project into "graph_calculations/graph_measures".
  2. Create the anaconda environment for running this project by running conda env create -f env.yml in the terminal.
  3. Activate the new environment: conda activate boost.
  4. Move into the directory "graph_calculations/graph_measures/features_algorithms/accelerated_graph_features/src".
  5. Make the feature calculation files for motif calculations: make -f Makefile-gpu.
  6. Great! Now one should be able to run PYGON end-to-end. Remember to work in the boost environment when using this code.

Note that this code was tested only on Unix machines with GPUs; some feature calculations might not work on other machines.
Note also that the only virtual environment we tested is anaconda-based.
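
For convenience, the steps above can be condensed into the following terminal commands. This is only a sketch: the clone address of the graph-measures repository is a placeholder and should be replaced with the actual repository location.

    # Condensed sketch of the installation steps above (assumes git, conda and make are available).
    # The clone URL is a placeholder; replace it with the address of the graph-measures repository.
    git clone <graph-measures-repository-url> graph_calculations/graph_measures
    conda env create -f env.yml    # step 2: create the conda environment (run where env.yml is located)
    conda activate boost           # step 3: activate the environment
    cd graph_calculations/graph_measures/features_algorithms/accelerated_graph_features/src
    make -f Makefile-gpu           # step 5: build the accelerated motif-calculation files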

How to Use

  • The main code directory is "model". The "graph_calculations" directory contains the code for feature calculations and will hold saved pickle files of graphs and their features.

  • To simply try the PYGON model, run python pygon_main.py in a terminal. This will run a simple training of PYGON on G(500, 0.5, 20) graphs, i.e. graphs of 500 vertices with edge probability 0.5 and a planted sub-graph of 20 vertices, which will be built and dumped in "graph_calculations/pkl". One can change the parameters or graph specifications appearing there to try PYGON on graphs of other sizes, edge probabilities, planted sub-graph sizes, planted sub-graph types or model hyper-parameters (see the illustrative sketch after this list).

  • More detailed performance tests can be found in performance_testing.py.

  • To run an NNI experiment on the performance of PYGON, move into "to_run_nni" and configure and run the experiment as guided in NNI's documentation (see the example after this list).

  • The existing algorithms against which we compared our performance, as well as a faster version of PYGON (without dumping or printing anything), can be found in other_algorithms.py.

  • The cleaning stage using the cleaning algorithm can be found in second_stage.py.
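
To illustrate the G(n, p, k) input described above, here is a minimal Python sketch (not part of PYGON's code) that builds an Erdos-Renyi G(n, p) graph with a planted dense sub-graph, here taken to be a clique, on k vertices using networkx:

    # Illustrative sketch only, not PYGON's own graph-building code.
    # Builds a G(n, p) random graph and plants a dense sub-graph (here a clique) on k vertices.
    import random
    import networkx as nx

    def planted_subgraph_instance(n=500, p=0.5, k=20, seed=0):
        rng = random.Random(seed)
        graph = nx.gnp_random_graph(n, p, seed=seed)    # background G(n, p) graph
        planted = rng.sample(range(n), k)               # vertices of the hidden sub-graph
        for i, u in enumerate(planted):
            for v in planted[i + 1:]:
                graph.add_edge(u, v)                    # connect every pair -> planted clique
        return graph, set(planted)

    graph, hidden_vertices = planted_subgraph_instance()
    print(graph.number_of_nodes(), len(hidden_vertices))

PYGON's task on such an instance is to recover the hidden vertex set from the graph alone.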
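
For the NNI experiment mentioned above, launching follows NNI's standard workflow. The configuration file name below is an assumption; use whichever configuration file is provided in "to_run_nni":

    # Example of launching an NNI experiment from the "to_run_nni" directory.
    # The config file name is an assumption; use the configuration file found there.
    cd to_run_nni
    nnictl create --config config.yml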

Owner
Yoram Louzoun's Lab