Spectral Nonlocal Block

Overview

Official implementation of the paper: Unifying Nonlocal Blocks for Neural Networks (ICCV'21)

Spectral View of Nonlocal Block

Our work provides a novel perspective for designing non-local blocks, called the Spectral View of the Non-local block. In this view, a non-local block can be seen as applying a set of graph filters on a fully connected weighted graph. The spectral view helps to theoretically analyze existing non-local blocks and to design novel non-local blocks with tools from graph signal processing (e.g., graph neural networks).
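
Seen this way, the core of a non-local block is simply an affinity (adjacency) matrix of a fully connected graph applied to the node features. The following minimal PyTorch sketch illustrates the idea; it is not the code of this repository, and the dot-product-plus-softmax affinity is assumed only for simplicity.

```python
import torch

def nonlocal_as_graph_filter(z):
    """z: node features of shape (N, C), one node per spatial position."""
    # Affinity matrix of the fully connected weighted graph; plain dot-product
    # similarity followed by a softmax is assumed here for simplicity.
    affinity = torch.softmax(z @ z.t(), dim=-1)   # (N, N)
    # Applying the affinity matrix to the features is one graph filtering step:
    # each output node aggregates information from every other node.
    return affinity @ z                           # (N, C)

z = torch.randn(64, 32)   # e.g. an 8x8 feature map flattened into 64 nodes with 32 channels
out = nonlocal_as_graph_filter(z)
```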

Spectral Nonlocal Block

This repository gives the implementation of the Spectral Nonlocal Block (SNL), which is theoretically designed with the help of the first-order Chebyshev graph convolution. The structure of the SNL is given below:

Two main differences distinguish SNL from existing nonlocal blocks and allow it to account for the graph spectrum (a simplified code sketch follows this list):

  1. SNL uses a symmetric affinity matrix to ensure that the graph Laplacian of the fully connected weighted graph is diagonalizable.
  2. SNL uses the normalized Laplacian to conform to the upper bound of the maximum eigenvalue (equal to 2) for an arbitrary graph structure.
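
As a concrete illustration of these two properties, the simplified PyTorch sketch below symmetrizes a dot-product affinity, normalizes it in the D^{-1/2} A D^{-1/2} style, and applies a first-order Chebyshev-type filter with a residual connection. It is an assumption-laden sketch (linear layers instead of 1x1 convolutions, ReLU to keep affinities nonnegative), not the exact SNL block; see the paper and the code in this repository for the real definition.

```python
import torch
import torch.nn as nn

class SimpleSNLSketch(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Two learned transforms; in the real block these are 1x1 convolutions.
        self.w1 = nn.Linear(channels, channels, bias=False)
        self.w2 = nn.Linear(channels, channels, bias=False)

    def forward(self, z):
        """z: node features of shape (N, C)."""
        m = z @ z.t()                                      # raw pairwise affinity (N, N)
        a = 0.5 * (m + m.t())                              # 1) symmetrize -> diagonalizable Laplacian
        a = torch.relu(a)                                  # keep edge weights nonnegative (assumption)
        d = a.sum(dim=-1).clamp(min=1e-6)                  # node degrees
        a_norm = a / torch.sqrt(d[:, None] * d[None, :])   # 2) D^{-1/2} A D^{-1/2} normalization
        # First-order Chebyshev-style filter: identity term plus normalized-affinity term,
        # added back to the input as a residual.
        return z + self.w1(z) + self.w2(a_norm @ z)

block = SimpleSNLSketch(channels=32)
out = block(torch.randn(64, 32))
```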

More nonlocal blocks defined with other types of graph filters, for example the Chebyshev filter, the ARMA filter, and the Cayley filter, will be released soon.

Getting Started

Requirements

PyTorch >= 0.4.1

Python >= 3.5

torchvision >= 0.2.1

termcolor >= 1.1.0

tensorboardX >= 1.9

opencv >= 3.4

Classification

To train the SNL:

  1. Install the conda environment using "env.yml".
  2. Set --data_dir to the root directory of the dataset in "train_snl.sh".
  3. Set --dataset to the train/val dataset (cifar10/cifar100/imagenet).
  4. Set --backbone to the backbone type (we suggest preresnet for CIFAR and resnet for ImageNet).
  5. Set --arch to the backbone depth (we suggest 20/56 for preresnet and 50 for resnet).
  6. Other parameters such as the learning rate and batch size can be found/set in "train_val.py" (a hedged sketch of these options follows this list).
  7. Run the code with: "sh train_snl.sh".
  8. The training log and checkpoints are saved in "save_model".
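
The options referenced in the steps above are command-line flags. Below is a hedged sketch of how such flags might be declared; the authoritative names and defaults live in "train_val.py", and the defaults shown here are illustrative assumptions only.

```python
import argparse

# Illustrative sketch only: flag names follow the steps above, defaults are assumptions.
parser = argparse.ArgumentParser(description="Train the SNL block for classification")
parser.add_argument("--data_dir", type=str, help="root directory of the dataset")
parser.add_argument("--dataset", type=str, choices=["cifar10", "cifar100", "imagenet"])
parser.add_argument("--backbone", type=str, help="preresnet for CIFAR, resnet for ImageNet")
parser.add_argument("--arch", type=int, help="backbone depth, e.g. 20/56 (preresnet) or 50 (resnet)")
parser.add_argument("--lr", type=float, default=0.1, help="learning rate (assumed default)")
parser.add_argument("--batch_size", type=int, default=128, help="batch size (assumed default)")
args = parser.parse_args()
print(args)
```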

Semantic Segmentation

We also provide the modules/configs implemented for semantic segmentation based on the mmsegmentation framework. One can register our SNL block and train it for semantic segmentation (Cityscapes) by following their steps (a hedged registration sketch is given below).
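
As a hedged sketch of what such a registration could look like with the mmcv/mmsegmentation registries (this is an assumption for illustration, not the registration code shipped in this repository; the class name SNLBlock and its body are placeholders):

```python
import torch.nn as nn
from mmcv.cnn.bricks.registry import PLUGIN_LAYERS  # mmcv 1.x registry for ResNet-stage plugins

@PLUGIN_LAYERS.register_module()
class SNLBlock(nn.Module):
    """Placeholder body; the real block applies the spectral nonlocal operation."""

    def __init__(self, in_channels):
        super().__init__()
        self.in_channels = in_channels

    def forward(self, x):
        return x

# Once registered, a backbone config can reference the block by name, e.g.
# plugins=[dict(cfg=dict(type='SNLBlock'), position='after_conv3')].
```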

Citation

@InProceedings{Lei_2021_ICCV,
title = {Unifying Nonlocal Blocks for Neural Networks},
author = {Zhu, Lei and She, Qi and Li, Duo and Lu, Yanye and Kang, Xuejing and Hu, Jie and Wang, Changhu},
booktitle = {IEEE International Conference on Computer Vision (ICCV)},
month = {October},
year = {2021}
}

Acknowledgement

This code and our experiments are based on the released code of CGNL, the mmsegmentation framework, and the 3D-ResNet framework. We thank the authors for their remarkable work.
