This is the PyTorch re-implementation of IterNorm.

Overview

IterNorm-pytorch

PyTorch re-implementation of the IterNorm method, which is described in the following paper:

Iterative Normalization: Beyond Standardization towards Efficient Whitening

Lei Huang, Yi Zhou, Fan Zhu, Li Liu, Ling Shao

IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019 (accepted). arXiv:1904.03441

This project also provides a PyTorch implementation of Decorrelated Batch Normalization (CVPR 2018, arXiv:1804.08450); for more details, please refer to the Torch project.
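For reference, the core operation of IterNorm is whitening via Newton's iterations, which approximate the inverse square root of the covariance matrix without an eigen-decomposition. Below is a minimal sketch of that step, not the repository's optimized, training-ready layer in ./extension/normalization/iterative_normalization.py (which additionally handles channel groups, running statistics, and the affine transform):

import torch

def iterative_whitening(x, T=5, eps=1e-5):
    # x: (d, m) matrix of d features and m samples; returns the whitened (d, m) matrix.
    d, m = x.size()
    # Center the features.
    xc = x - x.mean(dim=1, keepdim=True)
    # Covariance matrix with a small ridge term for numerical stability.
    sigma = xc.matmul(xc.t()) / m + eps * torch.eye(d, device=x.device)
    # Trace-normalize so that the Newton iterations converge.
    trace = torch.trace(sigma)
    sigma_n = sigma / trace
    # Newton's iterations: P_k = 0.5 * (3 * P_{k-1} - P_{k-1}^3 * Sigma_N).
    p = torch.eye(d, device=x.device)
    for _ in range(T):
        p = 0.5 * (3.0 * p - p.matmul(p).matmul(p).matmul(sigma_n))
    # P_T / sqrt(tr(Sigma)) approximates Sigma^{-1/2}.
    return (p / trace.sqrt()).matmul(xc)

The T here corresponds to the iteration number discussed in the usage tips at the end of this README.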

Requirements and Dependency

  • Install PyTorch with CUDA support (for GPU). (The experiments are validated on Python 3.6.8 and pytorch-nightly 1.0.0.)
  • (Optional, for visualization) install the visdom dependency:
pip install visdom

Experiments

1. VGG networks on the CIFAR-10 dataset:

Run the scripts in ./cifar10/experiments/vgg. Note that the dataset root directory should be set via the '--dataset-root' argument, and the expected dataset layout is:

-<dataset-root>
|-cifar-10-batches-py
||-data_batch_1
||-data_batch_2
||-data_batch_3
||-data_batch_4
||-data_batch_5
||-test_batch

If the dataset does not exist, the script will download it, provided that the dataset-root directory exists.
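The layout above matches what torchvision's CIFAR-10 loader produces. Assuming the experiment scripts rely on torchvision for the download (a hedged sketch; the actual data pipeline lives in the scripts themselves), the download step looks like:

import torchvision

dataset_root = '/path/to/dataset-root'  # the value passed via --dataset-root
# Downloads and extracts cifar-10-batches-py under dataset_root if it is not already present.
train_set = torchvision.datasets.CIFAR10(root=dataset_root, train=True, download=True)
test_set = torchvision.datasets.CIFAR10(root=dataset_root, train=False, download=True)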

2. Wide Residual Networks on the CIFAR-10 dataset:

Run the scripts in ./cifar10/experiments/wrn.

3. ImageNet experiments.

Run the scripts in ./ImageNet/experiment. Note that the ResNet-18 experiments run on one GPU, while ResNet-50/101 run on 4 GPUs in the scripts.

Note that the dataset root directory should be set via the '--dataset-root' argument, and the expected dataset layout is:

-<dataset-root>
|-train
||-class1
||-...
||-class1000  
|-val
||-class1
||-...
||-class1000  
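This is the standard one-sub-directory-per-class layout. Assuming the scripts use torchvision.datasets.ImageFolder, as is typical for ImageNet training code (an assumption, not confirmed by this README), loading it looks like:

import os
import torchvision

dataset_root = '/path/to/dataset-root'  # the value passed via --dataset-root
train_set = torchvision.datasets.ImageFolder(os.path.join(dataset_root, 'train'))
val_set = torchvision.datasets.ImageFolder(os.path.join(dataset_root, 'val'))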

Using IterNorm in other projects/tasks

(1) Copy ./extension/normalization/iterative_normalization.py to the respective directory.

(2) Import the IterNorm class from iterative_normalization.py.

(3) Generally speaking, replace a BatchNorm layer with IterNorm, or add it wherever you want the features/channels to be decorrelated. For efficiency (note that BatchNorm is integrated into cuDNN while IterNorm is implemented in unoptimized PyTorch code), we recommend to 1) replace only the first BatchNorm; 2) insert an extra IterNorm before the first skip connection in a ResNet; or 3) insert it before the final linear classifier, as described in the paper (see the sketch after these steps).

(4) Some tips on the hyperparameters (group size G and iteration number T): we recommend G=64 (i.e., 64 channels per group) and T=5 by default. If you train with a large batch size (e.g., >1024), you can increase either G or T. For fine-tuning, fixing G=64 or G=32 and searching over T={3,4,5,6,7,8} may help.
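A minimal sketch of the drop-in replacement described in steps (3) and (4); the keyword names num_channels (channels per group, i.e. G) and T are assumptions based on the tips above, so check the class definition in iterative_normalization.py for the exact signature:

import torch.nn as nn
from iterative_normalization import IterNorm  # after copying the file as in step (1)

class ConvBlock(nn.Module):
    # Toy conv block illustrating where IterNorm can replace BatchNorm.
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        # Original: self.norm = nn.BatchNorm2d(out_channels)
        self.norm = IterNorm(out_channels, num_channels=64, T=5)  # argument names are assumptions
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.norm(self.conv(x)))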

Owner
Lei Huang
Ph.D. at Beihang University; research interests: deep learning, semi-supervised learning, active learning, and their applications to visual and textual data.