SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data (AAAI 2021)

Overview

PyTorch implementation of SnapMix | paper

Method Overview

(Figure: overview of the SnapMix method.)
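
In brief, SnapMix pastes a randomly sampled patch from one training image into another (as in CutMix), but sets the two label weights by how much ground-truth semantic content is actually removed from the target image and pasted in from the source image, estimated from each image's class activation map normalized to sum to one (the Semantic Percent Map). The snippet below is a minimal, illustrative sketch of that idea, not the code in this repository; the spm_a/spm_b inputs (normalized CAMs of the ground-truth classes) and the helper names are assumptions made for the example.

import numpy as np
import torch
import torch.nn.functional as F

def rand_bbox(H, W, lam):
    # sample a box covering roughly a `lam` fraction of an H x W image (CutMix-style)
    cut = np.sqrt(lam)
    ch, cw = int(H * cut), int(W * cut)
    cy, cx = np.random.randint(H), np.random.randint(W)
    y1, y2 = np.clip(cy - ch // 2, 0, H), np.clip(cy + ch // 2, 0, H)
    x1, x2 = np.clip(cx - cw // 2, 0, W), np.clip(cx + cw // 2, 0, W)
    return int(y1), int(y2), int(x1), int(x2)

def snapmix_pair(x_a, x_b, spm_a, spm_b, beta=5.0):
    # x_a, x_b: images of shape (C, H, W); spm_a, spm_b: (H, W) maps that each sum to 1
    _, H, W = x_a.shape
    lam_a = np.random.beta(beta, beta)          # area fraction cut out of x_a
    lam_b = np.random.beta(beta, beta)          # area fraction taken from x_b
    ya1, ya2, xa1, xa2 = rand_bbox(H, W, lam_a)
    yb1, yb2, xb1, xb2 = rand_bbox(H, W, lam_b)
    if min(ya2 - ya1, xa2 - xa1, yb2 - yb1, xb2 - xb1) == 0:
        return x_a.clone(), 1.0, 0.0            # degenerate box: keep x_a unchanged
    mixed = x_a.clone()
    patch = x_b[:, yb1:yb2, xb1:xb2].unsqueeze(0)
    patch = F.interpolate(patch, size=(ya2 - ya1, xa2 - xa1),
                          mode="bilinear", align_corners=False)
    mixed[:, ya1:ya2, xa1:xa2] = patch.squeeze(0)
    # semantically proportional label weights (they need not sum to 1)
    rho_a = 1.0 - spm_a[ya1:ya2, xa1:xa2].sum().item()
    rho_b = spm_b[yb1:yb2, xb1:xb2].sum().item()
    return mixed, rho_a, rho_b

The training loss then weights the cross-entropy against both ground-truth labels by rho_a and rho_b (see the training section below).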

Cite

@inproceedings{huang2021snapmix,
    title={SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data},
    author={Huang, Shaoli and Wang, Xinchao and Tao, Dacheng},
    year={2021},
    booktitle={AAAI Conference on Artificial Intelligence},
}

Setup

Install Package Dependencies

torch
torchvision 
PyYAML
easydict
tqdm
scikit-learn
efficientnet_pytorch
pandas
opencv

Datasets

Create a soft link to each dataset directory:

CUB dataset

ln -s /your-path-to/CUB-dataset data/cub

Car dataset

ln -s /your-path-to/Car-dataset data/car

Aircraft dataset

ln -s /your-path-to/Aircraft-dataset data/aircraft

Training

Training with ImageNet pre-trained weights

1. Baseline and Baseline+

To train a model on the CUB dataset using the ResNet-50 backbone:

python main.py # baseline

python main.py --midlevel # baseline+

To train a model on other datasets with other network backbones, specify the following arguments:

--netname: network architecture name (four network families are supported: ResNet, DenseNet, InceptionV3, EfficientNet); a sketch of how such names can map to backbone constructors follows the examples below

--dataset: dataset name (cub, car, or aircraft)

For example,

python main.py --netname resnet18 --dataset cub # using the ResNet-18 backbone on the CUB dataset

python main.py --netname efficientnet-b0 --dataset cub # using the EfficientNet-b0 backbone on the CUB dataset

python main.py --netname inceptionV3 --dataset aircraft # using the InceptionV3 backbone on the Aircraft dataset
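
For reference, here is a sketch of how such a --netname string could be resolved to a backbone constructor. This is only an assumption based on the torchvision and efficientnet_pytorch packages listed in the dependencies, not the repository's actual model factory:

import torchvision.models as tvm
from efficientnet_pytorch import EfficientNet

def build_backbone(netname, pretrained=True):
    # EfficientNet names look like "efficientnet-b0", "efficientnet-b4", ...
    if netname.startswith("efficientnet"):
        return EfficientNet.from_pretrained(netname) if pretrained else EfficientNet.from_name(netname)
    factories = {
        "resnet18": tvm.resnet18,
        "resnet34": tvm.resnet34,
        "resnet50": tvm.resnet50,
        "resnet101": tvm.resnet101,
        "densenet121": tvm.densenet121,
        "inceptionV3": tvm.inception_v3,
    }
    return factories[netname](pretrained=pretrained)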

2. Training with mixing augmentation

Applying SnapMix in training (we used the hyperparameter values prob=1.0 and beta=5 for SnapMix in most experiments):

python main.py --mixmethod snapmix --beta 5 --netname resnet50 --dataset cub # baseline

python main.py --mixmethod snapmix --beta 5 --netname resnet50 --dataset cub --midlevel # baseline+

Applying other augmentation methods (currently supported: cutmix, cutout, and mixup) in training:

python main.py --mixmethod cutmix --beta 3 --netname resnet50 --dataset cub # training with CutMix

python main.py --mixmethod mixup --prob 0.5 --netname resnet50 --dataset cub # training with MixUp
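
For context: as in CutMix and MixUp, --beta parameterizes the Beta distribution from which the mixing area is sampled, and --prob is the probability of applying the augmentation to a batch. Whichever method is used, it produces two labels and two weights per sample, which enter the loss roughly as sketched below (illustrative only; the names are not the repository's API):

import torch
import torch.nn.functional as F

def mixed_cross_entropy(logits, y_a, y_b, rho_a, rho_b):
    # rho_a, rho_b: per-sample label weights of shape (batch,);
    # for SnapMix they are the estimated semantic proportions and need not sum to 1
    ce_a = F.cross_entropy(logits, y_a, reduction="none")
    ce_b = F.cross_entropy(logits, y_b, reduction="none")
    return (rho_a * ce_a + rho_b * ce_b).mean()

# toy usage with hypothetical shapes (200 classes, as in CUB)
logits = torch.randn(4, 200)
y_a, y_b = torch.randint(0, 200, (4,)), torch.randint(0, 200, (4,))
rho_a, rho_b = torch.rand(4), torch.rand(4)
loss = mixed_cross_entropy(logits, y_a, y_b, rho_a, rho_b)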

3. Results

ResNet architecture.

Backbone   | Method              | CUB    | Car    | Aircraft
-----------|---------------------|--------|--------|---------
ResNet-18  | Baseline            | 82.35% | 91.15% | 87.80%
ResNet-18  | Baseline + SnapMix  | 84.29% | 93.12% | 90.17%
ResNet-34  | Baseline            | 84.98% | 92.02% | 89.92%
ResNet-34  | Baseline + SnapMix  | 87.06% | 93.95% | 92.36%
ResNet-50  | Baseline            | 85.49% | 93.04% | 91.07%
ResNet-50  | Baseline + SnapMix  | 87.75% | 94.30% | 92.08%
ResNet-101 | Baseline            | 85.62% | 93.09% | 91.59%
ResNet-101 | Baseline + SnapMix  | 88.45% | 94.44% | 93.74%
ResNet-50  | Baseline+           | 87.13% | 93.80% | 91.68%
ResNet-50  | Baseline+ + SnapMix | 88.70% | 95.00% | 93.24%
ResNet-101 | Baseline+           | 87.81% | 93.94% | 91.85%
ResNet-101 | Baseline+ + SnapMix | 89.32% | 94.84% | 94.05%

InceptionV3 architecture.

Backbone    | Method              | CUB
------------|---------------------|-------
InceptionV3 | Baseline            | 82.22%
InceptionV3 | Baseline + SnapMix  | 85.54%

DenseNet architecture.

Backbone    | Method              | CUB
------------|---------------------|-------
DenseNet121 | Baseline            | 84.23%
DenseNet121 | Baseline + SnapMix  | 87.42%

Training from scratch

To train a model without using ImageNet pretrained weights:

python main.py --mixmethod snapmix --prob 0.5 --netname resnet18 --dataset cub --pretrained 0 # resnet-18 backbone

python main.py --mixmethod snapmix --prob 0.5 --netname resnet50 --dataset cub --pretrained 0 # resnet-50 backbone

Results

Backbone  | Method              | CUB
----------|---------------------|-------
ResNet-18 | Baseline            | 64.98%
ResNet-18 | Baseline + SnapMix  | 70.31%
ResNet-50 | Baseline            | 66.92%
ResNet-50 | Baseline + SnapMix  | 72.17%