PyTorch implementation of Memory-based semantic segmentation for off-road unstructured natural environments.

MemSeg: Memory-based semantic segmentation for off-road unstructured natural environments

Introduction

This repository is a PyTorch implementation of Memory-based semantic segmentation for off-road unstructured natural environments. This work is based on semseg.

The codebase mainly uses ResNet18, ResNet50, and MobileNet-V2 as backbones with an ASPP module, and can be easily adapted to other basic semantic segmentation structures.

The RUGD dataset is used as the example dataset in the experiments.
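As a rough illustration of this kind of structure (a sketch only, not the repository's code), the following composes a torchvision ResNet50 backbone with an ASPP head; the class name, input size, and class count are placeholders:

```python
# Illustrative sketch: a ResNet backbone feeding an ASPP head, in the spirit of the
# "ResNet50 + Deeplabv3" configuration. Not the repository's implementation.
import torch
import torch.nn.functional as F
import torchvision
from torchvision.models.segmentation.deeplabv3 import ASPP


class SimpleDeepLab(torch.nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        resnet = torchvision.models.resnet50()
        # Drop the average-pool and fully-connected layers; keep the conv trunk.
        self.backbone = torch.nn.Sequential(*list(resnet.children())[:-2])
        self.aspp = ASPP(in_channels=2048, atrous_rates=[12, 24, 36])  # 256-channel output
        self.classifier = torch.nn.Conv2d(256, num_classes, kernel_size=1)

    def forward(self, x):
        size = x.shape[-2:]
        feats = self.aspp(self.backbone(x))
        logits = self.classifier(feats)
        # Upsample logits back to the input resolution.
        return F.interpolate(logits, size=size, mode="bilinear", align_corners=False)


model = SimpleDeepLab(num_classes=25)  # placeholder class count; use the dataset's actual number
out = model(torch.randn(1, 3, 256, 256))
print(out.shape)  # torch.Size([1, 25, 256, 256])
```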

Requirements

Hardware: a GPU with at least 11 GB of memory

Software: Python 3, PyTorch >= 1.0.0
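A quick way to check these requirements before running anything (a minimal sketch, not part of the repository):

```python
# Minimal environment check: PyTorch version and GPU memory (not part of the repo).
import torch

print("PyTorch:", torch.__version__)
assert torch.cuda.is_available(), "A CUDA-capable GPU is required"

props = torch.cuda.get_device_properties(0)
total_gb = props.total_memory / 1024 ** 3
print(f"GPU 0: {props.name}, {total_gb:.1f} GB")
if total_gb < 11:
    print("Warning: less than 11 GB of GPU memory; training may not fit.")
```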

Usage

For installation, follow the steps below, or refer to the instructions described here.

The pretrained ResNet50 backbone model can be downloaded from URL.
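The exact checkpoint format depends on the repository's model definitions, but loading such a backbone typically looks like the sketch below; the file path is a placeholder, and strict=False simply tolerates mismatched keys:

```python
# Illustrative only: loading a downloaded backbone checkpoint into a ResNet50.
# The path and key layout are assumptions; adapt them to the actual checkpoint.
import torch
import torchvision

backbone = torchvision.models.resnet50()
state_dict = torch.load("initmodel/resnet50.pth", map_location="cpu")  # placeholder path
missing, unexpected = backbone.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```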

Getting Started

Installation

  1. Clone this repository.
git clone https://github.com/youngsjjn/MemSeg.git
  2. Install Python dependencies.
pip install -r requirements.txt

Implementation

  1. Download the dataset (i.e. RUGD) and change the data root path in the config file (a quick sanity check of the paths is sketched below).

Download the data list of RUGD here.
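To verify that the data root and the list file agree, a small check like the following can help (a sketch that assumes the common semseg-style list format of one "image_path label_path" pair per line, with paths relative to the data root; both paths below are placeholders):

```python
# Sanity check for the dataset paths (illustrative; assumes semseg-style list files).
import os

data_root = "/path/to/RUGD"        # placeholder: set to the data root used in the config
list_path = "list/rugd/train.txt"  # placeholder: location of the downloaded data list

missing = 0
with open(list_path) as f:
    for line in f:
        if not line.strip():
            continue
        image_rel, label_rel = line.split()[:2]
        for rel in (image_rel, label_rel):
            if not os.path.isfile(os.path.join(data_root, rel)):
                missing += 1
print("missing files:", missing)
```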

  2. Inference. If you want to run inference with the pretrained models, download the pretrained networks from my drive and save them in ./exp/rugd/.

Inference "ResNet50 + Deeplabv3" without the memory module

sh tool/test.sh rugd deeplab50

Inference "ResNet50 + Deeplabv3" with the memory module

sh tool/test_mem.sh rugd deeplab50mem

| Network | mIoU |
| --- | --- |
| ResNet18 + PSPNet | 33.42 |
| ResNet18 + PSPNet (Memory) | 34.13 |
| ResNet18 + Deeplabv3 | 33.48 |
| ResNet18 + Deeplabv3 (Memory) | 35.07 |
| ResNet50 + Deeplabv3 | 36.77 |
| ResNet50 + Deeplabv3 (Memory) | 37.71 |
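For reference, the mIoU reported above is the per-class intersection-over-union averaged over all classes. A minimal sketch of that computation (not the repository's evaluation code):

```python
# Illustrative mIoU computation from a pixel-level confusion matrix (not the repo's code).
import numpy as np

def mean_iou(conf_matrix):
    """conf_matrix[i, j] = pixels with ground-truth class i predicted as class j."""
    intersection = np.diag(conf_matrix).astype(np.float64)
    union = conf_matrix.sum(axis=0) + conf_matrix.sum(axis=1) - intersection
    iou = intersection / np.maximum(union, 1)  # guard against empty classes
    return float(iou.mean())

# Tiny 3-class example:
cm = np.array([[50, 2, 3],
               [4, 40, 1],
               [2, 1, 30]])
print(f"mIoU = {mean_iou(cm):.4f}")
```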
  3. Train (evaluation is included at the end of training).

Train "ResNet50 + Deeplabv3" without the memory module:

sh tool/train.sh rugd deeplab50

Train "ResNet50 + Deeplabv3" with the memory module:

sh tool/train_mem.sh rugd deeplab50mem

The examples above train or test "ResNet50 + Deeplabv3". To train other networks, replace "deeplab50" or "deeplab50mem" with the postfix of the corresponding config file name.

For example, train "ResNet18 + PSPNet" with the memory module:

sh tool/train_mem.sh rugd pspnet18mem

Citation

If you find our work useful and use the code or models in your research, please cite it as follows.

@article{DBLP:journals/corr/abs-2108-05635,
  author    = {Youngsaeng Jin and
               David K. Han and
               Hanseok Ko},
  title     = {Memory-based Semantic Segmentation for Off-road Unstructured Natural
               Environments},
  journal   = {CoRR},
  volume    = {abs/2108.05635},
  year      = {2021},
  url       = {https://arxiv.org/abs/2108.05635},
  eprinttype = {arXiv},
  eprint    = {2108.05635},
  timestamp = {Wed, 18 Aug 2021 19:45:42 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2108-05635.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}