[ICCV 2021] HRegNet: A Hierarchical Network for Large-scale Outdoor LiDAR Point Cloud Registration

Introduction

This repository contains the source code and pre-trained models of our ICCV 2021 paper: HRegNet: A Hierarchical Network for Large-scale Outdoor LiDAR Point Cloud Registration.

The overall network architecture is shown below:

Environments

The code mainly requires the libraries listed in requirements.txt; check that file for the complete environment requirements.
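
A typical way to set up those dependencies, assuming an existing Python/PyTorch environment, is to install them straight from that file:

pip install -r requirements.txt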

Please run the following commands to install point_utils:

cd models/PointUtils
python setup.py install

Training device: NVIDIA RTX 3090

Datasets

The point cloud pair lists and the ground-truth relative transformations are stored in data/kitti_list and data/nuscenes_list. The data of the two datasets should be organized as follows:

KITTI odometry dataset

DATA_ROOT
├── 00
│   ├── velodyne
│   ├── calib.txt
├── 01
├── ...
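
For reference, each KITTI odometry velodyne scan is a binary file of float32 values with four channels per point (x, y, z, reflectance). A minimal loading sketch, independent of this repository's own data loaders and using a made-up example path:

import numpy as np

# KITTI velodyne scans store points as consecutive float32 (x, y, z, reflectance) tuples
scan = np.fromfile('DATA_ROOT/00/velodyne/000000.bin', dtype=np.float32).reshape(-1, 4)
xyz = scan[:, :3]  # keep the 3D coordinates, drop reflectance
print(xyz.shape)   # (N, 3)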

NuScenes dataset

DATA_ROOT
├── v1.0-trainval
│   ├── maps
│   ├── samples
│   │   ├── LIDAR_TOP
│   ├── sweeps
│   ├── v1.0-trainval
├── v1.0-test
│   ├── maps
│   ├── samples
│   │   ├── LIDAR_TOP
│   ├── sweeps
│   ├── v1.0-test
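
NuScenes point clouds are usually accessed through the nuscenes-devkit rather than by raw path. A small sketch, assuming the devkit is installed and DATA_ROOT follows the layout above (this is not part of this repository's code):

from nuscenes.nuscenes import NuScenes
from nuscenes.utils.data_classes import LidarPointCloud

# Index the trainval split; the metadata tables live in DATA_ROOT/v1.0-trainval/v1.0-trainval
nusc = NuScenes(version='v1.0-trainval', dataroot='DATA_ROOT/v1.0-trainval', verbose=True)

sample = nusc.sample[0]                      # first annotated sample
lidar_token = sample['data']['LIDAR_TOP']    # token of its LIDAR_TOP sample data
pc = LidarPointCloud.from_file(nusc.get_sample_data_path(lidar_token))
print(pc.points.shape)                       # (4, N): x, y, z, intensity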

Train

The training of the whole network is divided into two steps: we first train the feature extraction module and then train the full network on top of the pretrained features.

Train feature extraction

  • Train the keypoint detector by running sh scripts/train_kitti_det.sh or sh scripts/train_nusc_det.sh; please remember to specify GPU, DATA_ROOT, CKPT_DIR, RUNNAME, and WANDB_DIR in the scripts.
  • Train the descriptor by running sh scripts/train_kitti_desc.sh or sh scripts/train_nusc_desc.sh; please remember to specify GPU, DATA_ROOT, CKPT_DIR, RUNNAME, WANDB_DIR, and PRETRAIN_DETECTOR in the scripts.

Train the whole network

Train the network by running sh scripts/train_kitti_reg.sh or sh scripts/train_nusc_reg.sh; please remember to specify GPU, DATA_ROOT, CKPT_DIR, RUNNAME, WANDB_DIR, and PRETRAIN_FEATS in the scripts.

Update: Pretrained weights for the detector and descriptor are provided in ckpt/pretrained. If you want to train the descriptor, you can set PRETRAIN_DETECTOR to DATASET_keypoints.pth. If you want to train the whole network, you can set PRETRAIN_FEATS to DATASET_feats.pth. An illustrative example of filling in the script variables is shown below.
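
As an illustration only, the variables at the top of scripts/train_kitti_reg.sh might then be filled in along the following lines; all paths and the run name are placeholders, and the weight filename simply follows the DATASET_feats.pth pattern above:

# placeholder values - adjust to your own setup
GPU=0
DATA_ROOT=/path/to/kitti_odometry
CKPT_DIR=/path/to/checkpoints
RUNNAME=hregnet_kitti_reg
WANDB_DIR=/path/to/wandb
PRETRAIN_FEATS=ckpt/pretrained/kitti_feats.pth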

Test

We provide pretrained models in ckpt/pretrained. Run sh scripts/test_kitti.sh or sh scripts/test_nusc.sh, and please remember to specify GPU, DATA_ROOT, and SAVE_DIR in the scripts. The test results will be saved in SAVE_DIR.

Citation

If you find this project useful for your work, please consider citing:

@InProceedings{Lu_2021_HRegNet,
        author = {Lu, Fan and Chen, Guang and Liu, Yinlong and Zhang, Lijun and Qu, Sanqing and Liu, Shu and Gu, Rongqi},
        title = {HRegNet: A Hierarchical Network for Large-scale Outdoor LiDAR Point Cloud Registration},
        booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision},
        year = {2021}
}

Acknowledgments

We would like to thank all the ICCV reviewers and the following open-source projects for their help with the implementation:

  • DGR (point cloud preprocessing and evaluation)
  • PointNet++ (unofficial implementation, for furthest point sampling)