Cloud Transformers: A Universal Approach To Point Cloud Processing Tasks

This is the official PyTorch code repository for the paper "Cloud Transformers: A Universal Approach To Point Cloud Processing Tasks" (ICCV 2021, https://arxiv.org/abs/2007.11679).

Here we present a versatile point cloud processing block that yields state-of-the-art results on many tasks.
The key idea is to process point clouds with many different cheap low-dimensional projections followed by standard convolutions, and we do so both in parallel and sequentially.
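
To make the idea concrete, here is a minimal, self-contained PyTorch sketch of a single "splat, convolve, slice" head. This is not the repository's implementation: all names are hypothetical, and it uses hard nearest-cell rasterization for brevity, whereas the paper's Splat/Slice operations are differentiable (bilinear).

import torch
import torch.nn as nn

class ToySplatConvSlice(nn.Module):
    """One head: learned 2-D projection -> splat -> conv -> slice (illustrative only)."""

    def __init__(self, dim, grid=16):
        super().__init__()
        self.grid = grid
        self.project = nn.Linear(3, 2)              # learned 3-D -> 2-D projection
        self.conv = nn.Conv2d(dim, dim, 3, padding=1)

    def forward(self, xyz, feats):
        # xyz: (B, N, 3) point coordinates; feats: (B, N, C) point features
        B, N, C = feats.shape
        g = self.grid
        # Project each point onto a cell of a g x g grid (hard rounding here,
        # unlike the paper's differentiable bilinear rasterization).
        uv = torch.sigmoid(self.project(xyz)) * (g - 1)
        cell = uv.round().long()                    # (B, N, 2) integer cells
        flat = cell[..., 0] * g + cell[..., 1]      # (B, N) flattened cell ids
        idx = flat.unsqueeze(-1).expand(-1, -1, C)  # (B, N, C) scatter index
        # Splat: average the features of all points landing in the same cell.
        canvas = feats.new_zeros(B, g * g, C).scatter_add_(1, idx, feats)
        count = feats.new_zeros(B, g * g, 1).scatter_add_(
            1, flat.unsqueeze(-1), torch.ones_like(feats[..., :1]))
        canvas = canvas / count.clamp(min=1)
        # Run a standard 2-D convolution over the rasterized feature map.
        img = self.conv(canvas.transpose(1, 2).reshape(B, C, g, g))
        # Slice: read the convolved features back at each point's cell.
        return img.reshape(B, C, g * g).transpose(1, 2).gather(1, idx)

In the actual block, several such heads with different learned projections run in parallel, and stacking such blocks gives the sequential part.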

Datasets

We provide links to the datasets we used for training and evaluation. After unpacking and preparing the data, please edit the dataset path (the data:path field) in configs/*.yaml.
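
For reference, the edited part of a config might look like the following hypothetical excerpt (only data:path is the field named above; the surrounding structure is illustrative):

data:
  path: /path/to/unpacked/dataset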

Pre-trained models

We provide our pre-trained models' weights in a single archive.

Building Dependencies

To install and build all required modules, please run:

bash ./install_deps.sh

Code Structure

The core operations (rasterization, Splat, and de-rasterization, Slice) are implemented in layers/cloud_transform.py, while layers/mutihead_ct_*.py provides slightly different versions of the Multi-Headed Cloud Transform (MHCT).

The model zoo resides in model_zoo, where the models for the corresponding tasks are built from Multi-Headed Cloud Transforms.

Run

We train our models in a multi-GPU setting using DistributedDataParallel. To train on n GPUs, please run the following commands (one per rank):

python train_${SCRIPT_NAME}.py ${EXP_NAME} -c configs/${CONFIG_NAME}.yaml --master localhost:3315 --rank 0 --num_nodes n
...
python train_${SCRIPT_NAME}.py ${EXP_NAME} -c configs/${CONFIG_NAME}.yaml --master localhost:3315 --rank <n-1> --num_nodes n
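
For example, a two-GPU run launches one process per rank, in separate shells; the experiment name my_experiment is illustrative, and the script/config names depend on the task:

python train_${SCRIPT_NAME}.py my_experiment -c configs/${CONFIG_NAME}.yaml --master localhost:3315 --rank 0 --num_nodes 2
python train_${SCRIPT_NAME}.py my_experiment -c configs/${CONFIG_NAME}.yaml --master localhost:3315 --rank 1 --num_nodes 2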

The evaluation scripts follow almost the same semantics:

python eval_${SCRIPT_NAME}.py ${EXP_NAME} -c configs/eval/${CONFIG_NAME}.yaml

Cite

If you find our work helpful, please do not hesitate to cite us.

@inproceedings{mazur2021cloudtransformers,
  title={Cloud Transformers: A Universal Approach To Point Cloud Processing Tasks},
  author={Mazur, Kirill and Lempitsky, Victor},
  booktitle={International Conference on Computer Vision (ICCV)},
  year={2021}
}
