Dual-Resolution Correspondence Network (NeurIPS 2020)

Overview

Dual-Resolution Correspondence Network, NeurIPS 2020

Dependency

All dependencies are included in asset/dualrcnet.yml. You need to install conda first, and then run

conda env create --file asset/dualrcnet.yml 

To activate the environment, run

conda activate dualrcnet
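
To verify that the environment was created correctly, you can check that PyTorch is importable (this assumes the environment file installs PyTorch, which the training and evaluation scripts require):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"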

Preparing data

We train our model on the MegaDepth dataset. To prepare the data, download the MegaDepth SfM models from the MegaDepth website, and download training_pairs.txt and validation_pairs.txt from this link. Then place both training_pairs.txt and validation_pairs.txt under the downloaded directory MegaDepth_v1_SfM.
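
After this step, the directory should look roughly as follows (the scene sub-directories come from the downloaded SfM models; their exact names depend on the MegaDepth release):

MegaDepth_v1_SfM/
    training_pairs.txt
    validation_pairs.txt
    ... (scene sub-directories of the SfM models)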

Training

After downloading the training data, run

python train.py --training_file path/to/training_pairs.txt --validation_file path/to/validation_pairs.txt --image_path path/to/MegaDepth_v1_SfM
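
For example, if MegaDepth_v1_SfM was downloaded to ~/data (an assumed location; adjust to your setup), the command becomes:

python train.py --training_file ~/data/MegaDepth_v1_SfM/training_pairs.txt --validation_file ~/data/MegaDepth_v1_SfM/validation_pairs.txt --image_path ~/data/MegaDepth_v1_SfM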

Pre-trained model

We also provide our pre-trained model. You can download dualrc-net.pth.tar from this link and place it under the directory trained_models.
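
As a quick sanity check (assuming the downloaded file is a standard PyTorch checkpoint archive; the exact contents may differ), you can inspect it with:

python -c "import torch; ckpt = torch.load('trained_models/dualrc-net.pth.tar', map_location='cpu'); print(type(ckpt))"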

Evaluation on HPatches

The dataset can be downloaded from the HPatches repo; you need the HPatches full sequences.
After downloading the dataset:

  1. Browse to HPatches/
  2. Run python eval_hpatches.py --checkpoint path/to/model --root path/to/parent/directory/of/hpatches_sequences. This will generate a text file storing the results in the current directory (an example invocation is shown after this list).
  3. Open draw_graph.py, change the relevant paths accordingly, and run the script to plot the results.
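
As a concrete example, assuming the pre-trained checkpoint is in trained_models/ and the HPatches sequences were extracted under ~/data/hpatches_sequences (both paths are assumptions, adjust to your setup):

python eval_hpatches.py --checkpoint trained_models/dualrc-net.pth.tar --root ~/data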

We provide results of DualRC-Net alongside results of other methods in the directory cache-top.

Evaluation on InLoc

In order to run the InLoc evaluation, you first need to clone the InLoc demo repo, and download and compile all the required dependencies. Then:

  1. Browse to inloc/.
  2. Run python eval_inloc_extract.py adjusting the checkpoint and experiment name (a hypothetical example invocation is given after this list). This will generate a series of matches files in the inloc/matches/ directory that then need to be fed to the InLoc evaluation Matlab code.
  3. Modify the inloc/eval_inloc_compute_poses.m file provided to indicate the path of the InLoc demo repo, and the name of the experiment (the particular directory name inside inloc/matches/), and run it using Matlab.
  4. Use the inloc/eval_inloc_generate_plot.m file to plot the results from the shortlist file generated in the previous stage: /your_path_to/InLoc_demo_old/experiment_name/shortlist_densePV.mat. Precomputed shortlist files are provided in inloc/shortlist.
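
As a hypothetical example of step 2 only (the flag names --checkpoint and --experiment_name are assumptions; check the argument parser of eval_inloc_extract.py for the actual options):

python eval_inloc_extract.py --checkpoint trained_models/dualrc-net.pth.tar --experiment_name dualrc_inloc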

Evaluation on Aachen Day-Night

In order to run the Aachen Day-Night evaluation, you first need to clone the visual localization benchmark repo, and download and compile all the required dependencies (note that you will need to compile COLMAP if you have not done so yet). Then:

  1. Browse to aachen_day_and_night/.
  2. Run python eval_aachen_extract.py adjusting the checkpoint and experiment name.
  3. Copy the eval_aachen_reconstruct.py file to visuallocalizationbenchmark/local_feature_evaluation and run it in the following way:
python eval_aachen_reconstruct.py \
	--dataset_path /path_to_aachen/aachen \
	--colmap_path /local/colmap/build/src/exe \
	--method_name experiment_name
  4. Upload the file /path_to_aachen/aachen/Aachen_eval_[experiment_name].txt to https://www.visuallocalization.net/ to get the results on this benchmark.

BibTex

If you use this code, please cite our paper:

@inproceedings{li20dualrc,
  author    = {Xinghui Li and Kai Han and Shuda Li and Victor Prisacariu},
  title     = {Dual-Resolution Correspondence Networks},
  booktitle = {Conference on Neural Information Processing Systems (NeurIPS)},
  year      = {2020},
}

Acknowledgement

Our code is based on the wonderful code provided by NCNet, Sparse-NCNet and ANC-Net.
