Computationally efficient algorithm that identifies boundary points of a point cloud.

BoundaryTest

Overview

Included are MATLAB and Python packages, each of which implements efficient algorithms for boundary detection and normal vector estimation given a point cloud.

This package implements algorithms described in the paper

Calder, Park, and Slepčev. Boundary Estimation from Point Clouds: Algorithms, Guarantees and Applications. arXiv:2111.03217, 2021.

Download package

You can download the package with the Code button above or by cloning the repository with either of the commands below

git clone git@github.com:sangmin-park0/BoundaryTest
git clone https://github.com/sangmin-park0/BoundaryTest

depending on whether you prefer ssh (first) or https (second).

Usage (MATLAB package)

To use the MATLAB package, simply download the files under the folder bd_test_MATLAB.

  1. If you would like to run some quick examples in a Euclidean space, use the function distballann_norm. You can call the function by
[BP1,BP2,dtb, dtb2] = distballann_norm(n,r,L, eps, domain,dim)

Input arguments are: n (number of points), r (test radius), L (Lipschitz constant of the density from which the points are randomly sampled), eps (boundary thickness), domain (type of domain; 1 for a ball and 2 for an annulus), dim (dimension of the domain).

Outputs are: BP1 and BP2 (boundary points according to 1st order and 2nd order tests respectively, as described in the paper), dtb and dtb2 (the estimated distances from each point to the boundary, again according to 1st and 2nd order tests respectively). For example, the following code

distballann_norm(3000,0.18,2,0.03, 1, 3)

will sample n=3000 points from a ball in d=3 dimensions with radius 0.5 (fixed) from a density with Lipschitz constant L=2, then perform the boundary test using neighborhood radius r=0.18 and boundary thickness eps=0.03. Another example, for an annulus, is

distballann_norm(9000,0.18,2,0.03, 2, 3)

This function will also output the following plots:

  • a plot of the true distance (black) versus dtb (blue hollow dots) and dtb2 (red hollow dots)
  • if the dimension is 2, a plot of the point cloud (black) and the boundary points from the 2nd order test (red hollow dots)
  2. If you already have a point cloud in a Euclidean space and the indices of the points you wish to test for the boundary, that's also fine! To compute boundary points with the test, do the following
nvec = estimated_normal(pts,r)
[bdry_pts,bdry_idx,dists] = bd_Test(pts,nvec,eps,r,test_type,test_idx)

Here, the input arguments are: pts (point cloud), r (neighborhood radius), eps (thickness of the boundary region we want to identify), test_type (type of the test: 1 for 1st order, 2 for 2nd order; optional, default value 2), and test_idx (indices of the points we wish to test for the boundary; optional, the default setting tests all points). The outputs are bdry_pts (boundary points), bdry_idx (indices of the boundary points within pts), and dists (estimated distances from the tested points to the boundary).

If you have a point cloud that lies on some lower-dimensional manifold embedded in a Euclidean space, use bd_Test_manif instead of bd_Test, in the following way

[bdry_pts,bdry_idx,dists] = bd_Test_manif(pts,nvec,eps,r,test_idx)

to obtain the same output. Again, test_idx is an optional argument, and the default setting tests all points. In the manifold setting, the algorithm uses only the 2nd order test.

Usage (Python)

The Python boundary statistic is implemented in the GraphLearning Python package. Install the development version of GraphLearning from GitHub

git clone https://github.com/jwcalder/GraphLearning
cd GraphLearning
python setup.py install --user

The other required package is Annoy, used for fast approximate nearest neighbor searches; it should be installed automatically during the GraphLearning install. The 3D visualizations from our paper are generated with the Mayavi package. Mayavi can be difficult to install and currently has many issues, so any Python code related to Mayavi is commented out. If you have a working Mayavi installation, you can uncomment that code to generate 3D visualizations of the solutions to PDEs on point clouds.

The main function for computing the boundary statistic is graphlearning.boundary_statistic. Below is an example showing how to find boundary points from a random point cloud on the unit box in two dimensions.

import numpy as np
import graphlearning as gl

n = 5000
X = np.random.rand(n,2)  #Random point cloud on the unit box

r = 0.1    #Radius for boundary statistic
eps = 0.02 #Size of boundary tube to detect
S = gl.boundary_statistic(X,r)
bdy_pts = np.arange(n)[S < 3*eps/2]  #Boundary test to find boundary points
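
To visually check which points were flagged, here is a minimal matplotlib sketch (not part of the package; it only reuses X and bdy_pts from the example above):

import matplotlib.pyplot as plt

plt.scatter(X[:,0], X[:,1], s=2, c='black')            #Full point cloud
plt.scatter(X[bdy_pts,0], X[bdy_pts,1], s=8, c='red')  #Detected boundary points
plt.gca().set_aspect('equal')
plt.show()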

The full usage of graphlearning.boundary_statistic is copied below for convenience. The Python folder has scripts for running the experiments from our paper on solving PDEs on point clouds and on detecting the boundary and depth of MNIST images. The only required arguments are X and r; note that the function supports either a rangesearch or a knnsearch for identifying the neighborhoods used in the test.

def boundary_statistic(X,r,knn=False,ReturnNormals=False,SecondOrder=True,CutOff=True,I=None,J=None,D=None):
    """Computes boundary detection statistic
    Args:
        X: nxd point cloud of points in dimension d
        r: radius for test (or number of neighbors if knn=True)
        knn: Use knn version of test (interprets r as number of neighbors)
        ReturnNormals: Whether to return normal vectors as well
        SecondOrder: Use second order test
        CutOff: Whether to use CutOff for second order test.
        I,J,D: Output of knnsearch (Optional, improves runtime if already available)
    Returns:
        Length n numpy array of test statistic. If ReturnNormals=True, then normal vectors are returned as a second argument.
    """

Contact and questions

Please email [email protected] with any questions or comments.

Acknowledgements

The following people have contributed to the development of this software:

  1. Jeff Calder (University of Minnesota)

  2. Dejan Slepčev (Carnegie Mellon University)

License

MIT
