Finite basis physics-informed neural networks (FBPINNs)


This repository reproduces the results of the paper Finite Basis Physics-Informed Neural Networks (FBPINNs): a scalable domain decomposition approach for solving differential equations, B. Moseley, T. Nissen-Meyer and A. Markham, Jul 2021, arXiv.


Key contributions

  • Physics-informed neural networks (PINNs) offer a powerful new paradigm for solving problems relating to differential equations
  • However, a key limitation is that PINNs struggle to scale to problems with large domains and/or multi-scale solutions
  • We present finite basis physics-informed neural networks (FBPINNs), which are able to scale to these problems
  • To do so, FBPINNs use a combination of domain decomposition, subdomain normalisation and flexible training schedules
  • FBPINNs outperform PINNs both in accuracy and in the computational resources required

Workflow

FBPINNs divide the problem domain into many small, overlapping subdomains. A neural network is placed within each subdomain, such that within the centre of the subdomain the network learns the full solution, whilst in the overlapping regions the solution is defined as the sum over all overlapping networks.

We use smooth, differentiable window functions to locally confine each network to its subdomain, and the inputs of each network are individually normalised over the subdomain.

In comparison to existing domain decomposition techniques, FBPINNs do not require additional interface terms in their loss function, and they ensure the solution is continuous across subdomain interfaces by the construction of their solution ansatz.
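
To make the construction concrete, below is a minimal 1D sketch of the idea. It uses a hypothetical cosine-taper window and stand-in "networks"; the repository's actual window functions and subdomain networks differ:

import numpy as np

def window(x, lo, hi, overlap):
    # Hypothetical smooth window (cosine taper): equals 1 in the subdomain
    # interior and falls smoothly to 0 across each overlap region.
    ramp = lambda t: 0.5 * (1 - np.cos(np.pi * np.clip(t, 0, 1)))
    return ramp((x - lo) / overlap) * ramp((hi - x) / overlap)

def fbpinn_solution(x, nets, subdomains, overlap):
    # Sum of windowed subdomain networks; each network's input is
    # normalised to roughly [-1, 1] over its own subdomain.
    u = np.zeros_like(x)
    for net, (lo, hi) in zip(nets, subdomains):
        mu, sd = (lo + hi) / 2, (hi - lo) / 2
        u = u + window(x, lo, hi, overlap) * net((x - mu) / sd)
    return u

# Stand-in "networks" (real FBPINNs use small fully connected networks)
nets = [np.tanh, np.sin, np.tanh]
subdomains = [(-3.0, -0.5), (-1.5, 1.5), (0.5, 3.0)]
x = np.linspace(-3, 3, 601)
u = fbpinn_solution(x, nets, subdomains, overlap=1.0)

Because every window is smooth, differentiable and zero outside its subdomain, the summed solution is continuous across subdomain interfaces by construction.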

Installation

FBPINNs requires only standard Python libraries to run.

We recommend setting up a new environment, for example:

conda create -n fbpinns python=3  # Use conda package manager
conda activate fbpinns

and then installing the following libraries:

conda install scipy matplotlib jupyter
conda install pytorch torchvision torchaudio cudatoolkit=10.2 -c pytorch
pip install tensorboardX

All of our work was completed using PyTorch version 1.8.1 with CUDA 10.2.
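
To verify the installation, you can optionally print the installed PyTorch version and check whether CUDA is available:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"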

Finally, download the source code:

git clone https://github.com/benmoseley/FBPINNs.git

Getting started

The workflow to train and compare FBPINNs and PINNs is very simple to set up, and consists of three steps:

  1. Initialise a problems.Problem class, which defines the differential equation (and boundary condition) you want to solve
  2. Initialise a constants.Constants object, which defines all of the other training hyperparameters (domain, number of subdomains, training schedule, etc.)
  3. Pass this Constants object to the main.FBPINNTrainer or main.PINNTrainer class and call the .train() method to start training.

For example, to solve the problem du/dx = cos(wx), whose general solution is u(x) = sin(wx)/w + constant, you can use the following code to train an FBPINN and a PINN:

# The snippet below assumes it is run from inside the fbpinns/ directory,
# where the problems, constants, active_schedulers and main modules live
import numpy as np
import problems, constants, active_schedulers, main

P = problems.Cos1D_1(w=1, A=0)  # initialise problem class

c1 = constants.Constants(
            RUN="FBPINN_%s"%(P.name),  # run name
            P=P,  # problem class
            SUBDOMAIN_XS=[np.linspace(-2*np.pi,2*np.pi,5)],  # defines subdomains
            SUBDOMAIN_WS=[2*np.ones(5)],  # defines width of overlapping regions between subdomains
            BOUNDARY_N=(1/P.w,),  # optional arguments passed to the constraining operator
            Y_N=(0,1/P.w,),  # defines unnormalisation
            ACTIVE_SCHEDULER=active_schedulers.AllActiveSchedulerND,  # training scheduler
            ACTIVE_SCHEDULER_ARGS=(),  # training scheduler arguments
            N_HIDDEN=16,  # number of hidden units in each subdomain network
            N_LAYERS=2,  # number of hidden layers in each subdomain network
            BATCH_SIZE=(200,),  # number of training points
            N_STEPS=5000,  # number of training steps
            BATCH_SIZE_TEST=(400,),  # number of testing points
            )

run = main.FBPINNTrainer(c1)  # train the FBPINN
run.train()

c2 = constants.Constants(
            RUN="PINN_%s"%(P.name),
            P=P,
            SUBDOMAIN_XS=[np.linspace(-2*np.pi,2*np.pi,5)],
            BOUNDARY_N=(1/P.w,),
            Y_N=(0,1/P.w,),
            N_HIDDEN=32,  # the PINN uses a single, larger network
            N_LAYERS=3,
            BATCH_SIZE=(200,),
            N_STEPS=5000,
            BATCH_SIZE_TEST=(400,),
            )

run = main.PINNTrainer(c2)  # train the PINN for comparison
run.train()

The training code will automatically start outputting training statistics, plots and tensorboard summaries. The tensorboard summaries can be viewed by installing tensorboard and then running tensorboard --logdir fbpinns/results/summaries/ from the command line.

Defining your own problems.Problem class

To learn how to define and solve your own problem, see the Defining your own problem Jupyter notebook included in this repository.
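
For orientation, the sketch below shows the kind of information a problem class has to supply. The class and method names here are hypothetical, not the repository's actual interface; see problems.py and the notebook for the real definitions:

import numpy as np

class MyCos1D:
    # Hypothetical sketch of a problem definition for du/dx = cos(w*x)
    # with u(0) = A, whose exact solution is u(x) = A + sin(w*x)/w.
    # The real problems.Problem interface differs - see problems.py.

    def __init__(self, w=1, A=0):
        self.w, self.A = w, A
        self.name = "MyCos1D_w%s" % w

    def residual(self, x, dudx):
        # PDE residual: zero wherever the network satisfies du/dx = cos(w*x)
        return dudx - np.cos(self.w * x)

    def exact_solution(self, x):
        # Known analytical solution, used only for test-time error metrics
        return self.A + np.sin(self.w * x) / self.w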

Reproducing our results

The purpose of each folder is as follows:

  • fbpinns : contains the main code which defines and trains FBPINNs.
  • analytical_solutions : contains a copy of the BURGERS_SOLUTION code used to compute the exact solution to the Burgers equation problem.
  • seismic-cpml : contains a Python implementation of the SEISMIC_CPML finite-difference (FD) library used to solve the wave equation problem.
  • shared_modules : contains generic Python helper functions and classes.

To reproduce the results in the paper, use the following steps:

  1. Run the scripts fbpinns/paper_main_1D.py, fbpinns/paper_main_2D.py and fbpinns/paper_main_3D.py (for example, as shown below). These train and save all of the FBPINNs and PINNs presented in the paper.
  2. Run the notebook fbpinns/Paper plots.ipynb. This generates all of the plots in the paper.
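
For step 1, assuming the scripts are run from inside the fbpinns/ directory:

cd fbpinns
python paper_main_1D.py
python paper_main_2D.py
python paper_main_3D.py

Note that these scripts train a large number of networks and may take a long time to run.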

Further questions?

Please raise a GitHub issue or feel free to contact us.
