EVolve

An atmospheric growth and evolution model based on the EVo degassing model and FastChem 2.0, linking planetary mantles to atmospheric chemistry through volcanism.

Overview

EVolve is a linked mantle degassing and atmospheric growth code that models the growth of a rocky planet's secondary atmosphere under the influence of volcanism.

Installation

EVolve is written in Python 3 and is incompatible with Python 2.7. Two useful tools for setting up Python environments:
Pip - package installer for Python
Anaconda - virtual environment manager

  1. Clone the repository with submodules and enter the directory

    git clone --recurse-submodules git@github.com:pipliggins/evolve.git
    

    Note: If you don't clone with submodules, you won't get the two modules required to run EVolve: the EVo volcanic degassing model and the FastChem equilibrium chemistry code.
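
    If the repository has already been cloned without submodules, they can be fetched afterwards from the main EVolve directory with the standard git command:

    git submodule update --init --recursive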

  2. Compile FastChem:

    cd fastchem
    git submodule update --init --recursive
    mkdir build && cd build
    cmake -DUSE_PYTHON=ON ..
    make
    

    This will pull the pybind11 module required for the Python bindings, and compile both the C++ code and the Python bindings that EVolve uses to connect to FastChem.

    Note: FastChem is an external C++ module used to compute atmospheric equilibrium chemistry. To run on Windows, I therefore recommend using WSL (Windows Subsystem for Linux) to make compiling the C++ code easier. If you encounter installation issues relating to the cmake version, the workaround that worked for me is listed at the bottom of this README file, under "Installation help for WSL".
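
    As a quick sanity check that the Python bindings compiled, you can try importing them (a suggestion only: the module is assumed to be called pyfastchem, as in recent FastChem releases, and you may need to add the directory the build placed it in to your Python path):

    python3 -c "import pyfastchem"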

  3. Install dependencies using either Pip or Anaconda (check requirements.txt for the full list). If using Pip, install all dependencies from the main directory of EVolve using

    pip3 install -r requirements.txt
    

    Troubleshooting: the GMPY2 module requires several libraries (GMP, MPFR and MPC) which are not pre-installed on some operating systems, particularly Windows. If the GMPY2 module does not install, or you have other installation issues, try

    pip3 install wheel
    sudo apt install libgmp-dev libmpfr-dev libmpc-dev
    pip3 install -r requirements.txt
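
    If you prefer Anaconda, one possible equivalent setup (a suggestion only, not the only way) is to create and activate a fresh environment and then install the same requirements into it:

    conda create -n evolve python=3
    conda activate evolve
    pip install -r requirements.txt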
    

Running EVolve

EVolve can be run either with or without FastChem equilibrium chemistry in the atmosphere. To run EVolve with FastChem, from the main directory of EVolve run

python evolve.py inputs.yaml --fastchem

The available flags are listed below; they can be combined, as shown in the example that follows.

  • --fastchem: Uses FastChem to run equilibrium chemistry in the atmosphere, producing more chemical species than the magma degassing model uses and allowing the atmospheric equilibrium temperature to be lower than the magmatic temperature.

  • --nocrust: Stops a crustal reservoir from being formed out of the degassed melt which has been erupted. Instead, the degassed melt and any volatiles remaining in it are re-incorporated into the mantle. If this flag is NOT used, the mantle mass will gradually decrease, as no mechanism for re-introducing crustal material into the mantle is implemented here.
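
For example, to run with FastChem atmospheric chemistry while recycling the degassed melt back into the mantle rather than forming a crust, both flags can be passed together:

python evolve.py inputs.yaml --fastchem --nocrust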

All the input files for EVolve and for the submodules EVo and FastChem are stored in the 'inputs' folder:

Filename       | Relevant module | Properties
atm.yaml       | EVolve main     | Sets the pre-existing atmospheric chemistry and the surface pressure and temperature for the planet
mantle.yaml    | EVolve main     | Sets the initial planetary mantle/rocky body properties, including temperature, mass, fO2, the mantle volatile concentrations and the volcanic intrusive:extrusive ratio
planet.yaml    | EVolve main     | Sets generic planetary properties and important run settings, including planetary mass, radius, the amount of mantle melting occurring at each timestep, and the size and number of timesteps the model will run
chem.yaml      | EVo             | Contains the major oxide composition of the magma being input to EVo
env.yaml       | EVo             | Contains the majority of the run settings and volatile contents for the EVo run
output.yaml    | EVo             | Suppresses any graphical output from EVo relative to its default settings
config.input   | FastChem        | Sets the names and locations of FastChem's input and output files, and output settings
parameters.dat | FastChem        | Location of the elemental abundance files, and configuration parameters

Files highlighted in bold should be edited by the user; all others are optimised for EVolve and/or will be edited by the code as it runs. Explanations for each parameter setting in the EVolve files can be found at the bottom of this README file.
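
If you want to check what a given input file contains before a run (the key names are defined in the files themselves and explained at the bottom of this README), one quick way is to load it with PyYAML, e.g. for planet.yaml:

import yaml

# Print the current run settings stored in the EVolve planet input file.
with open("inputs/planet.yaml") as f:
    print(yaml.safe_load(f))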

As EVolve runs, it creates and updates files in the outputs folder as follows:

Filename           | Data
atmosphere_out.csv | Planetary surface pressure and atmospheric composition for tracked molecules in units of volume mixing ratio (actually mole fraction), calculated after each timestep
mantle_out.csv     | Mantle volatile budget and fO2 after each timestep
volc_out.csv       | The final pressure iteration from the EVo output file at each timestep (storing melt volatile contents, atomic volatile contents, gas speciation in mol and wt fractions, etc.)
fc_input.csv       | Generated if fastchem is selected: the input to FastChem after atmospheric mixing, and hydrogen escape if that is occurring, for each timestep
fc_out.csv         | Generated if fastchem is selected: the results from FastChem after each timestep
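
As a quick way to inspect these outputs, the CSV files can be loaded with pandas (an illustrative sketch only; the exact column names depend on which species are tracked in a given run, so they are printed rather than assumed):

import pandas as pd

# Atmospheric history written by EVolve: one row per timestep.
atmosphere = pd.read_csv("outputs/atmosphere_out.csv")

# Column names vary with the tracked molecules, so list them before use.
print(atmosphere.columns.tolist())

# Show the final few timesteps of surface pressure and composition.
print(atmosphere.tail())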

Installation help for WSL

If you see an error saying that the installed version of cmake is too low to install FastChem, try the commands below. Please note this is just a suggestion based on what worked for me; try these workarounds at your own risk!

sudo apt-get update
sudo apt-get install apt-transport-https ca-certificates gnupg software-properties-common wget

wget -O - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | sudo apt-key add -

sudo apt-add-repository 'deb https://apt.kitware.com/ubuntu/ bionic main'
sudo apt-get update

sudo apt-get install cmake