GT China coal model

Overview

The full version of a China coal transport model with very high spatial resolution.

What it does

The code works in a few steps:

  1. Take easily understandable and readable xlsx input files on networks, plants, demand, etc., and create project build files from these (done in R).
  2. Take the build files and create an LP problem file from them (in Python, either locally or on AWS SageMaker).
  3. Solve the problem from the LP problem file, and write the solution to a txt file (either in the CPLEX interactive optimizer or in Python; a sketch of this step is shown below).
  4. Process the solution txt file and write it to easily understandable and readable xlsx (in R).

The packages required to run these scripts are included in environment.yml (for Python) and the renv.lock file (for R; this first requires installation of the renv package: https://rstudio.github.io/renv/articles/renv.html; after installation, run renv::restore()).
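For step 3, a minimal sketch of solving the LP file with the CPLEX Python API could look like the following. The file names and the plain-text output format are illustrative assumptions, not the repository's actual conventions:

```python
# Sketch of step 3: read an LP problem file, solve it with the CPLEX Python API,
# and dump the solution to a plain-text file. File names are placeholders.
import cplex

problem = cplex.Cplex("coal_model.lp")   # LP problem file produced in step 2
problem.solve()

with open("solution.txt", "w") as f:
    f.write(f"objective: {problem.solution.get_objective_value()}\n")
    for name, value in zip(problem.variables.get_names(),
                           problem.solution.get_values()):
        if value != 0:                   # keep the output file small
            f.write(f"{name}\t{value}\n")
```

The same LP file can also be loaded and solved in the CPLEX interactive optimizer, as noted in step 3.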

The model

The model minimizes the total cost of production, transport, and transmission.
Production means coal mining costs; transport means rail, truck, riverborne, and ocean-going transport plus handling costs; transmission means inter-provincial transport of electricity via UHV cables.
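As an illustration only (the notation is ours, not the repository's), the objective can be written as:

```latex
\min_{s,\,x,\,e} \;
\sum_{m,c} p_{m,c}\, s_{m,c}          % coal mining cost
\;+\; \sum_{l,c} t_{l}\, x_{l,c}      % transport and handling cost
\;+\; \sum_{u} \tau_{u}\, e_{u}       % UHV transmission cost
```

where s_{m,c} is the supply of coal type c from mine m, x_{l,c} the flow of coal type c along link l, e_u the electricity sent over UHV link u, and p, t, and tau the corresponding unit costs.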

Constraints in the optimization

The constraints in the mini testbench are the same as in the full model. These are:

  • Mines (or any other node) cannot supply types of coal they do not produce.
  • The flow of coal of each type out of a node cannot exceed the flow of that type into the node plus the node's supply (with supply being non-zero only for mines); see the formal sketch after this list.
  • The energy content of the supply plus the flows of coal of each type into a node has to be at least equal to the demand for electricity, plus other thermal coal demand, plus the energy content of the flows of coal of each type out of that node. Note that only mines can supply coal, all demand for electrical power occurs in provincial demand nodes, and demand for other thermal coal is placed at city-level nodes.
  • The amount of hard coking coal (HCC) flowing into a node has to be at least equal to the steel demand multiplied by 0.581. Note that all steel demand is placed in provincial-level steel demand nodes, which are connected with uni-directional links from steel plants to steel demand nodes. This means no coal can flow out of a steel demand node and we do not need further formulae for mass balances. Also note that we assume the coking coal mix needed to produce a tonne of steel is 581 kg of hard coking coal (HCC), 176 kg of soft coking coal (SCC), and 179 kg of pulverized coal for injection (PCI).
  • The amount of soft coking coal (SCC) flowing into a node has to be at least equal to the amount of HCC flowing into that node, multiplied by 0.176/0.581.
  • The amount of pulverized coal for injection (PCI) flowing into a node has to be at least equal to the amount of HCC flowing into that node, multiplied by 0.179/0.581.
  • The total volume of all coal types transported along a link cannot exceed the transport capacity of that link. Note that this constraint is applied only to links with a finite transport capacity. In practice this means rail links are assumed to have a finite transport capacity, while ocean routes, rivers, and road links are assumed to have infinite capacity.
  • The total amount of energy transported along a link cannot exceed the transmission capacity of that link. That is, the amount of each coal type multiplied by the energy content of each coal type cannot exceed the electrical transmission capacity of the link. This constraint is applied only to links between power plant units and provincial electricity demand nodes, and to UHV transmission links between provincial electricity demand nodes; these are the only links along which electrical energy is transported, while all other links transport physical quantities of coal. This constraint simultaneously deals with the production capacity (MW) and conversion efficiency of power plants: the energy transported over a link cannot exceed the volume of each coal type, multiplied by the energy content of each coal type, multiplied by the energy conversion factor of the link. For links between coal-fired power plant units and provincial electricity demand nodes, this factor equals the conversion efficiency of the power plant unit. For UHV transmission links between two provincial-level electricity demand nodes, it equals (1 - transmission losses) over that UHV line, with transmission losses calculated from the transmission distance and a benchmark loss for UHV-DC or UHV-AC lines.
  • The handling capacity of ports cannot be exceeded. Specifically, the total amount of coal flowing out of a port cannot exceed its handling capacity.
  • The production capacity of steel plants cannot be exceeded. Specifically, the total amount of hard coking coal, soft coking coal, and pulverized coal for injection flowing out of a steel plant node (and towards a provincial steel demand node) cannot exceed the steel plant's production capacity multiplied by 0.581 + 0.176 + 0.179 (= 0.936 tonnes of coking coal per tonne of steel), the mix of different coking coals needed to produce steel.
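As an illustrative formalization of the first two flow constraints above (the notation is ours, not the repository's), with x_{l,c} the flow of coal type c on link l, s_{n,c} the supply of coal type c at node n, and K_l the transport capacity of link l:

```latex
% mass balance of coal type c at node n (s_{n,c} > 0 only for mines)
\sum_{l \in \mathrm{out}(n)} x_{l,c} \;\le\; \sum_{l \in \mathrm{in}(n)} x_{l,c} \;+\; s_{n,c}
  \qquad \forall\, n,\, c

% transport capacity of a finite-capacity (rail) link l
\sum_{c} x_{l,c} \;\le\; K_{l}
  \qquad \forall\, l \in \mathcal{L}_{\mathrm{rail}}
```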

Technical notes

  • All transport costs are pre-calculated for each link, and include a fixed handling cost and a distance-based transport cost, depending on the type of handling (at origin and destination) and the type of transport (separately for rail, truck, riverborne, and ocean-going transport; a small number of coal rail lines has specific handling and transport costs). A sketch of this calculation follows this list.
  • Some of the capacities are already reported in the input sheet for the edges; the physical transport capacity is taken from this sheet. Capacities of ports, steel plants, and electrical transmission capacities are taken from the separate port/steel plant/electrical capacities sheets.
  • An example lp file is included to make this repository as self-contained as possible. This lp file is zipped to stay within GitHub file size limits.
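A minimal sketch of the per-link cost pre-calculation described in the first note above; all names and numbers are placeholders for illustration, not the repository's actual values:

```python
# Illustrative pre-calculation of a link's transport cost (RMB per tonne):
# fixed handling costs at origin and destination plus a distance-based cost
# that depends on the transport mode. Values below are assumed, not real data.

HANDLING_COST = {"mine": 5.0, "rail_yard": 8.0, "port": 12.0}            # RMB/t (assumed)
COST_PER_KM = {"rail": 0.10, "truck": 0.50, "river": 0.08, "ocean": 0.05}  # RMB/t-km (assumed)

def link_cost(origin_type: str, destination_type: str, mode: str, distance_km: float) -> float:
    """Total cost of moving one tonne of coal along a single link."""
    handling = HANDLING_COST[origin_type] + HANDLING_COST[destination_type]
    return handling + COST_PER_KM[mode] * distance_km

# Example: a 650 km rail link from a mine to a port
print(link_cost("mine", "port", "rail", 650))   # 5 + 12 + 0.10 * 650 = 82.0 RMB/t
```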

Contributions

This project was developed by:

  • Jorrit Gosens: conceptualization, bulk of the data preparation, software implementation and debugging in R & Python;
  • Alex H. Turnbull: conceptualization, some data preparation, proof-reading and debugging of software.

License

MIT License as separately included.
In short, do what you want with this script, but refer to the original authors when you use or develop this code.
