DC3: A Learning Method for Optimization with Hard Constraints

This repository is by Priya L. Donti, David Rolnick, and J. Zico Kolter and contains the PyTorch source code to reproduce the experiments in our paper "DC3: A learning method for optimization with hard constraints."

If you find this repository helpful in your publications, please consider citing our paper.

@inproceedings{donti2021dc3,
  title={DC3: A learning method for optimization with hard constraints},
  author={Donti, Priya and Rolnick, David and Kolter, J Zico},
  booktitle={International Conference on Learning Representations},
  year={2021}
}

Introduction

Large optimization problems with hard constraints arise in many settings, yet classical solvers are often prohibitively slow, motivating the use of deep networks as cheap "approximate solvers." Unfortunately, naive deep learning approaches typically cannot enforce the hard constraints of such problems, leading to infeasible solutions. In this work, we present Deep Constraint Completion and Correction (DC3), an algorithm to address this challenge. Specifically, this method enforces feasibility via a differentiable procedure, which implicitly completes partial solutions to satisfy equality constraints and unrolls gradient-based corrections to satisfy inequality constraints. We demonstrate the effectiveness of DC3 in both synthetic optimization tasks and the real-world setting of AC optimal power flow, where hard constraints encode the physics of the electrical grid. In both cases, DC3 achieves near-optimal objective values while preserving feasibility.
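
To make the completion and correction steps concrete, the following is a minimal PyTorch sketch for the special case of linear constraints A y = x (equalities) and G y <= h (inequalities). All names and shapes here are hypothetical, and for brevity the correction acts directly on the full solution y; in DC3 proper, the correction gradients are taken with respect to the predicted partial variables and propagated through the completion so that the equality constraints remain satisfied. See method.py for the actual implementation.

import torch

def complete(z, x, A):
    # Fill in the remaining variables so that the full solution y = [z, y_rest]
    # satisfies the equality constraints A y = x exactly.
    k = z.shape[-1]
    A_pred, A_rest = A[:, :k], A[:, k:]           # columns for predicted vs. completed variables
    rhs = x - z @ A_pred.T                        # equality residual left after the prediction z
    y_rest = torch.linalg.solve(A_rest, rhs.T).T  # assumes A_rest is square and invertible
    return torch.cat([z, y_rest], dim=-1)

def correct(y, G, h, lr=1e-3, steps=10):
    # Unrolled gradient descent on 0.5 * ||max(G y - h, 0)||^2, shrinking any
    # inequality violations while staying differentiable end to end.
    for _ in range(steps):
        viol = torch.clamp(y @ G.T - h, min=0)    # per-constraint violation (0 where satisfied)
        y = y - lr * viol @ G                     # gradient step on the violation penalty
    return y

At train time, a network maps the problem data x to the partial prediction z; the completed and corrected output is then scored with the objective plus soft penalties on constraint violations, and this loss is backpropagated through both steps.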

Dependencies

  • Python 3.x
  • PyTorch >= 1.8
  • numpy/scipy/pandas
  • osqp: State-of-the-art QP solver
  • qpth: Differentiable QP solver for PyTorch
  • ipopt: Interior point solver
  • pypower: Power flow and optimal power flow solvers
  • argparse: Input argument parsing
  • pickle: Object serialization
  • hashlib: Hash functions (used to generate folder names)
  • setproctitle: Set process titles
  • waitGPU (optional): Intelligently set CUDA_VISIBLE_DEVICES

Instructions

Dataset generation

Datasets for the experiments presented in our paper are available in the datasets folder. These datasets can be generated by running the Python script make_dataset.py within each subfolder (simple, nonconvex, and acopf) corresponding to the different problem types we test.
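
For example, the simple dataset can be regenerated by running python make_dataset.py from within datasets/simple; the nonconvex and acopf datasets are generated the same way from their respective subfolders.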

Running experiments

Our method and baselines can be run using the following Python files:

  • method.py: Our method (DC3)
  • baseline_nn.py: Simple deep learning baseline (NN)
  • baseline_eq_nn.py: Supervised deep learning baseline with completion (Eq. NN)
  • baseline_opt.py: Traditional optimizers (Optimizer)

See each file for relevant flags to set the problem type and method parameters. Notably:

  • --probType: Problem setting to test (simple, nonconvex, or acopf57)
  • --simpleVar, --simpleIneq, --simpleEq, --simpleEx: If the problem setting is simple, the number of decision variables, inequalities, equalities, and datapoints, respectively.
  • --nonconvexVar, --nonconvexIneq, --nonconvexEq, --nonconvexEx: If the problem setting is nonconvex, the number of decision variables, inequalities, equalities, and datapoints, respectively.
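
For example, a DC3 run on the simple problem setting might look like python method.py --probType simple, with any of the flags above appended to override the defaults defined in the script; the baselines take the same problem-type flag (for instance, python baseline_nn.py --probType nonconvex).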

Reproducing paper experiments

You can reproduce the experiments run in our paper (including baselines and ablations) via the bash script run_expers.sh. For instance, the following commands can be used to run these experiments, 8 jobs at a time:

bash run_expers.sh > commands
cat commands | xargs -n1 -P8 -I{} /bin/sh -c "{}"

The script load_results.py can be run to aggregate these results (both while experiments are running, and after they are done). In particular, this script outputs a summary of results across different replicates of the same experiment (results_summary.dict) and information on how many jobs of each type are running or done (exper_status.dict).
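
The .dict outputs appear to be pickled Python dictionaries (pickle is among the dependencies above), so they can be inspected with a short snippet along these lines:

import pickle

# Assumes results_summary.dict is a pickled dictionary keyed by experiment,
# as suggested by the pickle dependency; adjust the path to where it was written.
with open('results_summary.dict', 'rb') as f:
    results_summary = pickle.load(f)

for experiment, stats in results_summary.items():
    print(experiment, stats)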

Generating tables

Tables can be generated via the Jupyter notebook ResultsViz.ipynb. This notebook expects the dictionary results_summary.dict as input; the version of this dictionary generated while running the experiments in the paper is available in this repository.
