Deep Hedging Demo - An Example of Using Machine Learning for Derivative Pricing.

Overview

[Image of Demo]

1) Jupyter version: Run ./colab/deep_hedging_colab.ipynb on Colab.

2) GUI version: Run python ./pyqt5/main.py. Check ./requirements.txt for the main dependencies.

The Black-Scholes (BS) model – developed in 1973 and based on Nobel Prize-winning work – has been the de facto standard for pricing options and other financial derivatives for nearly half a century. Under the assumption of a perfect financial market, the model can be used to calculate an option's price and the associated risk sensitivities. In theory, a trader can then use these risk sensitivities to construct a perfect hedging strategy that eliminates all risk in a portfolio of options. However, the necessary conditions for a perfect financial market, such as zero transaction costs and the possibility of continuous trading, are difficult to meet in the real world. Therefore, in practice, banks have to rely on their traders' intuition and experience to augment the BS model hedges with manual adjustments that account for these market imperfections. The derivatives desk of every bank hedges its positions, and its PnL and risk exposure depend crucially on the quality of those hedges. If the hedges do not properly account for market imperfections, a bank might underestimate the true risk exposure of its portfolio. On the other hand, if the hedges overestimate the cost of market imperfections, a bank might overprice its positions (relative to its competitors) and hence risk losing trades and/or customers. Over the last few decades, financial markets have become increasingly sophisticated, and traders' intuition and experience may no longer be fast or accurate enough to compute the impact of market imperfections on their portfolios and to come up with good manual adjustments to their BS model hedges.
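
To make this baseline concrete, here is a minimal sketch (not code from this repo; the function names and example parameters are illustrative) of the closed-form Black-Scholes call price and its delta, the risk sensitivity a trader would use for hedging:

```python
# Minimal illustration of the Black-Scholes call price and delta.
# Not taken from this repo; function names and parameters are illustrative.
import numpy as np
from scipy.stats import norm

def bs_call_price(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def bs_call_delta(S, K, T, r, sigma):
    """Delta: the number of shares held to hedge one short call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    return norm.cdf(d1)

# Example: at-the-money call, one year to expiry, 20% volatility, zero rates.
print(bs_call_price(S=100.0, K=100.0, T=1.0, r=0.0, sigma=0.2))  # ~7.97
print(bs_call_delta(S=100.0, K=100.0, T=1.0, r=0.0, sigma=0.2))  # ~0.54
```

Rebalancing continuously to this delta eliminates all risk only in the frictionless BS world; once every rebalance incurs a transaction cost, the "perfect" hedge is no longer free, which is exactly the gap deep hedging targets.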

These limitations of the BS model are well known, but neither academics nor practitioners have managed to develop alternatives that properly and systematically account for market frictions – at least none successful enough to be widely adopted by banks. Could machine learning (ML) be the cure? In 2019, Risk magazine reported that JP Morgan had begun using machine learning (so-called deep hedging) to hedge a portion of its vanilla index options flow book and planned to roll out similar technology for single stocks, baskets and light exotics. According to Risk.net (2019), the technology can create hedging strategies that "automatically factor in market frictions, such as transaction costs, liquidity constraints and risk limits". More remarkably, the ML algorithm "far outperformed" hedging strategies derived from the BS model, and it could reduce the cost of hedging (in certain asset classes) by "as much as 80%". The technology has been heralded by some as "a breakthrough in quantitative finance, one that could mark the end of the Black-Scholes era." Hence, it is not surprising that firms such as Bank of America, Societe Generale and IBM are reportedly developing their own ML-based systems for derivatives hedging.

Machine learning algorithms are often referred to as "black boxes" because of their inherent opaqueness and the difficulty of inspecting how an algorithm accomplishes what it does. Buhler et al. (2019) recently published a paper outlining the mechanism behind this ground-breaking technology, and we follow their methodology to implement and replicate the "deep hedging" algorithm under different simulated market conditions. Given a distribution of the underlying assets and the trader's risk preferences, the "deep hedging" algorithm searches for the optimal hedging strategy (parameterized by over 10k model parameters) that minimizes the residual risk of a hedged portfolio. We implement the "deep hedging" algorithm to demonstrate its potential benefit in a simplified yet sufficiently realistic setting. We first benchmark the deep hedging strategy against the classic Black-Scholes hedging strategy in a perfect world with no transaction costs, in which case the performance of the two strategies should be similar. Then, we benchmark again in a world with market friction (i.e. non-zero transaction costs), in which case the deep hedging strategy should outperform the classic Black-Scholes hedging strategy.
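
To give a concrete picture of what "training a hedging strategy" means, below is a minimal, self-contained sketch of the deep-hedging idea in TensorFlow. It is not the implementation used in this demo: the path simulator, network size, transaction-cost level and the mean-plus-standard-deviation risk objective are simplifying assumptions (Buhler et al. (2019) work with more general convex risk measures), but it shows the core loop of mapping market states to hedge ratios and minimizing a risk measure of the terminal hedged PnL net of costs:

```python
# A minimal sketch of the deep-hedging idea, NOT the repo's implementation.
# Assumptions: geometric Brownian motion paths, a short call to hedge,
# proportional transaction costs, and a mean + std risk objective.
import numpy as np
import tensorflow as tf

# --- simulate risk-neutral GBM paths for the underlying ---
n_paths, n_steps = 10_000, 30
S0, sigma, T, K, cost = 100.0, 0.2, 30 / 365, 100.0, 0.001
dt = T / n_steps
rng = np.random.default_rng(0)
z = rng.standard_normal((n_paths, n_steps))
log_ret = -0.5 * sigma ** 2 * dt + sigma * np.sqrt(dt) * z
S = S0 * np.exp(np.cumsum(log_ret, axis=1))
S = np.concatenate([np.full((n_paths, 1), S0), S], axis=1).astype(np.float32)

# --- one small network shared across time steps:
#     (log-moneyness, time to maturity) -> hedge ratio in [0, 1] ---
hedge_net = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

def hedged_pnl(S_paths):
    """Terminal PnL of a short call hedged by the network, net of costs."""
    pnl = tf.zeros_like(S_paths[:, 0])
    prev_delta = tf.zeros_like(S_paths[:, 0])
    for t in range(n_steps):
        time_left = tf.ones_like(S_paths[:, t]) * ((n_steps - t) * dt)
        features = tf.stack([tf.math.log(S_paths[:, t] / K), time_left], axis=1)
        delta = tf.squeeze(hedge_net(features), axis=1)
        pnl += delta * (S_paths[:, t + 1] - S_paths[:, t])        # gain on the hedge
        pnl -= cost * tf.abs(delta - prev_delta) * S_paths[:, t]  # rebalancing cost
        prev_delta = delta
    pnl -= tf.maximum(S_paths[:, -1] - K, 0.0)                    # short call payoff
    return pnl

optimizer = tf.keras.optimizers.Adam(1e-3)
dataset = tf.data.Dataset.from_tensor_slices(S).shuffle(n_paths).batch(256)
for epoch in range(5):
    for batch_paths in dataset:
        with tf.GradientTape() as tape:
            pnl = hedged_pnl(batch_paths)
            # risk measure: penalize expected loss and its dispersion
            loss = -tf.reduce_mean(pnl) + 1.0 * tf.math.reduce_std(pnl)
        grads = tape.gradient(loss, hedge_net.trainable_variables)
        optimizer.apply_gradients(zip(grads, hedge_net.trainable_variables))
    print(f"epoch {epoch}: risk = {loss.numpy():.4f}")
```

In the frictionless case (cost = 0) the learned hedge should roughly recover the Black-Scholes delta; with cost > 0 it learns to rebalance less aggressively, which is where it can outperform the classic BS hedge.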

References:

Risk.net (2019). "Deep hedging and the end of the Black-Scholes era."

Buhler, H., et al. (2019). "Deep Hedging." Quantitative Finance, 19(8).

Owner: Yu Man Tam