Binance Prediction PyTorch Model

Overview

A PyTorch model for trading prediction on Binance market data (ETHUSDT and BTCUSDT klines), evaluated by ROI against a human baseline over 1d, 4h, and 1h intervals.

Main Results

ETHUSDT from 2021-01-01 00:00:00 to 2021-12-01 00:00:00

Time interval    Human ROI    Model ROI
1d               2.74%        125.05%
4h               36.86%       300.37%
1h               37.55%       393.66%

BTCUSDT from 2021-01-01 00:00:00 to 2021-12-01 00:00:00

Time interval    Human ROI    Model ROI
1d               3.11%        30.08%
4h               18.30%       30.67%
1h               19.79%       32.07%

Getting started

Environment

  • Test OS: Ubuntu 16.04 LTS
  • Python version: 3.8
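For isolation, a minimal environment sketch (venv is an assumption, not stated above; any Python 3.8 environment manager works):
# Optional: create and activate an isolated Python 3.8 environment
python3.8 -m venv .venv
source .venv/bin/activate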

Preparation

  • Create folders.
mkdir images
mkdir checkpoints
  • Run pip install -r requirements.txt to install the required libraries.

Dataset

Binance Public Data

  • Clone the Binance Public Data repo.
  • Follow its instructions to download the required data.
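A minimal sketch of the clone step; the GitHub location and the python/ folder holding download-kline.py are assumptions, not stated above:
# Assumed GitHub location of the Binance Public Data repo
git clone https://github.com/binance/binance-public-data.git
# download-kline.py is assumed to live in the repo's python/ folder
cd binance-public-data/python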
# ETHUSDT
python download-kline.py -s ETHUSDT -startDate 2017-08-01 -endDate 2021-12-01

# BTCUSDT
python download-kline.py -s BTCUSDT -startDate 2017-08-01 -endDate 2021-12-01
  • It will download the data into the layout below. Unzip the zip files under the 1h, 4h, and 1d directories (see the sketch after the tree).
binance_prediction_pytorch
    `-- binance-public-data
        `-- data
            `-- data
                `-- spot
                    |-- daily
                    `-- monthly
                        `-- klines
                            |-- ETHUSDT
                            `-- BTCUSDT
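The exact archive locations depend on the downloader, but assuming the zips sit under the interval folders (1d, 4h, 1h) inside the tree above, a recursive unzip could look like:
# From the repo root: extract every downloaded kline archive in place
# (the path below mirrors the tree above and is an assumption)
find binance-public-data/data/data/spot -name "*.zip" -execdir unzip -o {} \;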
  • Then soft-link the data directory into the repo root so it matches the layout below (see the sketch after the tree).
binance_prediction_pytorch
    |-- binance-public-data
    `-- data
        `-- spot
            |-- daily
            `-- monthly
                `-- klines
                    |-- ETHUSDT
                    `-- BTCUSDT
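A minimal sketch of the symlink, assuming the repo root is binance_prediction_pytorch and the nested data/data folder from the download is what gets linked as data:
# From the repo root: link the downloaded data so ./data/spot/... matches the layout above
ln -s binance-public-data/data/data data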

Experiments

Training

  • Run training and evaluation on ETHUSDT. Checkpoints are stored under the checkpoints directory, named by ticker and time interval, unless you specify a checkpoint path with --ckpt.
# 1d
./run.sh ETHUSDT 1d

# 4h
./run.sh ETHUSDT 4h --sell_rate 0.03

# 1h
./run.sh ETHUSDT 1h --sell_rate 0.03
  • Run training and evaluation on BTCUSDT.
# 1d
./run.sh BTCUSDT 1d

# 4h
./run.sh BTCUSDT 4h --sell_rate 0.03

# 1h
./run.sh BTCUSDT 1h --sell_rate 0.03

Inference

  • Specify the checkpoint path and pass --eval to run inference only.
./run.sh ETHUSDT 1h --sell_rate 0.03 --ckpt ${YOUR_CHECKPOINT_PATH} --eval