CuPyTorch - A small framework that mimics PyTorch using CuPy or NumPy

Overview

CuPyTorch

CuPyTorch is a small PyTorch, and the name comes from two things:

  1. Unlike the existing open-source projects that reimplement PyTorch with NumPy, this project also supports CUDA computation through CuPy
  2. It sounds like "Cool PyTorch", because reimplementing PyTorch in fewer than 1000 lines of pure Python is indeed cool

CuPyTorch supports both NumPy and CuPy as compute backends, implements a large part of the commonly used PyTorch functionality, aims for 99% compatibility with PyTorch syntax and semantics, and is easy to extend. The completed features are listed below (a short tensor usage sketch follows the list):

  • tensor:

    • tensor: create a tensor
    • arange: evenly spaced values over an interval
    • stack: stack tensors along a new dimension
    • ones/zeros, ones/zeros_like: all-ones / all-zeros tensors
    • rand/randn, rand/randn_like: tensors sampled from the uniform distribution on [0, 1) / the Gaussian distribution
    • +, -, *, /, @, **: binary arithmetic operations, including their reflected (right-hand) and in-place variants
    • >, <, ==, >=, <=, !=: comparison operations
    • &, |, ^: binary logical operations
    • ~, -: inversion / negation
    • []: basic and fancy indexing, and slicing
    • abs, exp, log, sqrt: elementwise math functions
    • sum, mean: reductions
    • max/min, amax/amin, argmax/argmin: maxima/minima and their indices
  • autograd: automatic differentiation for all of the above operations that are not integer-only

  • nn:

    • Module: base class for models; manages parameters and supports formatted printing
    • activation: ReLU, GeLU, Sigmoid, Tanh, Softmax, LogSoftmax
    • loss: L1Loss, MSELoss, NLLLoss, CrossEntropyLoss
    • layer: Linear, Dropout, LSTM
  • optim:

    • Optimizer: base class for optimizers; manages parameters and supports formatted printing
    • SGD, Adam: the two most common optimizers
    • lr_scheduler: LambdaLR and StepLR learning-rate schedulers
  • utils.data:

    • DataLoader: batched iteration over Tensor data, with optional shuffling
    • Dataset: base dataset class, intended to be subclassed
    • TensorDataset: a dataset composed entirely of Tensors
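
A quick sketch of the tensor API listed above (not taken from the repository; it assumes these functions mirror their PyTorch namesakes, in line with the stated compatibility goal):

import cupytorch as ct

x = ct.tensor([[1., 2., 3.], [4., 5., 6.]])
y = ct.ones_like(x)             # all-ones tensor with the shape of x
z = ct.stack([x, y])            # stack along a new leading dimension
mask = x > 2                    # elementwise comparison
print(x[mask])                  # fancy (boolean) indexing
print((x + y).sum(), x.mean())  # reductions
print(x.argmax(1))              # per-row index of the maximum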

Code statistics from cloc:

Language     files     blank     comment     code
Python          22       353          27      992

Autograd example:

import cupytorch as ct

a = ct.tensor([[-1., 2], [-3., 4.]], requires_grad=True)
b = ct.tensor([[4., 3.], [2., 1.]], requires_grad=True)
c = ct.tensor([[1., 2.], [0., 2.]], requires_grad=True)
d = ct.tensor([1., -2.], requires_grad=True)
e = a @ b.T
f = (c.max(1)[0].exp() + e[:, 0] + b.pow(2) + 2 * d.reshape(2, 1).abs()).mean()
print(f)
f.backward()
print(a.grad)
print(b.grad)
print(c.grad)
print(d.grad)

# tensor(18.889057, grad_fn=<MeanBackward>)
# tensor([[2.  1.5]
#         [2.  1.5]])
# tensor([[0.  4.5]
#         [1.  0.5]])
# tensor([[0.       3.694528]
#         [0.       3.694528]])
# tensor([ 1. -1.])

Handwritten digit recognition example:

from pathlib import Path
import cupytorch as ct
from cupytorch import nn
from cupytorch.optim import SGD
from cupytorch.optim.lr_scheduler import StepLR
from cupytorch.utils.data import TensorDataset, DataLoader


class Net(nn.Module):
    
    def __init__(self, num_pixel: int, num_class: int):
        super().__init__()
        self.num_pixel = num_pixel
        self.fc1 = nn.Linear(num_pixel, 256)
        self.fc2 = nn.Linear(256, 64)
        self.fc3 = nn.Linear(64, num_class)
        self.act = nn.ReLU()
        self.drop = nn.Dropout(0.1)
    
    def forward(self, input: ct.Tensor) -> ct.Tensor:
        output = input.view(-1, self.num_pixel)
        output = self.drop(self.act(self.fc1(output)))
        output = self.drop(self.act(self.fc2(output)))
        return self.fc3(output)


def load(path: Path):
    # define how to load data as tensor
    pass


path = Path('../datasets/MNIST')
train_dl = DataLoader(TensorDataset(load(path / 'train-images-idx3-ubyte.gz'),
                                    load(path / 'train-labels-idx1-ubyte.gz')),
                      batch_size=20, shuffle=True)
test_dl = DataLoader(TensorDataset(load(path / 't10k-images-idx3-ubyte.gz'),
                                   load(path / 't10k-labels-idx1-ubyte.gz')),
                     batch_size=20, shuffle=False)
model = Net(28 * 28, 10)
criterion = nn.CrossEntropyLoss()
optimizer = SGD(model.parameters(), lr=1e-3, momentum=0.9)
scheduler = StepLR(optimizer, 5, 0.5)

print(model)
print(optimizer)
print(criterion)

for epoch in range(10):
    losses = 0
    for step, (x, y) in enumerate(train_dl, 1):
        optimizer.zero_grad()
        z = model(x)
        loss = criterion(z, y)
        loss.backward()
        optimizer.step()
        losses += loss.item()
        if step % 500 == 0:
            losses /= 500
            print(f'Epoch: {epoch}, Train Step: {step}, Train Loss: {losses:.6f}')
            losses = 0
    scheduler.step()
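
The test_dl built above is not exercised in this snippet; a minimal accuracy check might look like the sketch below (not part of the original example; it assumes argmax, ==, sum, item and shape behave as in PyTorch, and it leaves Dropout active because no train/eval toggle is listed among the implemented features):

correct = total = 0
for x, y in test_dl:
    pred = model(x).argmax(1)            # predicted class per sample
    correct += (pred == y).sum().item()  # count correct predictions
    total += y.shape[0]                  # assumes Tensor exposes .shape
print(f'Test Accuracy: {correct / total:.4f}')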

Two complete examples are provided in the examples folder:

  • MLP for handwritten-digit classification on the MNIST dataset
  • LSTM for ATM cash-withdrawal forecasting on the NN5 dataset
