Einshape: DSL-based reshaping library for JAX and other frameworks.

Overview

The jnp.einsum op provides a DSL-based unified interface to matmul and tensordot ops. This einshape library is designed to offer a similar DSL-based approach to unifying reshape, squeeze, expand_dims, and transpose operations.

Some examples:

  • einshape("n->n111", x) is equivalent to expand_dims(x, axis=1) three times
  • einshape("a1b11->ab", x) is equivalent to squeeze(x, axis=[1,3,4])
  • einshape("nhwc->nchw", x) is equivalent to transpose(x, perm=[0,3,1,2])
  • einshape("mnhwc->(mn)hwc", x) is equivalent to a reshape combining the two leading dimensions
  • einshape("(mn)hwc->mnhwc", x, n=batch_size) is equivalent to a reshape splitting the leading dimension into two, using kwargs (m or n or both) to supply the necessary additional shape information
  • einshape("mn...->(mn)...", x) combines the two leading dimensions without knowing the rank of x
  • einshape("n...->n(...)", x) performs a 'batch flatten'
  • einshape("ij->ijk", x, k=3) inserts a trailing dimension and tiles along it
  • einshape("ij->i(nj)", x, n=3) tiles along the second dimension

See jax_ops.py for the JAX implementation of the einshape function. Alternatively, the parser and engine are exposed in engine.py, allowing analogous implementations in TensorFlow or other frameworks.

Installation

Einshape can be installed with the following command:

pip3 install git+https://github.com/deepmind/einshape

Einshape works with either JAX or TensorFlow. To allow for that, it does not list either as a requirement, so you must install JAX or TensorFlow separately.

Usage

JAX version:

(ij)", a) # b is [1, 2, 3, 4] ">
from einshape import jax_einshape as einshape
from jax import numpy as jnp

a = jnp.array([[1, 2], [3, 4]])
b = einshape("ij->(ij)", a)
# b is [1, 2, 3, 4]

TensorFlow version:

(ij)", a) # b is [1, 2, 3, 4] ">
from einshape import tf_einshape as einshape
import tensorflow as tf

a = tf.constant([[1, 2], [3, 4]])
b = einshape("ij->(ij)", a)
# b is [1, 2, 3, 4]

Understanding einshape equations

An einshape equation is always of the form {lhs}->{rhs}, where {lhs} and {rhs} are both expressions. An expression represents the axes of an array; the relationship between the two expressions describes how the array should be transformed.

An expression is a non-empty sequence of the following elements:

Index name

A single letter a-z, representing one axis of an array.

For example, the expressions ab and jq both represent an array of rank 2.

Every index name that is present on the left-hand side of an equation must also be present on the right-hand side. So, ab->a is not a valid equation, but a->ba is valid (and will tile a vector b times).
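
For instance, a new index name on the right-hand side tiles the input along a new axis whose size is supplied as a keyword argument. A minimal sketch with the JAX binding (the values here are arbitrary):

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.array([1, 2, 3])
y = einshape("a->ba", x, b=2)
# y is [[1, 2, 3],
#       [1, 2, 3]]: x tiled b=2 times along a new leading axis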

Ellipsis

..., representing any axes of an array that are not otherwise represented in the expression. This is similar to the use of -1 as an axis in a reshape operation.

For example, a...b can represent any array of rank 2 or more: a will refer to the first axis and b to the last. The equation ...ab->...ba will swap the last two axes of an array.

An expression may not include more than one ellipsis (because that would be ambiguous). Like an index name, an ellipsis must be present in both halves of an equation or neither.
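
A small sketch of the axis-swapping example above, using the JAX binding (the shape is an arbitrary choice):

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.zeros((2, 3, 4, 5))
y = einshape("...ab->...ba", x)
# y.shape is (2, 3, 5, 4): the last two axes are swapped,
# regardless of the rank of x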

Group

({components}), where components is a sequence of index names and ellipsis elements. The entire group corresponds to a single axis of the array; the group's components represent factors of the axis size. This can be used to reshape an axis into many axes. All the factors except at most one must be specified using keyword arguments.

For example, einshape('(ab)->ab', x, a=10) reshapes an array of rank 1 (whose length must be a multiple of 10) into an array of rank 2 (whose first dimension is of length 10).

Groups may not be nested.
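
To make the keyword-argument rule concrete, here is a minimal sketch with the JAX binding (sizes are arbitrary) that splits an axis and then recombines it:

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.arange(30)                  # rank 1; its length must be a multiple of a=10
y = einshape("(ab)->ab", x, a=10)   # y.shape is (10, 3); b is inferred as 30 / 10
z = einshape("ab->(ab)", y)         # back to shape (30,); no kwargs needed here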

Unit

The digit 1, representing a single axis of length 1. This is useful for expanding and squeezing unit dimensions.

For example, the equation 1...->... squeezes a leading axis (which must have length one).
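
A brief sketch with the JAX binding (the shape is arbitrary), showing a unit axis being squeezed and another being inserted:

from einshape import jax_einshape as einshape
from jax import numpy as jnp

x = jnp.zeros((1, 4, 5))
y = einshape("1...->...", x)   # y.shape is (4, 5): the leading unit axis is squeezed
z = einshape("...->...1", y)   # z.shape is (4, 5, 1): a trailing unit axis is added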

Disclaimer

This is not an official Google product.
