IIRNet

Direct design of biquad filter cascades with deep learning by sampling random polynomials.

Usage

git clone https://github.com/csteinmetz1/IIRNet.git
pip install .

Filter design

Start designing filters with just a few lines of code. In this example (demos/basic.py), we create a 32nd-order IIR filter to match an arbitrary magnitude response defined over a few points. Internally, this specification is interpolated to 512 points.

import torch
import numpy as np
import scipy.signal
import matplotlib.pyplot as plt
from iirnet.designer import Designer

# first load IIRNet with pre-trained weights
designer = Designer()

n = 32  # Desired filter order (4, 8, 16, 32, 64)
m = [0, -3, 0, 12, 0, -6, 0]  # Magnitude response specification
mode = "linear"  # interpolation mode for specification
output = "sos"  # Output type ("sos", or "ba")

# now call the designer with parameters
sos = designer(n, m, mode=mode, output=output)

# measure and plot the response
w, h = scipy.signal.sosfreqz(sos.numpy(), fs=2)

# interpolate the target for plotting
m_int = torch.tensor(m).view(1, 1, -1).float()
m_int = torch.nn.functional.interpolate(m_int, 512, mode=mode)

fig, ax = plt.subplots(figsize=(6, 3))
plt.plot(w, 20 * np.log10(np.abs(h)), label="Estimation")
plt.plot(w, m_int.view(-1), label="Specification")
# .... more plotting ....

See demos/basic.py for the full script.
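Once designed, the filter can be applied to audio directly with SciPy. The snippet below is a minimal sketch, assuming a mono NumPy signal; the white-noise input and 44.1 kHz rate are purely illustrative, and the magnitude specification maps from DC to the Nyquist frequency of whatever rate you filter at.

import numpy as np
import scipy.signal

# convert the designed SOS tensor to a NumPy array
sos_np = sos.numpy()

# illustrative input: one second of white noise at 44.1 kHz
fs = 44100
x = np.random.randn(fs).astype(np.float32)

# run the signal through the cascade of biquads
y = scipy.signal.sosfilt(sos_np, x)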

Training

We provide a set of shell scripts in configs/ that launch training jobs reproducing the experiments from the paper. Launch them from the top-level directory after installing.

./configs/train_hidden_dim.sh
./configs/filter_method.sh
./configs/filter_order.sh

Evaluation

Running the evaluation requires both the pre-trained models (or models you trained yourself) and the HRTF and guitar cabinet datasets. These datasets can be downloaded as follows:

First, change to the data directory and then run the download script.

cd data
./dl.sh

Note: you may need to install 7z if you don't already have it (e.g., brew install p7zip on macOS).

Next, download the pre-trained checkpoints if you haven't already.

mkdir logs
cd logs 
wget https://zenodo.org/record/5550275/files/filter_method.zip
wget https://zenodo.org/record/5550275/files/filter_order.zip
wget https://zenodo.org/record/5550275/files/hidden_dim.zip

unzip filter_method.zip
unzip filter_order.zip
unzip hidden_dim.zip

rm filter_method.zip
rm filter_order.zip
rm hidden_dim.zip
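If wget or unzip are unavailable, the same archives can be fetched and unpacked from Python using only the standard library. This is just a convenience sketch based on the Zenodo URLs above, not a script shipped with the repository.

import os
import zipfile
import urllib.request

base = "https://zenodo.org/record/5550275/files"
names = ["filter_method.zip", "filter_order.zip", "hidden_dim.zip"]

os.makedirs("logs", exist_ok=True)
for name in names:
    dst = os.path.join("logs", name)
    # download each archive, unpack it into logs/, then delete the zip
    urllib.request.urlretrieve(f"{base}/{name}", dst)
    with zipfile.ZipFile(dst) as zf:
        zf.extractall("logs")
    os.remove(dst)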

Now you can run the evaluation on checkpoints from the three different experiments as follows.

python eval.py logs/filter_method --yw --sgd --guitar_cab --hrtf --filter_order 16
python eval.py logs/hidden_dim --yw --sgd --guitar_cab --hrtf --filter_order 16

For the filter order experiment, we need to run the eval script across all models for every order.

python eval.py logs/filter_order --guitar_cab --hrtf --filter_order 4
python eval.py logs/filter_order --guitar_cab --hrtf --filter_order 8
python eval.py logs/filter_order --guitar_cab --hrtf --filter_order 16
python eval.py logs/filter_order --guitar_cab --hrtf --filter_order 32
python eval.py logs/filter_order --guitar_cab --hrtf --filter_order 64

Note: Requires PyTorch >=1.8

Filter methods

ID   Sampling method          Name
(A)  Normal coefficients      normal_poly
(B)  Normal biquads           normal_biquad
(C)  Uniform disk             uniform_disk
(D)  Uniform magnitude disk   uniform_mag_disk
(E)  Characteristic           char_poly
(F)  Uniform parametric       uniform_parametric
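For intuition about what these sampling schemes produce, here is a minimal, self-contained sketch in the spirit of method (C), uniform disk: conjugate pole and zero pairs are drawn uniformly from the unit disk and assembled into biquad sections. It is only an illustration of the idea, not the sampling code used by IIRNet.

import numpy as np

def sample_uniform_disk_sos(n_sections, max_radius=0.999, seed=None):
    """Random biquad cascade with poles/zeros uniform on the unit disk (illustrative)."""
    rng = np.random.default_rng(seed)
    sos = np.zeros((n_sections, 6))
    for i in range(n_sections):
        # uniform density over a disk: radius ~ sqrt(U), angle ~ U(0, pi)
        rz, rp = max_radius * np.sqrt(rng.uniform(size=2))
        tz, tp = rng.uniform(0.0, np.pi, size=2)
        # a conjugate pair (r, theta) gives the real quadratic 1 - 2 r cos(theta) z^-1 + r^2 z^-2
        sos[i, :3] = [1.0, -2.0 * rz * np.cos(tz), rz ** 2]  # zeros (numerator)
        sos[i, 3:] = [1.0, -2.0 * rp * np.cos(tp), rp ** 2]  # poles (denominator)
    return sos

random_sos = sample_uniform_disk_sos(8)  # one random 16th-order filter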

Citation

@article{colonel2021iirnet,
  title={Direct design of biquad filter cascades with deep learning by sampling random polynomials},
  author={Colonel, Joseph and Steinmetz, Christian J. and Michelen, Marcus and Reiss, Joshua D.},
  journal={arXiv preprint arXiv:2110.03691},
  year={2021}
}
Owner
Christian J. Steinmetz
Building tools for musicians and audio engineers (often with machine learning). PhD Student at Queen Mary University of London.