

Spectrum

Spectrum is an AI that uses deep learning to generate rap song lyrics.

View Demo
Report Bug
Request Feature
Open In Colab

About The Project

Spectrum is an AI that uses deep learning to generate rap song lyrics.

Built With

This project is built using Python, TensorFlow, and Flask.

Getting Started

Installation

# clone the repo
git clone https://github.com/YigitGunduc/Spectrum.git

# install requirements
pip install -r requirements.txt

Training

# navigate to the Spectrum/AI folder 
cd Spectrum/AI

# list all available arguments
python3 train.py --help

# or pass training arguments directly
python3 train.py --epochs EPOCHS --save_at SAVE_AT --verbose VERBOSE \
                 --rnn_neurons RNN_NEURONS --embed_dim EMBED_DIM \
                 --dropout DROPOUT --num_layers NUM_LAYERS \
                 --learning_rate LEARNING_RATE

All the arguments are optional; any you omit fall back to the model's default parameters.
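For example, a run that overrides only a couple of settings might look like this (the values are illustrative, not tuned recommendations):

# train for 30 epochs with 256 RNN units, keeping the other defaults
python3 train.py --epochs 30 --rnn_neurons 256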

Generating Text from Trained Model

Call eval.py from the command line with your seed text as an argument:

python3 eval.py --seed SEEDTEXT

or

from model import Generator

model = Generator()
model.load_weights('../models/model-5-epochs-256-neurons.h5')

generated_text = model.predict(start_seed=SEED, gen_size=1000)
print(generated_text)
  • If you changed the model's hyperparameters during training, initialize the Generator with those same parameters before loading the weights, as sketched below.
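A minimal sketch, assuming the Generator constructor accepts the same keyword names as train.py's flags (the actual signature in model.py may differ; the parameter values and weights path here are hypothetical):

from model import Generator

# hypothetical values: use whatever you passed to train.py
model = Generator(rnn_neurons=512, embed_dim=128, dropout=0.3, num_layers=2)
model.load_weights('../models/my-custom-model.h5')  # hypothetical path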

Running the Web-App Locally

# navigate to the Spectrum folder 
cd Spectrum

# run app.py
python3 app.py

# check out http://0.0.0.0:8080

API

Spectrum has a free web API; you can send requests to it as shown below.

import requests 

response = requests.get("https://spectrumapp.herokuapp.com/api/generate/SEEDTEXT")
# raw JSON response
print(response.json())
# just the generated lyrics
print(response.json()["lyrics"])
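The same endpoint also works from the command line:

# replace SEEDTEXT with your own seed word
curl https://spectrumapp.herokuapp.com/api/generate/SEEDTEXT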

Hyperparameters

epochs = 30 
batch size = 128
number of layers = 2 (hidden) + 1 (output)
number of RNN units = 256
dropout prob = 0.3
embedding dimensions = 64
optimizer = Adam
loss = sparse categorical crossentropy

These hyperparameters are the best I have found, but be careful when changing them: this model can overfit or underfit quite easily. In my experiments, GRUs perform better than LSTMs.

About The Model

>>> from model import Generator
>>> model = Generator()
>>> model.load_weights('../models/model-5-epochs-256-neurons.h5')
>>> model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (1, None, 64)             6400      
_________________________________________________________________
gru (GRU)                    (1, None, 256)            247296    
_________________________________________________________________
gru_1 (GRU)                  (1, None, 256)            394752    
_________________________________________________________________
dense (Dense)                (1, None, 100)            25700     
=================================================================
Total params: 674,148
Trainable params: 674,148
Non-trainable params: 0
_________________________________________________________________

>>> model.hyperparams()
Hyper Parameters
+--------------------------+
|rnn_neurons   |        256|
|embed_dim     |         64|
|learning_rate |     0.0001|
|dropout       |        0.3|
|num_layers    |          2|
+--------------------------+
>>>
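For reference, the summary above is consistent with a model along the following lines. This is a minimal TensorFlow sketch, not the project's actual model.py; the stateful batch size of 1 and the from_logits loss are assumptions read off the summary:

import tensorflow as tf

vocab_size = 100    # matches the Dense output shape in the summary above
embed_dim = 64
rnn_neurons = 256
dropout = 0.3

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim,
                              batch_input_shape=[1, None]),
    tf.keras.layers.GRU(rnn_neurons, return_sequences=True,
                        stateful=True, dropout=dropout),
    tf.keras.layers.GRU(rnn_neurons, return_sequences=True,
                        stateful=True, dropout=dropout),
    tf.keras.layers.Dense(vocab_size),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)

The parameter counts of this sketch match the summary exactly: 6,400 + 247,296 + 394,752 + 25,700 = 674,148.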

Roadmap

See the open issues for a list of proposed features (and known issues).

Results

WARNING: The text below contains offensive language; stop reading here if you would rather avoid it. Everything that follows was generated by Spectrum.

Seed : today

Prediction : 

If that don't, yeah
Weint off the music
It's like a fired-enother foar fool straight for an exactly
Nigga why I id my Door Merican muthafucka

Ng answered by need for blazy hard
The family wish fans dishes rolled up
How better just wanna die
Match all about the moment in I glory
Fire is that attention is the flop and pipe those peokin' distriors
Bitch I been hard and I'm like the Scales me and we're going to school like all-off of the allegit to get the bitches
Yeah kinda too legit back into highin'
A year have it would plobably want

And we all bustin' the conscious in the cusfuckers won't ha
Quite warkie and it's blow, and what? I cannot love him,
Alugal Superman, and the revolution likes migh
I ain't still not I uest the neighborhoo
Powers all too bad show, you crite your bac
When I say way too fathom
If you wanna revell, money, where your face we'll blin
Pulf me very, yo, they pull out for taught nothin' off
I pass a with a nigga hang some, pleas
Fuck me now, it's a

======================================================================
Seed : hello

Prediction : 

hellow motherfucker
You wanna talk on the pockets on Harlotom
I'm legit some more than Volumon
Ridicalab knowledge is blessin' some of your honierby man
We just bust the Flud joke with shoulders on the Statue
Lecock it on everybody want your dices to speak
While she speak cents look back to Pops
He was a nigga when I got behind pictures any Lil Sanvanas
Used to in her lady yaught they never had a bitch
He'll break the jird little rappers kill your children is

I'm prayin' back to ready for that bitch just finished And mised to the gamr
Every eyes on and about that getting common
I'm going to attractived with its
I just went by the crowd get the promise to buy the money-a star big down
Can one sall 'em in me tryna get them days that's how I can break the top
Well, that's hug her hands he screaming like a fucking hip-hop but put a Blidze like rhymin'
Yeah I slack like a Job let your cops got a generres
These West of it today flamping this
Black Kuttle crib, said "Ju Conlie, hold up, fuck the

======================================================================
Seed : bestfriend

Prediction : 

bestfriend
Too much time we tonight
The way I know is a please have no self-back when I be for the fucking weed and a game
What the fuck we wanna be working on the streets make it like a stay down the world is from the head of the real brain
Chain don't come back to the grass
My dick is the one to tell you I'm the fuck
So see me we gon' be fans when you had to hear the window you come to the dick when a little cooleng and I was calling what the fuck is it good as the crown
And I'm representing you finally waitin' in your girl
This is the corner with my brother
I'm just a damn door and the real motherfuckers come got the point my shit is the money on the world

I get it then the conscious that's why I cripp
I might take my own shit so let me have a bad bitch
I'm just had and make the fuck is in the single of the window
I think I ain't got the world is all my gone be mine
They ain't like the half the best between my words
And I'm changing with the heads of the speech
Fuck a bunch of best of a fuck

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.
