A web-based application for quick, scalable, and automated hyperparameter tuning and stacked ensembling in Python.

Overview

Xcessiv

Xcessiv is a tool to help you create the biggest, craziest, and most excessive stacked ensembles you can think of.

Stacked ensembles are simple in theory. You combine the predictions of smaller models and feed those into another model. However, in practice, implementing them can be a major headache.

Xcessiv holds your hand through all the implementation details of creating and optimizing stacked ensembles so you're free to fully define only the things you care about.
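
To make the idea concrete, here is a minimal sketch of a stacked ensemble using plain scikit-learn. This is an illustration of the concept only, not Xcessiv's internal implementation; the dataset and learners are arbitrary choices.

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # Base learners produce predictions; a secondary (meta) learner is trained on them
    stack = StackingClassifier(
        estimators=[('rf', RandomForestClassifier(n_estimators=100, random_state=0)),
                    ('lr', LogisticRegression(max_iter=5000))],
        final_estimator=LogisticRegression(),  # the second-level model
        cv=5,  # meta-features come from out-of-fold predictions
    )
    print(cross_val_score(stack, X, y, cv=3).mean())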

The Xcessiv process

Define your base learners and performance metrics

[screenshot: define_base_learner]

Keep track of hundreds of different model-hyperparameter combinations

[screenshot: list_base_learner]

Effortlessly choose your base learners and create an ensemble with the click of a button

[screenshot: ensemble]

Features

  • Fully define your data source, cross-validation process, relevant metrics, and base learners with Python code
  • Any model following the Scikit-learn API can be used as a base learner (see the sketch after this list)
  • Task queue based architecture lets you take full advantage of multiple cores and embarrassingly parallel hyperparameter searches
  • Direct integration with TPOT for automated pipeline construction
  • Automated hyperparameter search through Bayesian optimization
  • Easy management and comparison of hundreds of different model-hyperparameter combinations
  • Automatic saving of generated secondary meta-features
  • Stacked ensemble creation in a few clicks
  • Automated ensemble construction through greedy forward model selection
  • Export your stacked ensemble as a standalone Python file to support multiple levels of stacking
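
As a rough illustration of the second point above, "follows the Scikit-learn API" essentially means an estimator class that implements fit and predict (plus get_params/set_params, which BaseEstimator supplies). Below is a hedged sketch of such a class; it is a toy example, not one of Xcessiv's presets.

    import numpy as np
    from sklearn.base import BaseEstimator, ClassifierMixin

    class MajorityClassClassifier(BaseEstimator, ClassifierMixin):
        """Toy estimator that always predicts the most frequent training class."""

        def fit(self, X, y):
            values, counts = np.unique(y, return_counts=True)
            self.majority_ = values[np.argmax(counts)]
            return self

        def predict(self, X):
            return np.full(len(X), self.majority_)

Anything shaped like this (including the scikit-learn wrappers around libraries such as XGBoost) can, in principle, be plugged in as a base learner.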

Installation and Documentation

You can find installation instructions and detailed documentation hosted here.

FAQ

Where does Xcessiv fit in the machine learning process?

Xcessiv fits into the model-building part of the process, after data preparation and feature engineering. At this point, there is no universally acknowledged way of determining which algorithm will work best for a particular dataset (see the No Free Lunch Theorem), and while heuristic optimization methods do exist, things often break down into trial and error as you try to find the best model-hyperparameter combinations.

Stacking is an almost surefire method to improve performance beyond that of any single model; however, the complexity of proper implementation often makes it impractical to apply outside of Kaggle competitions. Xcessiv aims to make the construction of stacked ensembles as painless as possible and to lower the barrier to entry.

I don't care about fancy stacked ensembles and whatnot. Should I still use Xcessiv?

Absolutely! Even without the ensembling functionality, being able to keep track of and compare the performance of hundreds or even thousands of model-hyperparameter combinations is a huge boon on its own.

How does Xcessiv generate meta-features for stacking?

You can choose whether to generate meta-features through cross-validation (stacked generalization) or with a holdout set (blending). You can read about these two methods and a lot more about stacked ensembles in the Kaggle Ensembling Guide. It's a great article and provides most of the inspiration for this project.
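
For intuition, here is a hedged sketch of the two approaches in plain scikit-learn. It is illustrative only; the estimator, dataset, and split size are arbitrary choices, and this is not Xcessiv's internal code.

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_predict, train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    base = RandomForestClassifier(n_estimators=100, random_state=0)

    # Stacked generalization: every training row gets an out-of-fold prediction,
    # so the meta-feature column covers the full training set.
    oof_meta = cross_val_predict(base, X, y, cv=5, method='predict_proba')[:, 1]

    # Blending: fit on one split, predict on a held-out split; only the holdout
    # rows get meta-features, but there is no fold bookkeeping.
    X_fit, X_hold, y_fit, y_hold = train_test_split(X, y, test_size=0.3, random_state=0)
    holdout_meta = base.fit(X_fit, y_fit).predict_proba(X_hold)[:, 1]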

Contributing

Xcessiv is in its very early stages and needs the open-source community to guide it along.

There are many ways to contribute to Xcessiv. You could report a bug, suggest a feature, submit a pull request, improve the documentation, and more.

If you would like to contribute something, please visit our Contributor Guidelines.

Project Status

Xcessiv is currently in alpha and is unstable. Future versions are not guaranteed to be backwards-compatible with current project files.

Comments
  • Can't Use

    Sorry for what is no doubt a stupid question:

    I've started Redis via redis-server. It says it's running on port 6379. Then I run xcessiv, but it takes me to a page that's not found: "The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again." Any idea what I can do? I'm really eager to use Xcessiv.

    opened by xnmp 6
  • Automated ensembling techniques

    After working with Xcessiv for a while, I feel there's a need for some way to automate the selection of base learners in an ensemble. I'm unaware of existing techniques for this, so if anyone has suggestions or could point me towards relevant literature, it would be greatly appreciated.

    enhancement 
    opened by reiinakano 5
  • Added more of the sklearn regressors to the presets

    Added the large majority of the more popular regressors of sklearn. I am aware that a few may be missing. Also, I tidied the code slightly and split the regressors and classifiers into two sections.

    opened by enisnazif 4
  • Memory management

    First of all, thanks! I find this project fascinating. My question/issue is about how you handle memory across multiple processes. By default, Python will create a copy of the data for each process. This is prohibitive for large datasets.

    How did you manage this problem?

    opened by alvarouc 3
  • Move .gitignore to project root and add Python ignores

    I think the best practice for .gitignore is to have a single .gitignore file at the root of the project, so I moved the .gitignore that was in xcessiv/ui (I think it was generated by create-react-scripts) to the project root and added some Python ignore lines.

    opened by menglewis 3
  • Added Leave One Out Crossvalidation to cvsetting.py

    Added Leave One Out Cross validation as part of #15

    I'm keen to finish implementing all of the cv / metrics within sklearn, just wanted to make sure I was doing it right since this is my first pull request!

    opened by enisnazif 2
  • XGBRegressor model stuck in queued status

    I tried to make a regression model to run on the Zillow data from Kaggle, available here: https://www.kaggle.com/c/zillow-prize-1/data. Here is a gist of my dataset extractor, the XGBRegressor setup, and the exception that was the last thing left in the console: https://gist.github.com/jef5ez/a9b0650293f343682a58b0f0500f3332. I selected the shuffle split for both cross-validation settings and added MSE as the learner metric. The base learner seems to verify fine on the Boston housing data. After hitting finalize and selecting a single base learner, a row shows up below but is stuck in the Queued status.

    Python 3.5.2, xcessiv 0.2.2, xgboost 0.6a2

    opened by jef5ez 2
  • The _BasePipeline in exported Python script should be _BaseComposition

    Since scikit-learn 0.19.x, the base class for Pipeline has changed to _BaseComposition (https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/pipeline.py). When using the generated code for training, this raises a name-not-found error on newer versions of sklearn. At the moment, an easy workaround is to manually change the two instances of the word in the generated script.

    opened by Mithrillion 1
  • Issues with TfidfVectorizer

    Hey, great tool.

    I have a problem, though, when I am trying to use a TfidfVectorizer for text classification. When I create a Single Base Learner I get the error:

    ValueError: all the input array dimensions except for the concatenation axis must match exactly.

    The type of the X variable is a numpy.ndarray, but if I don't convert the variable X to an array, then I get the error message:

    TypeError: Singleton array array(<92820x194 sparse matrix of type '<class 'numpy.float64'>' with 92820 stored elements in Compressed Sparse Row format>, dtype=object) cannot be considered a valid collection.

    I chose the preset learner setting scikit-learn Random Forest as the Base Learner Type.

    import os
    import numpy as np
    import pandas as pd
    import pickle
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.ensemble import RandomForestClassifier
    
    def extract_main_dataset():
        # pandas data frame with the columns Classification, FeatureVector
        # ie:
        # 0, 'This is the feature vector'
        # 1, 'This is another feature vector' 
        # 2, 'This is yet another feature vector' 
        # 1, 'This is the last feature vector example' 
        with open('feature_vector.pik', 'rb') as rf:
            feature_vector = pickle.load(rf)
    
        y = np.array(feature_vector.Classification.values)
        title_rf_vectorizer = TfidfVectorizer(ngram_range=(2, 9),
                                              sublinear_tf=True,
                                              use_idf=True,
                                              strip_accents='ascii')
    
        title_rf_classifier = RandomForestClassifier(n_estimators=100, n_jobs=8)
        X = title_rf_vectorizer.fit_transform(feature_vector["Classification"]).toarray()
        return X, y
    
    opened by bbowler86 1
  • Valid values for metric to optimise in bayesian optimisation?

    Is there a list of valid metric_to_optimise for Bayesian Optimisation?

    I am using sklearn mean_squared_regression for my base learner, but when I enter that into the Bayesian Optimisation menu under metric_to_optimise I get:

    assert module.metric_to_optimize in automated_run.base_learner_origin.metric_generators
    AssertionError
    
    question 
    opened by Data-drone 1
  • 'dict_keys' object does not support indexing

    On lines 306 and 309 of views.py, trying to index a dictionary keys object will fail on Python 3 and result in a server error. The fix is simple: change all occurrences of

    base_learner_origin.validation_results.keys()[0]

    to

    list(base_learner_origin.validation_results.keys())[0]

    opened by KhaledSharif 1
  • redis.exceptions.DataError at xcessiv launch

    Hello, when I try to launch xcessiv I get an error:

    Traceback (most recent call last):
      File "/PATH_TO/anaconda3/bin/xcessiv", line 10, in <module>
        sys.exit(main())
      File "/PATH_TO/anaconda3/lib/python3.7/site-packages/xcessiv/scripts/runapp.py", line 51, in main
        redis_conn.get(None)  # will throw exception if Redis is unavailable
      File "/PATH_TO/anaconda3/lib/python3.7/site-packages/redis/client.py", line 1264, in get
        return self.execute_command('GET', name)
      File "/PATH_TO/anaconda3/lib/python3.7/site-packages/redis/client.py", line 774, in execute_command
        connection.send_command(*args)
      File "/PATH_TO/anaconda3/lib/python3.7/site-packages/redis/connection.py", line 620, in send_command
        self.send_packed_command(self.pack_command(*args))
      File "/PATH_TO/anaconda3/lib/python3.7/site-packages/redis/connection.py", line 663, in pack_command
        for arg in imap(self.encoder.encode, args):
      File "/PATH_TO/anaconda3/lib/python3.7/site-packages/redis/connection.py", line 125, in encode
        "byte, string or number first." % typename)
    redis.exceptions.DataError: Invalid input of type: 'NoneType'. Convert to a byte, string or number first.

    Previously I had to change from gevent.wsgi import WSGIServer to from gevent.pywsgi import WSGIServer as indicated in this issue

    My server is responding when I do redis-cli ping

    I am on Ubuntu 18.04, with python 3.7.3 and redis 5.0.5

    Do you have an idea to fix this? Thanks!

    opened by AlexCoul 0
  • How to import homemade modules in Xcessiv?

    I'm trying to import a homemade module named preprocessing_115v (filename preprocessing_115v.py) into the main data extraction source code, but it can't seem to find it:

    #############
    import preprocessing_115v  # <-- where do I store the preprocessing_115v.py file for it to load here?

    def extract_main_dataset():
        import pandas as pd
        df = pd.read_csv('./data.csv', sep=',', header=None)
        X = df.values
        labels = pd.read_csv('./labelsnum.csv', sep=',', header=None)
        y = labels.values
        y = y[:, 0]
        return X, y
    ##############

    Amazing program by the way :-)

    opened by fcoppey 0
  • xcessiv server

    Hi, this project looks very cool, but I am having some problems with the setup. I am running this in a container (my own), and I can't get the server to show up. From inside the container I can see the server running: ps shows xcessiv running and curl localhost:1994 gives me some HTML from xcessiv. From outside the container, however, there's nothing.

    I suppose that's down to the server.py file which I have now changed to this:

    from __future__ import absolute_import, print_function, division, unicode_literals
    from gevent.wsgi import WSGIServer
    # import webbrowser


    def launch(app):
        http_server = WSGIServer(('0.0.0.0', app.config['XCESSIV_PORT']), app)
        # webbrowser.open_new('http://localhost:' + str(app.config['XCESSIV_PORT']))
        http_server.serve_forever()

    I have changed the WSGIServer setup to be open to outside connection (I suppose that's what I changed), but it's still not showing up.

    Feedback appreciated. I'd like to try this out. Thanks!

    opened by benman1 0
  • Feature Request - Backup .db file

    I got an error to the effect of "Error with JSON 'N' at position 8345", presumably caused by my manually editing the code for one of the base learners. Once I got this error, however, none of the base learners in my project would load. I resolved it by manually deleting the base learner I had been editing from the .db file. I'll post the specifics if I can recreate it, but I'm wondering if it might be prudent to have some kind of db backup / "Last Known Good Configuration"?

    opened by Tahlor 0
  • Fix issue #63 no module named wsgi

    In file server.py

    from gevent.wsgi import WSGIServer
    

    Has to be changed to:

    from gevent.pywsgi import WSGIServer
    

    http://www.gevent.org/api/gevent.pywsgi.html

    opened by KhaledTo 0
  • ImportError: No module named wsgi

    File "/usr/local/Cellar/python/2.7.14/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/xcessiv/server.py", line 2, in <module>
        from gevent.wsgi import WSGIServer
    ImportError: No module named wsgi

    opened by xialeizhou 2
Releases: v0.5.1

Owner: Reiichiro Nakano