Multiple-criteria decision-making (MCDM) with Electre, Promethee, Weighted Sum and Pareto

Overview


EasyMCDM - Quick Installation methods

Install with PyPI

Once you have created your Python environment (Python 3.6+) you can simply type:

pip3 install EasyMCDM

Install with GitHub

Once you have created your Python environment (Python 3.6+) you can simply type:

git clone https://github.com/qanastek/EasyMCDM.git
cd EasyMCDM
pip3 install -r requirements.txt
pip3 install --editable .

Any modification made to the EasyMCDM package will be taken into account automatically, since we installed it with the --editable flag.
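
To check that the editable install points at your local clone (a quick sanity check, not an official step), you can print the package location:

python3 -c "import EasyMCDM; print(EasyMCDM.__file__)"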

Setup with Anaconda

conda create --name EasyMCDM python=3.6 -y
conda activate EasyMCDM

More information on managing environments with Anaconda can be found in the conda cheat sheet.

Try It

Data in tests/data/donnees.csv:

alfa_156,23817,201,8,39.6,6,378,31.2
audi_a4,25771,195,5.7,35.8,7,440,33
cit_xantia,25496,195,7.9,37,2,480,34
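
Every model below accepts either the path to such a CSV file or a dictionary mapping each alternative to its list of criterion values. The dictionary form used in the examples can be built from the CSV with the standard csv module, for example (a small helper sketch, not part of the package):

import csv

# Build {alternative: [criterion values]} from the CSV shown above
with open('tests/data/donnees.csv', newline='') as f:
    data = {row[0]: [float(x) for x in row[1:]] for row in csv.reader(f) if row}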

Promethee

import pandas as pd
from EasyMCDM.models.Promethee import Promethee

data = pd.read_csv('tests/data/donnees.csv', header=None).to_numpy()
# or
data = {
  "alfa_156": [23817.0, 201.0, 8.0, 39.6, 6.0, 378.0, 31.2],
  "audi_a4": [25771.0, 195.0, 5.7, 35.8, 7.0, 440.0, 33.0],
  "cit_xantia": [25496.0, 195.0, 7.9, 37.0, 2.0, 480.0, 34.0]
}
weights = [0.14, 0.14, 0.14, 0.14, 0.14, 0.14, 0.14] # one weight per criterion
prefs = ["min", "max", "min", "min", "min", "max", "min"] # optimization direction of each criterion

p = Promethee(data=data, verbose=False)
res = p.solve(weights=weights, prefs=prefs)
print(res)

Output:

{
  'phi_negative': [('rnlt_safrane', 2.381), ('vw_passat', 2.9404), ('bmw_320d', 3.3603), ('saab_tid', 3.921), ('audi_a4', 4.34), ('cit_xantia', 4.48), ('rnlt_laguna', 5.04), ('alfa_156', 5.32), ('peugeot_406', 5.461), ('cit_xsara', 5.741)],
  'phi_positive': [('rnlt_safrane', 6.301), ('vw_passat', 5.462), ('bmw_320d', 5.18), ('saab_tid', 4.76), ('audi_a4', 4.0605), ('cit_xantia', 3.921), ('rnlt_laguna', 3.6406), ('alfa_156', 3.501), ('peugeot_406', 3.08), ('cit_xsara', 3.08)],
  'phi': [('rnlt_safrane', 3.92), ('vw_passat', 2.5214), ('bmw_320d', 1.8194), ('saab_tid', 0.839), ('audi_a4', -0.27936), ('cit_xantia', -0.5596), ('rnlt_laguna', -1.3995), ('alfa_156', -1.8194), ('peugeot_406', -2.381), ('cit_xsara', -2.661)],
  'matrix': '...'
}
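
The phi list in the sample output is sorted from best to worst net outranking flow, so, assuming the result keeps that ordering, the top-ranked alternative can be read directly:

best_name, best_phi = res['phi'][0]  # highest net flow, e.g. ('rnlt_safrane', 3.92)
print(best_name)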

Electre Iv / Is

from EasyMCDM.models.Electre import Electre

data = {
    "A1" : [80, 90,  600, 5.4,  8,  5],
    "A2" : [65, 58,  200, 9.7,  1,  1],
    "A3" : [83, 60,  400, 7.2,  4,  7],
    "A4" : [40, 80, 1000, 7.5,  7, 10],
    "A5" : [52, 72,  600, 2.0,  3,  8],
    "A6" : [94, 96,  700, 3.6,  5,  6],
}
weights = [0.1, 0.2, 0.2, 0.1, 0.2, 0.2]
prefs = ["min", "max", "min", "min", "min", "max"]
vetoes = [45, 29, 550, 6, 4.5, 4.5]
indifference_threshold = 0.6
preference_thresholds = [20, 10, 200, 4, 2, 2] # or None for Electre Iv

e = Electre(data=data, verbose=False)

results = e.solve(weights, prefs, vetoes, indifference_threshold, preference_thresholds)

Output:

{'kernels': ['A4', 'A5']}
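
As the comment above suggests, passing None instead of the preference thresholds runs the Electre Iv variant; the call is otherwise unchanged:

# Electre Iv: same data, no preference thresholds
results_iv = e.solve(weights, prefs, vetoes, indifference_threshold, None)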

Pareto

from EasyMCDM.models.Pareto import Pareto

data = 'tests/data/donnees.csv'
# or
data = {
  "alfa_156": [23817.0, 201.0, 8.0, 39.6, 6.0, 378.0, 31.2],
  "audi_a4": [25771.0, 195.0, 5.7, 35.8, 7.0, 440.0, 33.0],
  "cit_xantia": [25496.0, 195.0, 7.9, 37.0, 2.0, 480.0, 34.0]
}

p = Pareto(data=data, verbose=False)
res = p.solve(indexes=[0,1,6], prefs=["min","max","min"])
print(res)

Output:

{
  'alfa_156': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'audi_a4': {'Weakly-dominated-by': ['alfa_156'], 'Dominated-by': ['alfa_156']}, 
  'cit_xantia': {'Weakly-dominated-by': ['alfa_156', 'vw_passat'], 'Dominated-by': ['alfa_156']},
  'peugeot_406': {'Weakly-dominated-by': ['alfa_156', 'cit_xantia', 'rnlt_laguna', 'vw_passat'], 'Dominated-by': ['alfa_156', 'cit_xantia', 'rnlt_laguna', 'vw_passat']},
  'saab_tid': {'Weakly-dominated-by': ['alfa_156'], 'Dominated-by': ['alfa_156']}, 
  'rnlt_laguna': {'Weakly-dominated-by': ['vw_passat'], 'Dominated-by': ['vw_passat']}, 
  'vw_passat': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'bmw_320d': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'cit_xsara': {'Weakly-dominated-by': [], 'Dominated-by': []},
  'rnlt_safrane': {'Weakly-dominated-by': ['bmw_320d'], 'Dominated-by': ['bmw_320d']}
}
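
With this output structure, the Pareto-optimal alternatives (those dominated by nothing) can be collected in one line:

pareto_front = [name for name, doms in res.items() if not doms['Dominated-by']]
print(pareto_front)  # e.g. ['alfa_156', 'vw_passat', 'bmw_320d', 'cit_xsara']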

Weighted Sum

from EasyMCDM.models.WeightedSum import WeightedSum

data = 'tests/data/donnees.csv'
# or
data = {
  "alfa_156": [23817.0, 201.0, 8.0, 39.6, 6.0, 378.0, 31.2],
  "audi_a4": [25771.0, 195.0, 5.7, 35.8, 7.0, 440.0, 33.0],
  "cit_xantia": [25496.0, 195.0, 7.9, 37.0, 2.0, 480.0, 34.0]
}

p = WeightedSum(data=data, verbose=False)
res = p.solve(pref_indexes=[0,1,6], prefs=["min","max","min"], weights=[0.001,2,3], target='min')
print(res)

Output:

[(1, 'bmw_320d', -299.04), (2, 'alfa_156', -284.58299999999997), (3, 'rnlt_safrane', -280.84), (4, 'saab_tid', -275.817), (5, 'vw_passat', -265.856), (6, 'audi_a4', -265.229), (7, 'rnlt_laguna', -262.93600000000004), (8, 'cit_xantia', -262.504), (9, 'peugeot_406', -252.551), (10, 'cit_xsara', -244.416)]
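
The score appears to be a plain weighted sum over the selected criteria, with "max" criteria counted negatively when target='min'; for alfa_156 the sample value can be reproduced by hand:

# alfa_156: 0.001 * price - 2 * max_speed + 3 * criterion_6
score = 0.001 * 23817.0 - 2 * 201.0 + 3 * 31.2
print(score)  # -284.583, matching the sample ranking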

Instant-Runoff Multicriteria Optimization (IRMO)

Short description: eliminate the worst alternative on each criterion in turn, until the last criterion is reached, then select the best remaining alternative.

from EasyMCDM.models.Irmo import Irmo

p = Irmo(data="tests/data/donnees.csv", verbose=False)
res = p.solve(
    indexes=[0,1,4,5], # price -> max_speed -> comfort -> trunk_space
    prefs=["min","max","min","max"]
)
print(res)

Output:

{'best': 'saab_tid'}
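
The elimination logic described above can be sketched as follows (a simplified illustration of the idea, not the package's internal implementation):

def irmo_sketch(data, indexes, prefs):
    # data: {name: [criterion values]}
    # Drop the worst alternative on each criterion in turn,
    # then pick the best remaining one on the last criterion.
    remaining = dict(data)
    for idx, pref in zip(indexes[:-1], prefs[:-1]):
        if len(remaining) == 1:
            break
        worst = (max if pref == "min" else min)(remaining, key=lambda k: remaining[k][idx])
        del remaining[worst]
    last_idx = indexes[-1]
    pick = min if prefs[-1] == "min" else max
    return pick(remaining, key=lambda k: remaining[k][last_idx])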

List of methods available

- Promethee
- Electre Iv / Is
- Pareto
- Weighted Sum
- Instant-Runoff Multicriteria Optimization (IRMO)

Build PyPi package

Build: python setup.py sdist bdist_wheel

Upload: twine upload dist/*

Citation

If you want to cite the tool, you can use this:

@misc{EasyMCDM,
  title={EasyMCDM},
  author={Labrak, Yanis and Raymondaud, Quentin and Turcotte, Philippe},
  publisher={GitHub},
  journal={GitHub repository},
  howpublished={\url{https://github.com/qanastek/EasyMCDM}},
  year={2022}
}