An implementation of Deep Forest 2021.2.1.

Overview

Deep Forest (DF) 21


DF21 is an implementation of Deep Forest 2021.2.1. It is designed to have the following advantages:

  • Powerful: Better accuracy than existing tree-based ensemble methods.
  • Easy to Use: Less effort on tuning parameters.
  • Efficient: Fast training speed and high efficiency.
  • Scalable: Capable of handling large-scale data.

For anyone who has used tree-based machine learning approaches such as Random Forest or GBDT, DF21 may offer a powerful new option.

For a quick start, please refer to How to Get Started. For detailed guidance on parameter tuning, please refer to Parameters Tuning.
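
As a taste of what that guide covers, below is a minimal sketch that sets a few commonly tuned constructor parameters (the parameter names follow the DF21 API; the specific values are illustrative assumptions, not recommendations):

from deepforest import CascadeForestClassifier

# illustrative settings only; see Parameters Tuning for detailed guidance
model = CascadeForestClassifier(
    n_bins=255,        # number of bins used when discretizing features
    n_estimators=2,    # number of forests in each cascade layer
    n_trees=100,       # number of trees in each forest
    max_layers=20,     # upper bound on the number of cascade layers
    n_jobs=-1,         # use all available cores
    random_state=1,
)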

Installation

The package is available on PyPI and can be installed with:

pip install deep-forest

Quickstart

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

from deepforest import CascadeForestClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = CascadeForestClassifier(random_state=1)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
acc = accuracy_score(y_test, y_pred) * 100
print("\nTesting Accuracy: {:.3f} %".format(acc))
>>> Testing Accuracy: 98.667 %

Resources

Reference

@article{zhou2019deep,
    title={Deep forest},
    author={Zhi-Hua Zhou and Ji Feng},
    journal={National Science Review},
    volume={6},
    number={1},
    pages={74--86},
    year={2019}}

@inproceedings{zhou2017deep,
    title={{Deep Forest:} Towards an alternative to deep neural networks},
    author={Zhi-Hua Zhou and Ji Feng},
    booktitle={IJCAI},
    pages={3553--3559},
    year={2017}}

Acknowledgement

The lead developer and maintainer of DF21 is Mr. Yi-Xuan Xu. Before its release, DF21 was used internally within the LAMDA Group, Nanjing University, China.

Comments
  • Custom CascadeForestClassifier

    Hey,

    Thanks for your awesome repo.

    I have a question: if you don't mind, could you please give an example of how to replace the RandomForestClassifier and ExtraTreesClassifier used inside CascadeForestClassifier?
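
    One possible sketch, using the set_estimator API described in the "[ENH] Support customized base estimator and predictor" entry further down this page (the scikit-learn forests here are only an illustrative choice):

    from sklearn.datasets import load_digits
    from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
    from sklearn.model_selection import train_test_split

    from deepforest import CascadeForestClassifier

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

    model = CascadeForestClassifier(random_state=1)

    # custom base estimators used in every cascade layer
    estimators = [
        RandomForestClassifier(n_estimators=100, random_state=1),
        ExtraTreesClassifier(n_estimators=100, random_state=2),
    ]
    model.set_estimator(estimators)

    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)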

    opened by Maryom 31
  • Starting the interpretability of the Deep Forest using SHAP

    Hey,

    This is an initial implementation; however, I'm not sure it will work, since I see that we get the following error:

    AttributeError: 'CascadeForestClassifier' object has no attribute 'estimators_'

    What do you think @xuyxu ?

    opened by Maryom 25
  • Error: could not allocate 0 bytes

    When I was using this package, I ran into the following problem. As far as I can tell, there is still plenty of available memory, so what is the problem?

      File "deepforest/tree/_tree.pyx", line 123, in deepforest.tree._tree.DepthFirstTreeBuilder.build
      File "deepforest/tree/_tree.pyx", line 256, in deepforest.tree._tree.DepthFirstTreeBuilder.build
      File "deepforest/tree/_tree.pyx", line 480, in deepforest.tree._tree.Tree._resize_node_c
      File "deepforest/tree/_utils.pyx", line 34, in deepforest.tree._utils.safe_realloc
    MemoryError: could not allocate 0 bytes
    
    bug 
    opened by hengzhe-zhang 23
  • Add support for pandas.DataFrame and list in `fit`

    Currently, the fit method only supports np.array input. However, most ML algorithms with a scikit-learn-compatible API (e.g. XGBoost, NGBoost) accept a DataFrame, list, or numpy array of predictors (n x p) in numeric format by using sklearn.utils.check_array. This PR makes Deep-Forest consistent with those algorithms so that it is easier to use in integrated machine learning frameworks (e.g. PyCaret).
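
    In the meantime, a minimal workaround sketch (the DataFrame below is a hypothetical stand-in; check_array comes from scikit-learn, not from deep-forest):

    import numpy as np
    import pandas as pd
    from sklearn.utils import check_array

    from deepforest import CascadeForestClassifier

    # hypothetical DataFrame input
    X_df = pd.DataFrame(np.random.rand(100, 4), columns=["f1", "f2", "f3", "f4"])
    y = np.random.randint(0, 2, size=100)

    # convert to a plain numeric numpy array before handing the data to DF21
    X = check_array(X_df)

    model = CascadeForestClassifier()
    model.fit(X, y)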

    opened by IncubatorShokuhou 13
  • [BUG] `CascadeForestRegressor` somehow cannot be inserted into a DataFrame

    Describe the bug: CascadeForestRegressor somehow cannot be inserted into a DataFrame.

    To Reproduce

    import pandas as pd
    from deepforest import CascadeForestRegressor
    from ngboost import NGBRegressor
    
    ngr = NGBRegressor()  # ngboost regressor for example. xgb, lgb should also be no problem.
    cforest = CascadeForestRegressor()
    df= pd.DataFrame()
    
    # somehow OK
    df.insert(0, "ngr", [ngr])
    # somehow error
    df.insert(0, "cf", [cforest])
    

    Expected behavior No error

    Additional context

    ValueError                                Traceback (most recent call last)
    <ipython-input-32-ab0139d10254> in <module>
    ----> 1 df.insert(0, "cf", [cforest])
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/frame.py in insert(self, loc, column, value, allow_duplicates)
       3760             )
       3761         self._ensure_valid_index(value)
    -> 3762         value = self._sanitize_column(column, value, broadcast=False)
       3763         self._mgr.insert(loc, column, value, allow_duplicates=allow_duplicates)
       3764 
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/frame.py in _sanitize_column(self, key, value, broadcast)
       3900             if not isinstance(value, (np.ndarray, Index)):
       3901                 if isinstance(value, list) and len(value) > 0:
    -> 3902                     value = maybe_convert_platform(value)
       3903                 else:
       3904                     value = com.asarray_tuplesafe(value)
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/dtypes/cast.py in maybe_convert_platform(values)
        110     """ try to do platform conversion, allow ndarray or list here """
        111     if isinstance(values, (list, tuple, range)):
    --> 112         values = construct_1d_object_array_from_listlike(values)
        113     if getattr(values, "dtype", None) == np.object_:
        114         if hasattr(values, "_values"):
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/dtypes/cast.py in construct_1d_object_array_from_listlike(values)
       1636     # making a 1D array that contains list-likes is a bit tricky:
       1637     result = np.empty(len(values), dtype="object")
    -> 1638     result[:] = values
       1639     return result
       1640 
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/deepforest/cascade.py in __getitem__(self, index)
        518 
        519     def __getitem__(self, index):
    --> 520         return self._get_layer(index)
        521 
        522     def _get_n_output(self, y):
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/deepforest/cascade.py in _get_layer(self, layer_idx)
        561             logger.debug("self.n_layers_ = "+ str(self.n_layers_))
        562             logger.debug("layer_idx = "+ str(layer_idx))
    --> 563             raise ValueError(msg.format(self.n_layers_ - 1, layer_idx))
        564 
        565         layer_key = "layer_{}".format(layer_idx)
    
    ValueError: The layer index should be in the range [0, 1], but got 2 instead.
    

    This bug can be simply fixed if we change `if not 0 <= layer_idx < self.n_layers_:` to `if not 0 <= layer_idx <= self.n_layers_:`, but I still don't know the cause of this error and whether this fix is correct.
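
    A sketch of the change proposed above, applied to _get_layer in deepforest/cascade.py (surrounding code abridged from the traceback; the attribute used for the final lookup is an assumption, and whether the relaxed check is the right fix remains open):

    def _get_layer(self, layer_idx):
        msg = "The layer index should be in the range [0, {}], but got {} instead."
        # proposed relaxation: "<=" instead of "<"
        if not 0 <= layer_idx <= self.n_layers_:
            raise ValueError(msg.format(self.n_layers_ - 1, layer_idx))
        layer_key = "layer_{}".format(layer_idx)
        return self.layers_[layer_key]  # assumed internal container name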

    needtriage 
    opened by IncubatorShokuhou 10
  • [Question] use custom estimator to tackle imbalanced datasets

    Hi All,

    As I mentioned in another post, I want to express my gratitude for your amazing research. I am delighted you found time to add custom estimators to your library. However, I am having difficulty with the following:

    Assume I develop the following implementation (using imblearn) and obtain an AUROC score of 0.62:

    model = BalancedRandomForestClassifier(random_state=global_seed_random_state,
                                           class_weight="balanced_subsample",
                                           n_jobs=-1,
                                           replacement=True,
                                           )
    
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    show_output(model, X_test, y_test, y_pred)
    
    
    Classification_report:
                  precision    recall  f1-score   support
    
               0       0.97      0.61      0.75       499
               1       0.08      0.63      0.14        27
    
        accuracy                           0.61       526
       macro avg       0.52      0.62      0.45       526
    weighted avg       0.92      0.61      0.72       526
    
    ROC AUC Score:
    0.6204260372597047
    

    According to the reviews I've been reading of your original paper, if we get good results with RF and similar classifiers, it is worthwhile to try Deep Forest with the classifier that worked well as the base learner. However, I attempted to use the custom estimators via the following implementation:

    model = CascadeForestClassifier(
        random_state=global_seed_random_state,
    )
    
    main_estimators = [BalancedRandomForestClassifier(
        class_weight="balanced_subsample",
        n_jobs=-1,
        replacement=True,
        random_state=global_seed_random_state,
    ) for _ in range(2)]
    
    
    diverse_estimators = [BalancedRandomForestClassifier(
        class_weight="balanced_subsample",
        n_jobs=-1,
        replacement=True,
        random_state=global_seed_random_state,
    ) for _ in range(2)]
    
    estimators = main_estimators + diverse_estimators
    
    # layer
    model.set_estimator(estimators)
    

    The results, however, are about 10% worse, with an AUROC of 0.555. Note: diverse_estimators appears above because I also tried ExtraTrees or XGBoost instead of a second set of BalancedRandomForestClassifiers. Could you please point me in the right direction? What did I do incorrectly? From your perspective, what type of diversified classifier should I use? Note 2: an AUROC above 0.6 is quite promising for my current application.

    Thank you very much in advance for your help. Have a great day,

    opened by simonprovost 8
  • Survival models

    Hi maintainer,

    I am wondering whether it is possible to cascade a random survival forest (maybe a sksurv model) instead of RF in your deep forest model. As in #48, it seems that the supported model types are classification and regression (or did I miss some part of the tutorial docs?).

    Thanks.

    feature request 
    opened by yunwezhang 8
  • [ENH] Support customized base estimator and predictor

    resolves #29 #26

    Steps

    • [x] Implement K-Fold wrapper for base estimators
    • [x] Implement customized cascade layer
    • [x] Implement set_estimator and set_predictor for the model
    • [x] Add unit tests
    • [x] Add backward compatibility
    • [x] Add documentation and working examples

    Code Snippet

    from deepforest import CascadeForestClassifier
    
    model = CascadeForestClassifier()
    
    # New Steps
    estimator_1, estimator_2 = your_estimator(), your_estimator()
    model.set_estimator(estimator=[estimator_1, estimator_2],  # a list of your base estimators
                        n_splits=5,  # the number of folds
                        oob_approx=False,  # whether to use out-of-bag approximation
                        random_state=None)  # random state used for base estimators
    
    model.set_predictor(predictor=your_predictor)  # an instantiated object of your predictor
    
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    
    feature request 
    opened by xuyxu 8
  • Label encoder for the case where y is 1-D.

    Resolved issue #13

    This is a very naive label encoder implemented with sklearn.preprocessing.LabelEncoder

    • [x] single output (1-D) partial mode
    • [x] single output (1-D) full mode
    • [x] unit test
    opened by NiMaZi 8
  • can't install package use conda env

    ERROR: Could not find a version that satisfies the requirement deep-forest (from versions: none)
    ERROR: No matching distribution found for deep-forest

    System: macOS. Python version: 3.8.5. pip version: 20.2.4.

    opened by morestart 7
  • Buffer dtype mismatch

    An error occurs when training on my dataset:

      File "deepforest/_cutils.pyx", line 59, in deepforest._cutils._map_to_bins
      File "deepforest/_cutils.pyx", line 76, in deepforest._cutils._map_to_bins
    ValueError: Buffer dtype mismatch, expected 'const X_DTYPE_C' but got 'long'
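
    A possible workaround sketch (an assumption based on the error message, not a confirmed fix): the binner appears to expect floating-point features, so casting integer-typed inputs to float64 before fitting may avoid the dtype mismatch.

    import numpy as np
    from deepforest import CascadeForestClassifier

    # integer-typed features that would otherwise hit the dtype mismatch (hypothetical data)
    X = np.random.randint(0, 100, size=(200, 5))
    y = np.random.randint(0, 2, size=200)

    model = CascadeForestClassifier()
    model.fit(X.astype(np.float64), y)  # cast the features before fitting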

    bug 
    opened by Mr-memorandum 7
  • pip install deep-forest didn't work in wsl2

    I was trying to install the package under WSL2, but the terminal raises an error:

    ERROR: Could not find a version that satisfies the requirement deep-forest (from versions: none)
    ERROR: No matching distribution found for deep-forest

    I can't find any related articles or even a Stack Overflow post that solves this; please help me.

    opened by romfahrury 4
  • How to apply shap model to DF model to interpret features?

    How can SHAP be applied to a DF model to interpret features? I tried to apply it directly, but it reported that the model is not supported by the SHAP package (https://github.com/slundberg/shap). I suggest that the author improve the interpretability of the DF model. Thanks.
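
    Since CascadeForestClassifier is not a tree model that SHAP's TreeExplainer recognizes, one hedged workaround sketch is the model-agnostic KernelExplainer driven by predict_proba (slow on large data, and only an assumption about what is needed here):

    import shap
    from sklearn.datasets import load_digits

    from deepforest import CascadeForestClassifier

    X, y = load_digits(return_X_y=True)
    model = CascadeForestClassifier(random_state=1)
    model.fit(X, y)

    # a small background sample keeps KernelExplainer tractable
    background = shap.sample(X, 50)
    explainer = shap.KernelExplainer(model.predict_proba, background)
    shap_values = explainer.shap_values(X[:10])  # explain the first 10 samples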

    opened by Leopoldxxx 2
  • importing error

    Got this error when importing:

    ImportError                               Traceback (most recent call last)
    Input In [59], in <cell line: 24>()
         22 import time
         23 import io
    ---> 24 from deepforest import CascadeForestRegressor
         25 import joblib
         26 from sklearn.utils.fixes import joblib

    File ~\anaconda3\lib\site-packages\deepforest\__init__.py:1, in
    ----> 1 from .cascade import CascadeForestClassifier, CascadeForestRegressor
          2 from .forest import RandomForestClassifier, RandomForestRegressor
          3 from .forest import ExtraTreesClassifier, ExtraTreesRegressor

    File ~\anaconda3\lib\site-packages\deepforest\cascade.py:17, in
         15 from . import _utils
         16 from . import _io
    ---> 17 from ._layer import (
         18     ClassificationCascadeLayer,
         19     RegressionCascadeLayer,
         20     CustomCascadeLayer,
         21 )
         22 from ._binner import Binner
         25 def _get_predictor_kwargs(predictor_kwargs, **kwargs) -> dict:

    File ~\anaconda3\lib\site-packages\deepforest\_layer.py:17, in
         14 from sklearn.base import BaseEstimator, ClassifierMixin, RegressorMixin
         16 from . import _utils
    ---> 17 from ._estimator import Estimator
         18 from .utils.kfoldwrapper import KFoldWrapper
         21 def _build_estimator(
         22     X,
         23     y,
        (...)
         32     sample_weight=None,
         33 ):

    File ~\anaconda3\lib\site-packages\deepforest\_estimator.py:7, in
          4 __all__ = ["Estimator"]
          6 import numpy as np
    ----> 7 from .forest import (
          8     RandomForestClassifier,
          9     ExtraTreesClassifier,
         10     RandomForestRegressor,
         11     ExtraTreesRegressor,
         12 )
         13 from sklearn.ensemble import (
         14     RandomForestClassifier as sklearn_RandomForestClassifier,
         15     ExtraTreesClassifier as sklearn_ExtraTreesClassifier,
         16     RandomForestRegressor as sklearn_RandomForestRegressor,
         17     ExtraTreesRegressor as sklearn_ExtraTreesRegressor,
         18 )
         21 def make_classifier_estimator(
         22     name,
         23     criterion,
        (...)
         30 ):
         31     # RandomForestClassifier

    File ~\anaconda3\lib\site-packages\deepforest\forest.py:34, in
         32 from sklearn.utils import check_random_state, compute_sample_weight
         33 from sklearn.exceptions import DataConversionWarning
    ---> 34 from sklearn.utils.fixes import _joblib_parallel_args
         35 from sklearn.utils.validation import check_is_fitted, _check_sample_weight
         36 from sklearn.utils.validation import _deprecate_positional_args

    ImportError: cannot import name '_joblib_parallel_args' from 'sklearn.utils.fixes' (C:\Users\Mohammad\anaconda3\lib\site-packages\sklearn\utils\fixes.py)

    scikit-learn was upgraded, joblib was upgraded, and I still get the error.
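
    A hedged workaround sketch (assumptions: the failure comes from a newer scikit-learn release that no longer ships _joblib_parallel_args, and the pinned version below is only a guess that should be verified):

    pip install "scikit-learn==1.0.2"   # hypothetical pin that still provides _joblib_parallel_args
    # or, if a newer deep-forest release has dropped this import:
    pip install --upgrade deep-forest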

    opened by MohammadSoltani100 2
  • [BUG] cannot correctly clone `CascadeForestRegressor` with `sklearn.base.clone` when using customized estimators

    Describe the bug: cannot correctly clone a CascadeForestClassifier/CascadeForestRegressor object with sklearn.base.clone when using customized estimators.

    To Reproduce

    from sklearn.datasets import load_boston
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error
    from sklearn.base import clone
    from deepforest import CascadeForestRegressor
    import xgboost as xgb
    import lightgbm as lgb
    
    X, y = load_boston(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
    model = CascadeForestRegressor(random_state=1)
    
    # set estimator
    n_estimators = 4  # the number of base estimators per cascade layer
    estimators = [lgb.LGBMRegressor(random_state=i)  for i in range(n_estimators)]
    model.set_estimator(estimators)
    
    # set predictor 
    predictor = xgb.XGBRegressor()
    model.set_predictor(predictor)
    
    # clone model
    model_new = clone(model)
    
    # try to fit
    model.fit(X_train, y_train)
    

    Expected behavior No error

    Additional context

    ~/miniconda3/envs/pycaret/lib/python3.8/site-packages/deep_forest-0.1.5-py3.8-linux-x86_64.egg/deepforest/cascade.py in fit(self, X, y, sample_weight)
       1004                 if not hasattr(self, "predictor_"):
       1005                     msg = "Missing predictor after calling `set_predictor`"
    -> 1006                     raise RuntimeError(msg)
       1007 
       1008             binner_ = Binner(
    
    RuntimeError: Missing predictor after calling `set_predictor`
    

    This bug occurs because, when the model has a customized predictor or estimators and is cloned, predictor='custom' is cloned, while self.predictor_ / self.dummy_estimators are not correctly cloned, which introduces the bug described above.

    I think this bug can be easily fixed by putting the predictor and the list of estimators into the parameters of CascadeForestClassifier/CascadeForestRegressor, just like other meta-estimators (e.g. ngboost), but the corresponding APIs might have to change.

    For example, the API parameters could be:

    model = CascadeForestRegressor(
        estimators=[lgb.LGBMRegressor(random_state=i) for i in range(n_estimators)],
        predictor=xgb.XGBRegressor(),
    )
    
    needtriage 
    opened by IncubatorShokuhou 1
  • take() got an unexpected keyword argument 'axis'

    Got an error with the following code:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    from deepforest import CascadeForestClassifier

    model = CascadeForestClassifier(random_state=1)
    model.fit(X_train, y_train)


    TypeError                                 Traceback (most recent call last)
    in
          6
          7 model = CascadeForestClassifier(random_state=1)
    ----> 8 model.fit(X_train, y_train.values.ravel())

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/cascade.py in fit(self, X, y, sample_weight)
       1395         y = self._encode_class_labels(y)
       1396
    -> 1397         super().fit(X, y, sample_weight)
       1398
       1399     def predict_proba(self, X):

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/cascade.py in fit(self, X, y, sample_weight)
        754
        755         # Bin the training data
    --> 756         X_train_ = self._bin_data(binner, X, is_training_data=True)
        757         X_train_ = self.buffer_.cache_data(0, X_train_, is_training_data=True)
        758

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/cascade.py in _bin_data(self, binner, X, is_training_data)
        665         tic = time.time()
        666         if is_training_data:
    --> 667             X_binned = binner.fit_transform(X)
        668         else:
        669             X_binned = binner.transform(X)

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/sklearn/base.py in fit_transform(self, X, y, **fit_params)
        697         if y is None:
        698             # fit method of arity 1 (unsupervised transformation)
    --> 699             return self.fit(X, **fit_params).transform(X)
        700         else:
        701             # fit method of arity 2 (supervised transformation)

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/_binner.py in fit(self, X)
        128         self.validate_params()
        129
    --> 130         self.bin_thresholds = _find_binning_thresholds(
        131             X,
        132             self.n_bins - 1,

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/_binner.py in _find_binning_thresholds(X, n_bins, bin_subsample, bin_type, random_state)
         75     if n_samples > bin_subsample:
         76         subset = rng.choice(np.arange(n_samples), bin_subsample, replace=False)
    ---> 77         X = X.take(subset, axis=0)
         78
         79     binning_thresholds = []

    TypeError: take() got an unexpected keyword argument 'axis'

    The dataset is loaded with vaex; is this a problem particular to vaex?
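
    A hedged workaround sketch while DataFrame-like inputs are unsupported (assumptions: the vaex columns are numeric, the data fits in memory, and the column names below are hypothetical): materialize the features as plain numpy arrays before calling fit.

    import numpy as np
    import vaex

    from deepforest import CascadeForestClassifier

    # hypothetical vaex DataFrame standing in for the reporter's dataset
    df = vaex.from_arrays(
        f1=np.random.rand(100),
        f2=np.random.rand(100),
        label=np.random.randint(0, 2, size=100),
    )

    pdf = df.to_pandas_df()                  # vaex -> pandas
    X_train = pdf[["f1", "f2"]].to_numpy()   # plain numpy arrays are what fit() expects
    y_train = pdf["label"].to_numpy()

    model = CascadeForestClassifier(random_state=1)
    model.fit(X_train, y_train)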

    enhancement 
    opened by JiaLeXian 5
Releases(v0.1.7)
  • v0.1.7(Oct 1, 2022)

  • v0.1.6(Sep 17, 2022)

  • v0.1.5(Apr 16, 2021)

  • v0.1.4(Mar 11, 2021)

    Added

    • Add support on customized estimators (#48) @xuyxu
    • Add official support for ManyLinux-aarch64 (#47) @xuyxu

    Fixed

    • Fix the prediction workflow with only one cascade layer (#56) @xuyxu
    • Fix inconsistency on predictor name (#52) @xuyxu
    • Fix accepted types of target for CascadeForestRegressor (#44) @xuyxu

    Improved

    • Improve target checks for CascadeForestRegressor (#53) @chendingyan
  • v0.1.3(Feb 22, 2021)

    Added

    • Add multi-output support for CascadeForestRegressor (#40) @Alex-Medium
    • Add layer-wise feature importances (#39) @xuyxu
    • Add scikit-learn backend (#36) @xuyxu
    • Add official support for Mac-OS (#34) @T-Allen-sudo
    • Add support on configurable criterion (#28) @tczhao
  • v0.1.2(Feb 11, 2021)

  • v0.1.1(Feb 7, 2021)

    Added

    • Implement the get_forest() method for efficient indexing (#22) @xuyxu
    • Support class label encoding (#18) @NiMaZi
    • Support sample weight in fit() (#7) @tczhao
    • Add configurable predictor parameter (#9) @tczhao
    • Add base class BaseEstimator and ClassifierMixin (#8) @pjgao

    Fixed

    • Fix accepted data types on the binner (#23) @xuyxu
Owner
LAMDA Group, Nanjing University
LAMDA is affiliated with the National Key Laboratory for Novel Software Technology and the Department of Computer Science & Technology, Nanjing University.