Merged
24 commits
76de413
Bump torch from 2.5.1 to 2.7.0
dependabot[bot] May 30, 2025
f626f6d
Bump requests from 2.32.3 to 2.32.4
dependabot[bot] Jun 10, 2025
187c9a9
Bump urllib3 from 2.3.0 to 2.5.0
dependabot[bot] Jun 19, 2025
34242ed
Merge pull request #273 from OpenTabular/dependabot/pip/urllib3-2.5.0
ChrisW09 Jul 2, 2025
9adbd77
Merge pull request #272 from OpenTabular/dependabot/pip/requests-2.32.4
ChrisW09 Jul 2, 2025
c8d9d6d
Merge pull request #270 from OpenTabular/dependabot/pip/torch-2.7.0
ChrisW09 Jul 2, 2025
f0b8a3e
Update pyproject.toml
ChrisW09 Jul 2, 2025
e0eb1c8
Merge pull request #274 from OpenTabular/ChrisW09-patch-1
ChrisW09 Jul 2, 2025
8612e41
Bump aiohttp from 3.11.13 to 3.12.14
dependabot[bot] Jul 15, 2025
a288d8d
Update dataset.py
MaxSchambach Jul 26, 2025
7ade784
Update dataset.py
MaxSchambach Jul 26, 2025
4d2275c
Update dataset.py
MaxSchambach Jul 26, 2025
9e8043f
Update dataset.py
MaxSchambach Jul 26, 2025
07c16a1
Merge pull request #276 from OpenTabular/dependabot/pip/aiohttp-3.12.14
ChrisW09 Aug 6, 2025
f5407d8
Merge pull request #278 from MaxSchambach/fix-dataset
ChrisW09 Aug 6, 2025
34b8273
update name in readme
ChrisW09 Nov 11, 2025
d41819c
update name
ChrisW09 Nov 11, 2025
dc1d90d
Update package naming from Mambular to DeepTabular in docs/homepage.md
ChrisW09 Nov 11, 2025
88c3673
Bump version to 1.6.0 and update package name to deeptabular
ChrisW09 Nov 11, 2025
9a36018
Update all remaining references from mambular to deeptabular
ChrisW09 Nov 11, 2025
4f3b7a6
Fix remaining mambular references in docstrings and tests
ChrisW09 Nov 11, 2025
dc9058c
Fix import in enode_utils.py from mambular to deeptabular
ChrisW09 Nov 11, 2025
4ffe6c8
Update Python version constraint and numpy dependency for Python 3.13…
ChrisW09 Nov 11, 2025
8e8561a
Merge branch 'develop' into feature/rename
mkumar73 Nov 12, 2025
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
@@ -22,7 +22,7 @@ If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. Ubuntu]
- Python version [e.g. 3.8]
- Mambular Version [e.g. 0.1.2]
- DeepTabular Version [e.g. 1.6.0]

**Additional context**
Add any other context about the problem here.
8 changes: 7 additions & 1 deletion .github/workflows/pr-tests.yml
@@ -36,7 +36,7 @@ jobs:
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
poetry install --with dev
poetry install
pip install pytest

- name: Install Package Locally
run: |
poetry build
pip install dist/*.whl # Install the built package to fix "No module named 'deeptabular'"

- name: Run Unit Tests
env:
68 changes: 34 additions & 34 deletions README.md
@@ -2,24 +2,24 @@
<img src="./docs/images/logo/mamba_tabular.jpg" width="400"/>


[![PyPI](https://img.shields.io/pypi/v/mambular)](https://pypi.org/project/mambular)
![PyPI - Downloads](https://img.shields.io/pypi/dm/mambular)
[![docs build](https://readthedocs.org/projects/mambular/badge/?version=latest)](https://mambular.readthedocs.io/en/latest/?badge=latest)
[![docs](https://img.shields.io/badge/docs-latest-blue)](https://mambular.readthedocs.io/en/latest/)
[![open issues](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/basf/mamba-tabular/issues)
[![PyPI](https://img.shields.io/pypi/v/deeptabular)](https://pypi.org/project/deeptabular)
![PyPI - Downloads](https://img.shields.io/pypi/dm/deeptabular)
[![docs build](https://readthedocs.org/projects/deeptabular/badge/?version=latest)](https://deeptabular.readthedocs.io/en/latest/?badge=latest)
[![docs](https://img.shields.io/badge/docs-latest-blue)](https://deeptabular.readthedocs.io/en/latest/)
[![open issues](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/OpenTabular/DeepTabular/issues)


[📘Documentation](https://mambular.readthedocs.io/en/latest/index.html) |
[🛠️Installation](https://mambular.readthedocs.io/en/latest/installation.html) |
[Models](https://mambular.readthedocs.io/en/latest/api/models/index.html) |
[🤔Report Issues](https://github.com/basf/mamba-tabular/issues)
[📘Documentation](https://deeptabular.readthedocs.io/en/latest/index.html) |
[🛠️Installation](https://deeptabular.readthedocs.io/en/latest/installation.html) |
[Models](https://deeptabular.readthedocs.io/en/latest/api/models/index.html) |
[🤔Report Issues](https://github.com/OpenTabular/DeepTabular/issues)
</div>

<div style="text-align: center;">
<h1>Mambular: Tabular Deep Learning Made Simple</h1>
<h1>DeepTabular: Tabular Deep Learning Made Simple</h1>
</div>

Mambular is a Python library for tabular deep learning. It includes models that leverage the Mamba (State Space Model) architecture, as well as other popular models like TabTransformer, FTTransformer, TabM and tabular ResNets. Check out our paper `Mambular: A Sequential Model for Tabular Deep Learning`, available [here](https://arxiv.org/abs/2408.06291). Also check out our paper introducing [TabulaRNN](https://arxiv.org/pdf/2411.17207) and analyzing the efficiency of NLP inspired tabular models.
DeepTabular is a Python library for tabular deep learning. It includes models that leverage the Mamba (State Space Model) architecture, as well as other popular models like TabTransformer, FTTransformer, TabM, and tabular ResNets. Check out our paper `Mambular: A Sequential Model for Tabular Deep Learning`, available [here](https://arxiv.org/abs/2408.06291). Also check out our paper introducing [TabulaRNN](https://arxiv.org/pdf/2411.17207) and analyzing the efficiency of NLP-inspired tabular models.

<h3>⚡ What's New ⚡</h3>
<ul>
@@ -48,10 +48,10 @@ Mambular is a Python library for tabular deep learning. It includes models that


# 🏃 Quickstart
Similar to any sklearn model, Mambular models can be fit as easy as this:
Similar to any sklearn model, DeepTabular models can be fit as easily as this:

```python
from mambular.models import MambularClassifier
from deeptabular.models import MambularClassifier
# Initialize and fit your model
model = MambularClassifier()

@@ -60,7 +60,7 @@ model.fit(X, y, max_epochs=150, lr=1e-04)
```

# 📖 Introduction
Mambular is a Python package that brings the power of advanced deep learning architectures to tabular data, offering a suite of models for regression, classification, and distributional regression tasks. Designed with ease of use in mind, Mambular models adhere to scikit-learn's `BaseEstimator` interface, making them highly compatible with the familiar scikit-learn ecosystem. This means you can fit, predict, and evaluate using Mambular models just as you would with any traditional scikit-learn model, but with the added performance and flexibility of deep learning.
DeepTabular is a Python package that brings the power of advanced deep learning architectures to tabular data, offering a suite of models for regression, classification, and distributional regression tasks. Designed with ease of use in mind, DeepTabular models adhere to scikit-learn's `BaseEstimator` interface, making them highly compatible with the familiar scikit-learn ecosystem. This means you can fit, predict, and evaluate using DeepTabular models just as you would with any traditional scikit-learn model, but with the added performance and flexibility of deep learning.
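As a minimal sketch of that workflow (assuming a feature matrix `X` and target `y` are already loaded as in the Quickstart; the split and metric below are only for illustration):

```python
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from deeptabular.models import MambularClassifier

# Split, fit, and evaluate exactly like any scikit-learn estimator
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = MambularClassifier()
model.fit(X_train, y_train, max_epochs=10)

preds = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, preds))
```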


# 🤖 Models
@@ -94,13 +94,13 @@ Hence, they are available as e.g. `MambularRegressor`, `MambularClassifier` or `

# 📚 Documentation

You can find the Mamba-Tabular API documentation [here](https://mambular.readthedocs.io/en/latest/).
You can find the DeepTabular API documentation [here](https://deeptabular.readthedocs.io/en/latest/).

# 🛠️ Installation

Install Mambular using pip:
Install DeepTabular using pip:
```sh
pip install mambular
pip install deeptabular
```

If you want to use the original mamba and mamba2 implementations, additionally install mamba-ssm via:
@@ -120,7 +120,7 @@ pip install mamba-ssm

<h2> Preprocessing </h2>

Mambular uses pretab preprocessing: https://github.com/OpenTabular/PreTab
DeepTabular uses pretab preprocessing: https://github.com/OpenTabular/PreTab

Hence, data types etc. are detected automatically, and all preprocessing methods from pretab as well as from `sklearn.preprocessing` are available.
Additionally, you can specify that each feature is preprocessed differently, according to your requirements, by setting the `feature_preprocessing={}` argument during model initialization.
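For example, a minimal sketch of per-feature preprocessing (the column names are hypothetical, the mapping assumes each entry takes a method name as a string, and only `"ple"` appears elsewhere in this README — the other method names are assumptions):

```python
from deeptabular.models import MambularRegressor

# Hypothetical columns: "age" and "income" are numerical, "city" is categorical.
model = MambularRegressor(
    feature_preprocessing={
        "age": "ple",                 # piecewise linear encoding, used elsewhere in this README
        "income": "standardization",  # assumed pretab/sklearn method name
        "city": "one-hot",            # assumed categorical encoding name
    }
)
model.fit(X_train, y_train, max_epochs=50)
```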
@@ -144,10 +144,10 @@ For an overview over all available methods: [pretab](https://github.com/OpenTabu


<h2> Fit a Model </h2>
Fitting a model in mambular is as simple as it gets. All models in mambular are sklearn BaseEstimators. Thus the `.fit` method is implemented for all of them. Additionally, this allows for using all other sklearn inherent methods such as their built in hyperparameter optimization tools.
Fitting a model in deeptabular is as simple as it gets. All models in deeptabular are sklearn BaseEstimators, so the `.fit` method is implemented for all of them. Additionally, this allows for using all other inherent sklearn methods, such as their built-in hyperparameter optimization tools.

```python
from mambular.models import MambularClassifier
from deeptabular.models import MambularClassifier
# Initialize and fit your model
model = MambularClassifier(
d_model=64,
@@ -243,12 +243,12 @@ Or use the built-in bayesian hpo simply by running:
best_params = model.optimize_hparams(X, y)
```

This automatically sets the search space based on the default config from ``mambular.configs``. See the documentation for all params with regard to ``optimize_hparams()``. However, the preprocessor arguments are fixed and cannot be optimized here.
This automatically sets the search space based on the default config from ``deeptabular.configs``. See the documentation for all parameters of ``optimize_hparams()``. However, the preprocessor arguments are fixed and cannot be optimized here.
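Since the estimators follow the scikit-learn API, the standard scikit-learn search utilities can be used as an alternative — a minimal sketch, assuming `d_model` is exposed as a constructor parameter (as in the fitting example above):

```python
from sklearn.model_selection import GridSearchCV
from deeptabular.models import MambularClassifier

param_grid = {"d_model": [32, 64]}  # constructor parameters to search over

search = GridSearchCV(
    MambularClassifier(),
    param_grid,
    scoring="accuracy",  # explicit scorer; relies only on .predict
    cv=3,
)
search.fit(X_train, y_train)
print(search.best_params_)
```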


<h2> ⚖️ Distributional Regression with MambularLSS </h2>

MambularLSS allows you to model the full distribution of a response variable, not just its mean. This is crucial when understanding variability, skewness, or kurtosis is important. All Mambular models are available as distributional models.
MambularLSS allows you to model the full distribution of a response variable, not just its mean. This is crucial when understanding variability, skewness, or kurtosis is important. All DeepTabular models are available as distributional models.

<h3> Key Features of MambularLSS: </h3>

@@ -277,10 +277,10 @@ These distribution classes make MambularLSS versatile in modeling various data t

<h3> Getting Started with MambularLSS: </h3>

To integrate distributional regression into your workflow with `MambularLSS`, start by initializing the model with your desired configuration, similar to other Mambular models:
To integrate distributional regression into your workflow with `MambularLSS`, start by initializing the model with your desired configuration, similar to other DeepTabular models:

```python
from mambular.models import MambularLSS
from deeptabular.models import MambularLSS

# Initialize the MambularLSS model
model = MambularLSS(
@@ -305,18 +305,18 @@ model.fit(

# 💻 Implement Your Own Model

Mambular allows users to easily integrate their custom models into the existing logic. This process is designed to be straightforward, making it simple to create a PyTorch model and define its forward pass. Instead of inheriting from `nn.Module`, you inherit from Mambular's `BaseModel`. Each Mambular model takes three main arguments: the number of classes (e.g., 1 for regression or 2 for binary classification), `cat_feature_info`, and `num_feature_info` for categorical and numerical feature information, respectively. Additionally, you can provide a config argument, which can either be a custom configuration or one of the provided default configs.
DeepTabular allows users to easily integrate their custom models into the existing logic. This process is designed to be straightforward, making it simple to create a PyTorch model and define its forward pass. Instead of inheriting from `nn.Module`, you inherit from DeepTabular's `BaseModel`. Each DeepTabular model takes three main arguments: the number of classes (e.g., 1 for regression or 2 for binary classification), `cat_feature_info`, and `num_feature_info` for categorical and numerical feature information, respectively. Additionally, you can provide a config argument, which can either be a custom configuration or one of the provided default configs.

One of the key advantages of using Mambular is that the inputs to the forward passes are lists of tensors. While this might be unconventional, it is highly beneficial for models that treat different data types differently. For example, the TabTransformer model leverages this feature to handle categorical and numerical data separately, applying different transformations and processing steps to each type of data.
One of the key advantages of using DeepTabular is that the inputs to the forward passes are lists of tensors. While this might be unconventional, it is highly beneficial for models that treat different data types differently. For example, the TabTransformer model leverages this feature to handle categorical and numerical data separately, applying different transformations and processing steps to each type of data.

Here's how you can implement a custom model with Mambular:
Here's how you can implement a custom model with DeepTabular:

1. **First, define your config:**
The configuration class allows you to specify hyperparameters and other settings for your model. This can be done using a simple dataclass.

```python
from dataclasses import dataclass
from mambular.configs import BaseConfig
from deeptabular.configs import BaseConfig

@dataclass
class MyConfig(BaseConfig):
@@ -332,8 +332,8 @@ Here's how you can implement a custom model with Mambular:
Define your custom model just as you would for an `nn.Module`. The main difference is that you will inherit from `BaseModel` and use the provided feature information to construct your layers. To integrate your model into the existing API, you only need to define the architecture and the forward pass.

```python
from mambular.base_models.utils import BaseModel
from mambular.utils.get_feature_dimensions import get_feature_dimensions
from deeptabular.base_models.utils import BaseModel
from deeptabular.utils.get_feature_dimensions import get_feature_dimensions
import torch
import torch.nn

@@ -372,19 +372,19 @@ Here's how you can implement a custom model with Mambular:
return output
```

3. **Leverage the Mambular API:**
You can build a regression, classification, or distributional regression model that can leverage all of Mambular's built-in methods by using the following:
3. **Leverage the DeepTabular API:**
You can build a regression, classification, or distributional regression model that can leverage all of DeepTabular's built-in methods by using the following:

```python
from mambular.models.utils import SklearnBaseRegressor
from deeptabular.models.utils import SklearnBaseRegressor

class MyRegressor(SklearnBaseRegressor):
def __init__(self, **kwargs):
super().__init__(model=MyCustomModel, config=MyConfig, **kwargs)
```

4. **Train and evaluate your model:**
You can now fit, evaluate, and predict with your custom model just like with any other Mambular model. For classification or distributional regression, inherit from `SklearnBaseClassifier` or `SklearnBaseLSS` respectively.
You can now fit, evaluate, and predict with your custom model just like with any other DeepTabular model. For classification or distributional regression, inherit from `SklearnBaseClassifier` or `SklearnBaseLSS` respectively.

```python
regressor = MyRegressor(numerical_preprocessing="ple")
File renamed without changes.
2 changes: 1 addition & 1 deletion mambular/__version__.py → deeptabular/__version__.py
@@ -17,5 +17,5 @@

# The following line *must* be the last in the module, exactly as formatted:

__version__ = "1.5.0"
__version__ = "1.6.0"

File renamed without changes.
File renamed without changes.
@@ -1,7 +1,7 @@
import torch
import torch.nn as nn
import torch.nn.functional as F
from mambular.arch_utils.layer_utils.sparsemax import sparsemax, sparsemoid
from deeptabular.arch_utils.layer_utils.sparsemax import sparsemax, sparsemoid
from .data_aware_initialization import ModuleWithInit
from .numpy_utils import check_numpy
import numpy as np
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
@@ -24,6 +24,8 @@ def __init__(
labels=None,
regression=True,
):
assert cat_features_list or num_features_list

self.cat_features_list = cat_features_list # Categorical features tensors
self.num_features_list = num_features_list # Numerical features tensors
self.embeddings_list = embeddings_list # Embeddings tensors (optional)
@@ -44,7 +44,8 @@ def __init__(
self.labels = None # No labels in prediction mode

def __len__(self):
return len(self.num_features_list[0]) # Use numerical features length
_feats = self.num_features_list if self.num_features_list else self.cat_features_list
return len(_feats[0])

def __getitem__(self, idx):
"""Retrieves the features and label for a given index.
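The hunk above makes the dataset length robust when only categorical features are supplied. A brief sketch of the fallback logic with hypothetical tensor shapes (only the indexing in `__len__` is reproduced here):

```python
import torch

# Dataset built from categorical features only: one column, 32 rows (hypothetical shapes).
cat_features_list = [torch.randint(0, 5, (32,))]
num_features_list = None

# Old behaviour: len(num_features_list[0]) raises, since num_features_list is None.
# New behaviour: fall back to whichever feature list is present.
_feats = num_features_list if num_features_list else cat_features_list
print(len(_feats[0]))  # 32
```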
File renamed without changes.
6 changes: 3 additions & 3 deletions mambular/models/autoint.py → deeptabular/models/autoint.py
@@ -15,7 +15,7 @@ class and uses the AutoInt model with the default AutoInt
configuration.
""",
examples="""
>>> from mambular.models import AutoIntRegressor
>>> from deeptabular.models import AutoIntRegressor
>>> model = AutoIntRegressor(d_model=64, n_layers=8)
>>> model.fit(X_train, y_train)
>>> preds = model.predict(X_test)
@@ -33,7 +33,7 @@ class AutoIntClassifier(SklearnBaseClassifier):
"""AutoInt Classifier. This class extends the SklearnBaseClassifier class
and uses the AutoInt model with the default AutoInt configuration.""",
examples="""
>>> from mambular.models import AutoIntClassifier
>>> from deeptabular.models import AutoIntClassifier
>>> model = AutoIntClassifier(d_model=64, n_layers=8)
>>> model.fit(X_train, y_train)
>>> preds = model.predict(X_test)
@@ -52,7 +52,7 @@ class AutoIntLSS(SklearnBaseLSS):
This class extends the SklearnBaseLSS class and uses the
AutoInt model with the default AutoInt configuration.""",
examples="""
>>> from mambular.models import AutoIntLSS
>>> from deeptabular.models import AutoIntLSS
>>> model = AutoIntLSS(d_model=64, n_layers=8)
>>> model.fit(X_train, y_train, family="normal")
>>> preds = model.predict(X_test)
6 changes: 3 additions & 3 deletions mambular/models/enode.py → deeptabular/models/enode.py
@@ -14,7 +14,7 @@ class ENODERegressor(SklearnBaseRegressor):
with the default ENODE configuration.
""",
examples="""
>>> from mambular.models import ENODERegressor
>>> from deeptabular.models import ENODERegressor
>>> model = ENODERegressor()
>>> model.fit(X_train, y_train)
>>> preds = model.predict(X_test)
@@ -35,7 +35,7 @@ class ENODEClassifier(SklearnBaseClassifier):
with the default ENODE configuration.
""",
examples="""
>>> from mambular.models import ENODEClassifier
>>> from deeptabular.models import ENODEClassifier
>>> model = ENODEClassifier()
>>> model.fit(X_train, y_train)
>>> preds = model.predict(X_test)
@@ -56,7 +56,7 @@ class ENODELSS(SklearnBaseLSS):
with the default ENODE configuration.
""",
examples="""
>>> from mambular.models import ENODELSS
>>> from deeptabular.models import ENODELSS
>>> model = ENODELSS()
>>> model.fit(X_train, y_train, family='normal')
>>> preds = model.predict(X_test)
@@ -15,7 +15,7 @@ class and uses the FTTransformer model with the default FTTransformer
configuration.
""",
examples="""
>>> from mambular.models import FTTransformerRegressor
>>> from deeptabular.models import FTTransformerRegressor
>>> model = FTTransformerRegressor(d_model=64, n_layers=8)
>>> model.fit(X_train, y_train)
>>> preds = model.predict(X_test)
@@ -35,7 +35,7 @@ class FTTransformerClassifier(SklearnBaseClassifier):
"""FTTransformer Classifier. This class extends the SklearnBaseClassifier class
and uses the FTTransformer model with the default FTTransformer configuration.""",
examples="""
>>> from mambular.models import FTTransformerClassifier
>>> from deeptabular.models import FTTransformerClassifier
>>> model = FTTransformerClassifier(d_model=64, n_layers=8)
>>> model.fit(X_train, y_train)
>>> preds = model.predict(X_test)
@@ -56,7 +56,7 @@ class FTTransformerLSS(SklearnBaseLSS):
This class extends the SklearnBaseLSS class and uses the
FTTransformer model with the default FTTransformer configuration.""",
examples="""
>>> from mambular.models import FTTransformerLSS
>>> from deeptabular.models import FTTransformerLSS
>>> model = FTTransformerLSS(d_model=64, n_layers=8)
>>> model.fit(X_train, y_train, family="normal")
>>> preds = model.predict(X_test)
File renamed without changes.
@@ -14,7 +14,7 @@ class MambAttentionRegressor(SklearnBaseRegressor):
with the default MambAttention configuration.
""",
examples="""
>>> from mambular.models import MambAttentionRegressor
>>> from deeptabular.models import MambAttentionRegressor
>>> model = MambAttentionRegressor(d_model=64, n_layers=8)
>>> model.fit(X_train, y_train)
>>> preds = model.predict(X_test)