Text using ChatGPT, image from DALL-E, text-to-speech from D-ID
Denoising Autoencoders for Tabular Data
Explaining Financial Anomalies

- Initial paper: https://arxiv.org/pdf/2209.10658.pdf
- Code: https://github.com/topics/denoising-autoencoders
- Kaggle example: Kaggle notebook
- Bundesbank (2023) use case: paper
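To make the technique behind these links concrete, here is a minimal numpy sketch of a denoising autoencoder for tabular data used for anomaly scoring: the input rows are corrupted with Gaussian noise, a small network learns to reconstruct the clean rows, and the per-row reconstruction error serves as the anomaly score. All data, layer sizes, and hyperparameters are toy assumptions for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy tabular data: 200 rows, 8 numeric features (hypothetical).
X = rng.normal(size=(200, 8))

# One-hidden-layer autoencoder.
d_in, d_hid = 8, 4
W1 = rng.normal(scale=0.1, size=(d_in, d_hid)); b1 = np.zeros(d_hid)
W2 = rng.normal(scale=0.1, size=(d_hid, d_in)); b2 = np.zeros(d_in)

def forward(Xn):
    H = np.tanh(Xn @ W1 + b1)
    return H, H @ W2 + b2

lr, noise = 0.5, 0.1
losses = []
for epoch in range(300):
    Xn = X + noise * rng.normal(size=X.shape)  # corrupt the input ("denoising")
    H, Xhat = forward(Xn)
    err = Xhat - X                             # but reconstruct the CLEAN data
    losses.append((err ** 2).mean())
    # Plain gradient descent on the MSE.
    dXhat = 2 * err / err.size
    dW2 = H.T @ dXhat; db2 = dXhat.sum(0)
    dH = (dXhat @ W2.T) * (1 - H ** 2)         # tanh derivative
    dW1 = Xn.T @ dH; db1 = dH.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Anomaly score = per-row reconstruction error: rows the model
# reconstructs poorly are flagged as anomalous.
_, Xhat = forward(X)
score = ((Xhat - X) ** 2).mean(axis=1)
```

In practice one would use a deep framework (PyTorch, Keras) and threshold `score` on held-out data; the sketch only shows the corrupt-reconstruct-score loop.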
Revisiting Deep Learning Models for Tabular Data

- Paper: https://arxiv.org/pdf/2106.11959v2.pdf
- PyTorch code: https://github.com/lucidrains/tab-transformer-pytorch
- Alternative library: implementation of TabTransformer in TensorFlow and Keras
- Kaggle example: Kaggle TabTransformer
- Notebook: notebook in Keras
- Keras implementation: Keras Implementation
- Keras code: keras-team code
TabTransformer: Tabular Data Modeling Using Contextual Embeddings
The main idea of the paper is that the performance of a regular multi-layer perceptron (MLP) can be significantly improved by using Transformers to turn regular categorical embeddings into contextual ones.
TabTransformer is built on self-attention-based Transformers. The Transformer layers transform the embeddings of categorical features into robust contextual embeddings to achieve higher prediction accuracy.

Missing Data Imputation
EDA and AutoML
Notebook examples of automated exploratory data analysis (EDA) and AutoML

- Classification: Classification Notebook
- Regression: Regression Notebook
Dealing with Imbalanced Data
Python module to perform under-sampling and over-sampling with various techniques: imbalanced-learn
- Library: https://imbalanced-learn.org/
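To illustrate the simplest of these techniques, here is a numpy sketch of random over-sampling (duplicating minority-class rows until classes are balanced), which is the idea behind imbalanced-learn's `RandomOverSampler`. The dataset is a toy assumption; the library call shown in the comment is the real API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced binary dataset: 90 majority / 10 minority rows.
X = rng.normal(size=(100, 4))
y = np.array([0] * 90 + [1] * 10)

def random_oversample(X, y, rng):
    """Duplicate minority-class rows (sampling with replacement)
    until every class has as many rows as the largest one."""
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = []
    for c, n in zip(classes, counts):
        members = np.flatnonzero(y == c)
        extra = rng.choice(members, size=n_max - n, replace=True)
        idx.extend(members)
        idx.extend(extra)
    idx = np.array(idx)
    return X[idx], y[idx]

X_res, y_res = random_oversample(X, y, rng)
# With the library itself, the equivalent call would be:
#   from imblearn.over_sampling import RandomOverSampler
#   X_res, y_res = RandomOverSampler().fit_resample(X, y)
```

The library additionally offers synthetic over-sampling (e.g. SMOTE) and various under-sampling strategies behind the same `fit_resample` interface.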