Explaining Financial Anomalies
- Paper: https://arxiv.org/pdf/2106.11959v2.pdf
- PyTorch code: https://github.com/lucidrains/tab-transformer-pytorch
- Alternate library: implementation of TabTransformer in TensorFlow and Keras
- Kaggle example: kaggle tabtransformer
- Notebook: notebook in Keras
- Keras implementation code: Keras Implementation
- Keras code: keras-team code
The main idea in the paper is that the performance of a regular Multi-Layer Perceptron (MLP) can be significantly improved by using Transformers to turn regular categorical embeddings into contextual ones.
The TabTransformer is built upon self-attention-based Transformers. The Transformer layers transform the embeddings of categorical features into robust contextual embeddings to achieve higher prediction accuracy.
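The idea above can be sketched in a few lines of PyTorch: embed each categorical column, contextualize the embeddings with a Transformer encoder, then feed them (together with the continuous features) to an MLP head. This is a minimal illustrative sketch, not the reference implementation from the linked repositories; all class and parameter names here are made up for the example.

```python
import torch
import torch.nn as nn

class TabTransformerSketch(nn.Module):
    """Hypothetical minimal sketch of the TabTransformer idea."""

    def __init__(self, cardinalities, num_continuous, dim=32, depth=2,
                 heads=4, num_classes=2):
        super().__init__()
        # One embedding table per categorical column.
        self.embeds = nn.ModuleList(nn.Embedding(c, dim) for c in cardinalities)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=depth)
        self.norm = nn.LayerNorm(num_continuous)
        # MLP head consumes the contextual embeddings plus the
        # layer-normalized continuous features.
        self.mlp = nn.Sequential(
            nn.Linear(dim * len(cardinalities) + num_continuous, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x_cat, x_cont):
        # (batch, n_cat, dim): one token per categorical feature.
        tokens = torch.stack(
            [emb(x_cat[:, i]) for i, emb in enumerate(self.embeds)], dim=1)
        # Self-attention turns the static embeddings into contextual ones.
        contextual = self.transformer(tokens)
        flat = contextual.flatten(1)  # (batch, n_cat * dim)
        out = torch.cat([flat, self.norm(x_cont)], dim=1)
        return self.mlp(out)

# Toy usage: two categorical columns (cardinalities 10 and 6),
# three continuous features, batch of 4 rows.
model = TabTransformerSketch(cardinalities=[10, 6], num_continuous=3)
logits = model(torch.randint(0, 6, (4, 2)), torch.randn(4, 3))
print(logits.shape)  # one logit vector per row
```

The key design choice, following the paper's description, is that only the categorical embeddings pass through the Transformer; continuous features bypass it and are simply normalized and concatenated before the MLP.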
Are deep learning models superior?