TinyTimeMixers

TinyTimeMixers (TTMs) are compact pre-trained models for multivariate time-series forecasting, open-sourced by IBM Research. With fewer than 1 million parameters, TTM introduces the notion of the first-ever “tiny” pre-trained models for time-series forecasting.

https://huggingface.co/ibm/TTM

TTM outperforms several popular models demanding billions of parameters in zero-shot and few-shot forecasting. TTMs are lightweight forecasters, pre-trained on publicly available time-series data with various augmentations. TTM provides state-of-the-art zero-shot forecasts and can easily be fine-tuned for multivariate forecasting, remaining competitive with as little as 5% of the training data.
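A minimal zero-shot sketch. This assumes IBM's public tsfm toolkit (the tsfm_public package) along with the class name, checkpoint id, and context/forecast lengths from the TTM model card (512-step history, 96-step horizon); these details may differ between revisions:

```python
import torch

# Assumption: tsfm_public is IBM's public tsfm toolkit; the class name,
# checkpoint id, and shapes follow the TTM model card and getting-started
# material and may differ between revisions.
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

model = TinyTimeMixerForPrediction.from_pretrained("ibm/TTM", revision="main")

# The main TTM checkpoint maps a 512-step context to a 96-step forecast.
past_values = torch.randn(1, 512, 3)  # (batch, context_length, num_channels)
with torch.no_grad():
    out = model(past_values=past_values)

print(out.prediction_outputs.shape)  # expected: (1, 96, 3)
```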

Chronos: Learning the Language of Time Series

  • Chronos is a framework designed for pretrained probabilistic time series models.
  • It utilizes scaling and quantization to tokenize time series values into a fixed vocabulary.
  • Chronos trains transformer-based language model architectures (specifically, models from the T5 family with parameters ranging from 20M to 710M) using cross-entropy loss.
  • The models are pretrained on a mix of publicly available datasets and a synthetic dataset generated via Gaussian processes, enhancing generalization.
  • In a comprehensive benchmark of 42 datasets, covering both classical local models and deep learning approaches, Chronos models (a) significantly outperform other methods on datasets included in the training corpus, and (b) show comparable, and occasionally superior, zero-shot performance on new datasets relative to methods trained specifically on them.
  • These results demonstrate the potential of pretrained models to leverage time series data across various domains for improving zero-shot accuracy on unseen forecasting tasks, suggesting a simplified approach to forecasting pipelines.

https://arxiv.org/pdf/2403.07815.pdf

https://github.com/amazon-science/chronos-forecasting/
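A minimal zero-shot sketch following the repository's README; the CSV path and column name are placeholders:

```python
import pandas as pd
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Load a pretrained Chronos checkpoint (T5 family).
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",          # use "cuda" if a GPU is available
    torch_dtype=torch.float32,
)

df = pd.read_csv("air_passengers.csv")  # placeholder univariate series

# Chronos returns probabilistic forecasts as sample paths:
# shape (batch, num_samples, prediction_length).
forecast = pipeline.predict(
    context=torch.tensor(df["passengers"].values),
    prediction_length=12,
    num_samples=20,
)

# Summarize the sample paths into quantile bands.
low, median, high = torch.quantile(
    forecast[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0
)
```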

Unified Time Series Model

UniTS is a unified time series model that can process various tasks across multiple domains with shared parameters and does not have any task-specific modules.

Foundation models, especially LLMs, are profoundly transforming deep learning. Instead of training many task-specific models, we can adapt a single pretrained model to many tasks via few-shot prompting or fine-tuning. However, current foundation models apply to sequence data but not to time series, which pose unique challenges: inherently diverse, multi-domain datasets; diverging task specifications across forecasting, classification, and other task types; and an apparent need for task-specialized models.

We developed UniTS, a unified time series model that supports a universal task specification, accommodating classification, forecasting, imputation, and anomaly detection tasks. This is achieved through a novel unified network backbone, which incorporates sequence and variable attention along with a dynamic linear operator and is trained as a unified model. 
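To make the backbone idea concrete, here is a conceptual sketch of attention applied along both the time axis and the variable axis. This illustrates sequence and variable attention in general, not the authors' implementation, and it omits the paper's dynamic linear operator and other details:

```python
import torch
import torch.nn as nn

class DualAttentionBlock(nn.Module):
    """Illustrative block: self-attention along the time axis, then along
    the variable axis. Input shape: (batch, variables, time, d_model)."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.time_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.var_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, v, t, d = x.shape
        # Sequence attention: attend across time steps within each variable.
        h = x.reshape(b * v, t, d)
        h = self.norm1(h + self.time_attn(h, h, h)[0])
        h = h.reshape(b, v, t, d)
        # Variable attention: attend across variables at each time step.
        g = h.permute(0, 2, 1, 3).reshape(b * t, v, d)
        g = self.norm2(g + self.var_attn(g, g, g)[0])
        return g.reshape(b, t, v, d).permute(0, 2, 1, 3)

x = torch.randn(8, 5, 96, 64)        # 8 samples, 5 variables, 96 time steps
y = DualAttentionBlock(64, 4)(x)     # same shape out: (8, 5, 96, 64)
```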

Across 38 multi-domain datasets, UniTS demonstrates superior performance compared to task-specific models and repurposed natural language-based LLMs. UniTS exhibits remarkable zero-shot, few-shot, and prompt learning capabilities when evaluated on new data domains and tasks. The source code and datasets are available at the links below.

https://arxiv.org/pdf/2403.00131v1.pdf

https://zitniklab.hms.harvard.edu/projects/UniTS/

https://github.com/mims-harvard/UniTS

Unified Training of Universal Time Series Forecasting Transformers

  • Deep learning for time series forecasting traditionally uses a one-model-per-dataset approach, limiting potential advancements.
  • Universal forecasting introduces the idea of pre-training a single Large Time Series Model on a vast collection of datasets for diverse tasks.
  • Challenges in creating such a model include cross-frequency learning, handling multivariate series with arbitrary numbers of variates, and the varying distributional properties of large-scale data.
  • To overcome these challenges, novel enhancements to the time series Transformer architecture are introduced, creating the Masked EncOder-based UnIveRsAl TIme Series Forecasting Transformer (MOIRAI).
  • MOIRAI is trained on the Large-scale Open Time Series Archive (LOTSA), which contains over 27 billion observations across nine domains.
  • MOIRAI demonstrates competitive or superior performance as a zero-shot forecaster compared to full-shot models.

https://arxiv.org/pdf/2402.02592.pdf
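Salesforce distributes MOIRAI through its uni2ts library, with checkpoints on the Hugging Face Hub. A rough zero-shot sketch, assuming the MoiraiForecast/MoiraiModule interface and the checkpoint id from the repository README; argument names may vary across versions:

```python
from uni2ts.model.moirai import MoiraiForecast, MoiraiModule  # pip install uni2ts

# Assumed API: wrap a pretrained MOIRAI module in a forecasting head.
model = MoiraiForecast(
    module=MoiraiModule.from_pretrained("Salesforce/moirai-1.0-R-small"),
    prediction_length=24,         # forecast horizon
    context_length=200,           # history given to the model
    patch_size="auto",            # MOIRAI tokenizes series into patches
    num_samples=100,              # sample paths for probabilistic output
    target_dim=1,                 # univariate target
    feat_dynamic_real_dim=0,      # no known future covariates
    past_feat_dynamic_real_dim=0,
)

# The model exposes a GluonTS-style predictor over a dataset.
predictor = model.create_predictor(batch_size=32)
```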

Mastering Time Series Forecasting: A Guide to Python’s Most Influential Libraries


The Python ecosystem offers a rich suite of libraries for time series forecasting. Each caters to different needs and comes with its own community and popularity, often reflected in its number of GitHub stars. Here’s a rundown of the top libraries, their best use cases, and resources for learning more:

  1. Prophet (Facebook): Best for business time series with strong seasonality and holiday effects; approachable for non-specialists (see the usage sketch after this list).
  2. pmdarima: Best for automated ARIMA order selection via auto_arima, a Python counterpart to R's auto.arima.
  3. Skforecast: Best for turning scikit-learn regressors into recursive and direct multi-step forecasters.
  4. Greykite (LinkedIn): Best for interpretable, production-oriented forecasting with its Silverkite algorithm.
  5. Functime: Best for fast feature engineering and forecasting over large panels of series, built on Polars.
  6. Arch: Best for financial econometrics, in particular volatility modeling with ARCH/GARCH-family models.
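To illustrate the API style shared by several of these libraries, here is a minimal Prophet sketch; the CSV path and column names are placeholders:

```python
import pandas as pd
from prophet import Prophet  # pip install prophet

# Prophet expects a DataFrame with columns "ds" (datestamp) and "y" (value).
df = pd.read_csv("sales.csv").rename(columns={"date": "ds", "sales": "y"})

m = Prophet(yearly_seasonality=True, weekly_seasonality=True)
m.fit(df)

# Extend the timeline 30 days past the training data and forecast.
future = m.make_future_dataframe(periods=30)
forecast = m.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```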

Nixtla’s Suite:

  • StatsForecast:
      • Best for: Rapid computations and high-performance univariate time series forecasting (usage sketch after this list).
      • GitHub Stars: Check Latest
      • Best Article: Nixtla Official Page
  • mlforecast:
      • Best for: Distributed computing environments needing feature engineering at scale.
      • GitHub Stars: Check Latest
  • NeuralForecast:
      • Best for: Leveraging neural networks for time series forecasting, suitable for non-experts.
      • GitHub Stars: Check Latest
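A short StatsForecast sketch using the library's long-format convention (unique_id, ds, y columns); the series below is placeholder data, and newer library versions accept the DataFrame at forecast time as shown:

```python
import pandas as pd
from statsforecast import StatsForecast
from statsforecast.models import AutoARIMA  # pip install statsforecast

# Long format: one row per (series id, timestamp); placeholder data.
df = pd.DataFrame({
    "unique_id": ["series_1"] * 48,
    "ds": pd.date_range("2020-01-01", periods=48, freq="MS"),
    "y": range(48),
})

sf = StatsForecast(models=[AutoARIMA(season_length=12)], freq="MS")
forecast = sf.forecast(df=df, h=12)  # 12 steps ahead for each series
print(forecast.head())
```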

Transformers for Time Series:

This curated guide aims to illuminate the path for those exploring the varied landscape of time series forecasting, providing a compass to the tools that resonate most with your project.


Time Series Made Easy in Python: DARTS

Darts is a Python library for user-friendly forecasting and anomaly detection on time series. It contains a variety of models, from classics such as ARIMA to deep neural networks.

Some of the key features of Darts include:

  • A simple and intuitive interface for defining and fitting models
  • Support for different types of time series data, including univariate, multivariate, and panel data
  • A wide range of built-in models, including ARIMA, Exponential Smoothing, Prophet, LSTM, and TCN
  • Tools for hyperparameter tuning and model selection, such as cross-validation and grid search
  • Visualization tools for exploring and analyzing time series data and model outputs
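A minimal sketch following the pattern in the Darts README; the CSV path and column names are placeholders:

```python
import pandas as pd
from darts import TimeSeries
from darts.models import ExponentialSmoothing  # pip install darts

# Build a TimeSeries from a DataFrame: a time column plus a value column.
df = pd.read_csv("air_passengers.csv")
series = TimeSeries.from_dataframe(df, time_col="Month", value_cols="Passengers")

# Hold out the last 36 points for validation.
train, val = series[:-36], series[-36:]

model = ExponentialSmoothing()
model.fit(train)

# num_samples > 1 yields a probabilistic (sampled) forecast.
prediction = model.predict(len(val), num_samples=1000)
```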

Darts’ model library: the project’s README tracks, for each model, support for univariate and multivariate series, probabilistic forecasting, multiple series (global models), past-observed covariates, future-known covariates, and static covariates. The available models include:

  • ARIMA
  • VARIMA
  • AutoARIMA
  • StatsForecastAutoARIMA (a faster AutoARIMA, from Nixtla's statsforecast)
  • ExponentialSmoothing
  • StatsForecastETS (from Nixtla's statsforecast)
  • BATS and TBATS (TBATS paper)
  • Theta and FourTheta (Theta & 4 Theta)
  • Prophet (see install notes; Prophet repo)
  • FFT (Fast Fourier Transform)
  • KalmanForecaster, using the Kalman filter and N4SID for system identification (N4SID paper)
  • Croston method
  • RegressionModel, a generic wrapper around any sklearn regression model
  • RandomForest
  • LinearRegressionModel
  • LightGBMModel
  • CatBoostModel
  • XGBModel
  • RNNModel (incl. LSTM and GRU), equivalent to DeepAR in its probabilistic version (DeepAR paper)
  • BlockRNNModel (incl. LSTM and GRU)
  • NBEATSModel (N-BEATS paper)
  • NHiTSModel (N-HiTS paper)
  • TCNModel (TCN paper, DeepTCN paper, blog post)
  • TransformerModel
  • TFTModel (Temporal Fusion Transformer; TFT paper, PyTorch Forecasting)
  • DLinearModel (DLinear paper)
  • NLinearModel (NLinear paper)
  • Naive Baselines