

Poster in Workshop: Workshop on Robustness of Zero/Few-Shot Learning in Foundation Models (R0-FoMo)

Lag-Llama: Towards Time-Series Foundation Models

Kashif Rasul · Arjun Ashok · Marin Biloš · Andrew Williams · Arian Khorasani · George Adamopoulos · Rishika Bhagwatkar · Hena Ghonia · Nadhir Hassen · Anderson Schneider · Sahil Garg · Alexandre Drouin · Nicolas Chapados · Yuriy Nevmyvaka · Irina Rish


Abstract:

Aiming to build foundation models for time-series forecasting and to study their scaling behavior, we present our work in progress on Lag-Llama, a general-purpose univariate probabilistic time-series forecasting model trained on a large collection of time-series data. The model exhibits good zero-shot prediction performance on unseen, out-of-distribution time-series datasets, outperforming supervised baselines. We use smoothly broken power laws to fit and predict the model's scaling behavior.
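
For reference, a smoothly broken power law, in the sense common in the neural scaling laws literature (e.g., the broken neural scaling laws of Caballero et al.), can be parameterized as follows; this is a sketch of one standard form, not necessarily the exact one used in the paper:

y(x) = a + b x^{-c_0} \prod_{i=1}^{n} \left( 1 + \left( \frac{x}{d_i} \right)^{1/f_i} \right)^{-c_i f_i}

Here x is the scaled quantity (e.g., parameter count or dataset size), y is the predicted metric (e.g., validation loss), n is the number of breaks, each d_i locates a break, f_i controls how sharp that break is, and c_0 together with the c_i set the power-law slopes between breaks.

A minimal fitting sketch with a single break, using scipy.optimize.curve_fit on synthetic data; all parameter names and values below are hypothetical, chosen only to make the example self-contained:

import numpy as np
from scipy.optimize import curve_fit

def broken_power_law(x, a, b, c0, c1, log10_d1, f1):
    # Smoothly broken power law with a single break (n = 1).
    # The break location is parameterized as 10**log10_d1 so it stays positive.
    d1 = 10.0 ** log10_d1
    return a + b * x ** (-c0) * (1.0 + (x / d1) ** (1.0 / f1)) ** (-c1 * f1)

# Hypothetical scaling curve: model size (parameters) vs. held-out loss,
# generated from the law itself plus noise so the fit is well-posed.
rng = np.random.default_rng(0)
x = np.logspace(6, 9, 20)
y = broken_power_law(x, 1.8, 40.0, 0.15, 0.3, 7.7, 0.5)
y = y + rng.normal(0.0, 0.01, x.size)

p0 = (1.0, 10.0, 0.1, 0.1, 8.0, 1.0)  # rough initial guess
popt, _ = curve_fit(broken_power_law, x, y, p0=p0, maxfev=50000)
print(popt)  # fitted (a, b, c0, c1, log10_d1, f1)

Evaluating broken_power_law at larger x with the fitted parameters then extrapolates (predicts) the scaling curve, which is the role such fits play in scaling-behavior studies.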
