Poster

Time-FFM: Towards LM-Empowered Federated Foundation Model for Time Series Forecasting

Qingxiang Liu · Xu Liu · Chenghao Liu · Qingsong Wen · Yuxuan Liang

East Exhibit Hall A-C #4205
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Unlike natural language processing and computer vision, the development of Foundation Models (FMs) for time series forecasting is hampered by data scarcity. While recent efforts focus on building such FMs by unlocking the potential of language models (LMs) for time series analysis, dedicated parameters must be trained for each downstream forecasting task, which hinders the sharing of common knowledge across domains. Moreover, data owners may hesitate to share access to local data due to privacy concerns and copyright protection, which makes it impossible to simply construct an FM on cross-domain training instances. To address these issues, we propose Time-FFM, a Federated Foundation Model for Time series forecasting that leverages pretrained LMs. Specifically, we begin by transforming time series into the modality of text tokens. To bootstrap LMs for time series reasoning, we propose a prompt adaption module that determines domain-customized prompts dynamically rather than artificially. Given the data heterogeneity across domains, we design a personalized federated training strategy that learns global encoders and local prediction heads. Our comprehensive experiments indicate that Time-FFM outperforms state-of-the-art methods and promises to be an effective few-shot and zero-shot forecaster. The code is available at https://github.com/CityMind-Lab/NeurIPS24-Time-FFM/tree/main.
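To make the personalized federated strategy in the abstract concrete, here is a minimal PyTorch sketch of one communication round in which only the shared encoder weights are averaged across domains (FedAvg-style) while each domain keeps its own prediction head. The `ClientModel` structure, layer sizes, and `fedavg_encoders` helper are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

```python
# Hypothetical sketch: global encoder aggregation with personalized heads.
# Assumes PyTorch; all names and dimensions here are illustrative.
import copy
import torch
import torch.nn as nn

class ClientModel(nn.Module):
    def __init__(self, patch_len=16, d_model=64, horizon=24):
        super().__init__()
        # Shared part: aligns time-series patches with the LM token space
        # (stand-in for the patch embedding feeding a frozen pretrained LM).
        self.encoder = nn.Linear(patch_len, d_model)
        # Local part: a per-domain forecasting head that is never aggregated.
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x):
        return self.head(self.encoder(x))

def fedavg_encoders(clients):
    """Average only the shared encoder weights; heads stay personalized."""
    global_state = copy.deepcopy(clients[0].encoder.state_dict())
    for key in global_state:
        global_state[key] = torch.stack(
            [c.encoder.state_dict()[key].float() for c in clients]
        ).mean(dim=0)
    for c in clients:
        c.encoder.load_state_dict(global_state)

# One communication round (local training loops omitted for brevity):
clients = [ClientModel() for _ in range(3)]  # e.g., traffic, energy, weather
fedavg_encoders(clients)
```

Under this split, the averaged encoder accumulates cross-domain common knowledge, while the local heads absorb the data heterogeneity that the abstract highlights.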
