This repo provides official code, datasets, and checkpoints for Timer: Generative Pre-trained Transformers Are Large Time Series Models. [Poster], [Slides].
🚩 News (2025.5) Sundial, a family of generative time series foundation models, has been accepted as an ICML 2025 Spotlight (top 2.6%). Get your first zero-shot predictions in one second! [GitHub], [HuggingFace].
🚩 News (2025.2) We release OpenLTM, an open codebase with a simple pipeline for pre-training customized large time-series models :)
🚩 News (2024.12) Timer-XL for unified forecasting is accepted by ICLR 2025. We release a model pre-trained on 260B time points [Performance] [Checkpoint] [Quickstart].
🚩 News (2024.10) We release the pre-training dataset UTSD on HuggingFace; alternatively, you can use the numpy-format UTSD with this dataloader.
🚩 News (2024.5) Timer is accepted by ICML 2024; a 31-page camera-ready version is available.
🚩 News (2024.2) We release model checkpoints and code for fine-tuning.
Time Series Transformer (Timer) is a Generative Pre-trained Transformer for general time series analysis.
We provide a checkpoint that can make predictions without any training samples. See our HuggingFace Repo for more information.
Example
import torch
from transformers import AutoModelForCausalLM
# load the pre-trained model
model = AutoModelForCausalLM.from_pretrained('thuml/timer-base-84m', trust_remote_code=True)
# prepare the input (replace the random tensor with your own lookback series)
batch_size, lookback_length = 1, 2880
seqs = torch.randn(batch_size, lookback_length)
# generate the forecast
prediction_length = 96
output = model.generate(seqs, max_new_tokens=prediction_length)
print(output.shape)  # expected: (batch_size, prediction_length)
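The released checkpoint forecasts univariate sequences. For a multivariate series, one straightforward workaround (our own suggestion rather than a dedicated API) is to treat each variable as a separate batch element:

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained('thuml/timer-base-84m', trust_remote_code=True)

# a multivariate series with 7 variables and a lookback window of 2880 points
multivariate = torch.randn(7, 2880)

# forecast each variable independently by treating the variables as the batch dimension
forecast = model.generate(multivariate, max_new_tokens=96)
print(forecast.shape)  # expected: (7, 96)
```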
For developers interested in fine-tuning large time-series models or pre-training on customized datasets, please refer to OpenLTM, which includes implementations and checkpoints of large time-series models.
For developers interested in applying large time-series models on other time series analysis tasks (e.g., imputation and anomaly detection), we provide example scripts here.
We collect Unified Time Series Datasets (UTSD), which encompass well-curated time series to facilitate research on large time-series models. The dataset is released on HuggingFace.
You can access the data from HuggingFace and load the data in the style of TSLib:
# log in to HuggingFace if needed
# huggingface-cli login
# optionally switch to a mirror endpoint
# export HF_ENDPOINT=https://hf-mirror.com
# download the dataset
python ./scripts/UTSD/download_dataset.py
# load the dataset (TSLib-style dataloader)
python ./scripts/UTSD/utsdataset.py
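Alternatively, the data can be loaded with the HuggingFace `datasets` library. A minimal sketch, assuming the dataset is hosted as `thuml/UTSD` (the subset name below is illustrative; check the HuggingFace page for the available subsets):

```python
from datasets import load_dataset

# stream one subset of UTSD without downloading everything
# (the subset name "UTSD-1G" is an assumption; see the HuggingFace page)
dataset = load_dataset("thuml/UTSD", "UTSD-1G", split="train", streaming=True)
for example in dataset:
    print(example.keys())  # inspect the available fields
    break
```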
If you have trouble accessing the data, you can also download UTSD in numpy format from [Tsinghua Cloud] and use the UTSD_Npy dataloader from [OpenLTM].
To pre-train on heterogeneous time series, we propose the single-series sequence (S3) format, which preserves series variations in a unified 1D context. Further, we convert forecasting, imputation, and anomaly detection into a unified generative task.
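The exact S3 construction is described in the paper; as a rough illustration (our own sketch, not the official preprocessing), heterogeneous multivariate series can be flattened into normalized univariate contexts like this:

```python
import torch

def to_single_series_sequences(multivariate, context_length=672):
    """Illustrative sketch (not the official S3 pipeline): split a multivariate series
    of shape (num_variables, length) into normalized univariate windows."""
    windows = []
    for series in multivariate:  # treat every variable as an independent 1D series
        for start in range(0, series.shape[0] - context_length + 1, context_length):
            window = series[start:start + context_length]
            window = (window - window.mean()) / (window.std() + 1e-8)  # per-window normalization
            windows.append(window)
    return torch.stack(windows)  # (num_windows, context_length)

# toy usage: 7 variables with 2016 points each -> 21 univariate contexts of length 672
print(to_single_series_sequences(torch.randn(7, 2016)).shape)
```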
We evaluate various candidate backbones and eventually adopt the decoder-only Transformer, which provides notable generalization performance and the flexibility to accommodate time series of varying lengths.
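For intuition, here is a minimal, self-contained sketch of a decoder-only Transformer over patch tokens (our own illustration with made-up hyperparameters; see the paper and OpenLTM for the actual architecture). It embeds patches of the input series and predicts the next patch at every position under a causal mask:

```python
import torch
import torch.nn as nn

class TinyDecoderOnlyTimer(nn.Module):
    """Conceptual sketch of a decoder-only backbone over patch tokens
    (hyperparameters are illustrative, not those of the released Timer)."""
    def __init__(self, patch_len=96, d_model=256, n_heads=8, n_layers=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)   # patch tokenization
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=4 * d_model,
                                           batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)    # next-patch prediction

    def forward(self, x):                                      # x: (batch, length)
        tokens = x.unfold(-1, self.patch_len, self.patch_len)  # (batch, num_patches, patch_len)
        h = self.embed(tokens)
        mask = nn.Transformer.generate_square_subsequent_mask(h.size(1))  # causal attention
        h = self.blocks(h, mask=mask)
        return self.head(h)  # one next-patch forecast per token position

model = TinyDecoderOnlyTimer()
print(model(torch.randn(2, 2880)).shape)  # (2, 30, 96)
```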
Timer achieves state-of-the-art performance in zero-shot forecasting and few-shot adaptation.
With scaling, Timer achieves notable performance improvements. Currently, we provide the base version with 84M parameters, pre-trained on 260B time points, which supports a maximum context length of 2880.
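In practice, this means the lookback series passed to `model.generate` should be no longer than 2880 points; a minimal sketch of truncating a longer history to the supported context:

```python
import torch
from transformers import AutoModelForCausalLM

MAX_CONTEXT = 2880  # maximum context length of the released base checkpoint

model = AutoModelForCausalLM.from_pretrained('thuml/timer-base-84m', trust_remote_code=True)

history = torch.randn(1, 10000)       # a lookback series longer than the context limit
context = history[:, -MAX_CONTEXT:]   # keep only the most recent 2880 points
forecast = model.generate(context, max_new_tokens=96)
print(forecast.shape)
```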
We propose Timer-XL for unified time series forecasting. It can be used for task-specific training or scalable pre-training, handling time series of arbitrary length and any number of variables [Repo].
We propose Sundial, a family of generative time series foundation models pre-trained on a trillion (10^12) time points. The model supports both point and probabilistic forecasting and makes zero-shot predictions.
If you find this repo helpful, please cite our paper.
@inproceedings{liutimer,
  title={Timer: Generative Pre-trained Transformers Are Large Time Series Models},
  author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024}
}

@article{liu2024timer,
  title={Timer-XL: Long-Context Transformers for Unified Time Series Forecasting},
  author={Liu, Yong and Qin, Guo and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2410.04803},
  year={2024}
}
If you have any questions or want to use the code, feel free to contact:
- Yong Liu (liuyong21@mails.tsinghua.edu.cn)
- Guo Qin (qinguo24@mails.tsinghua.edu.cn)
- Haoran Zhang (zhang-hr24@mails.tsinghua.edu.cn)
- Chenyu Li (lichenyu20@mails.tsinghua.edu.cn)