This repository contains the training code for our paper "A Unified Shape-Aware Foundation Model for Time Series Classification" (AAAI-26).
Foundation models pre-trained on large-scale source datasets are reshaping the traditional training paradigm for time series classification. However, existing time series foundation models primarily focus on forecasting tasks and often overlook classification-specific challenges, such as modeling interpretable shapelets that capture class-discriminative temporal features. To bridge this gap, we propose UniShape, a unified shape-aware foundation model designed for time series classification. UniShape incorporates a shape-aware adapter that adaptively aggregates multiscale discriminative subsequences (shapes) into class tokens, effectively selecting the most relevant subsequence scales to enhance model interpretability. Meanwhile, a prototype-based pretraining module is introduced to jointly learn instance- and shape-level representations, enabling the capture of transferable shape patterns. Pre-trained on a large-scale multi-domain time series dataset comprising 1.89 million samples, UniShape exhibits superior generalization across diverse target domains. Experiments on 128 UCR datasets and 30 additional time series datasets demonstrate that UniShape achieves state-of-the-art classification performance, with interpretability and ablation analyses further validating its effectiveness.
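To give a rough feel for the shape-aware idea described above — and only as an illustration, not the paper's actual implementation — the following numpy sketch extracts sliding-window subsequences at several scales and softmax-pools their embeddings into a single "class token". All names (`sliding_windows`, `shape_aware_token`, the random linear encoder) are hypothetical stand-ins for the learned components in the real model:

```python
import numpy as np

def sliding_windows(x, w):
    # all length-w subsequences of a 1-D series x
    return np.stack([x[i:i + w] for i in range(len(x) - w + 1)])

def shape_aware_token(x, scales=(8, 16, 32), d=16, rng=None):
    """Aggregate multiscale subsequences into one d-dim 'class token'.

    Hypothetical sketch: each window is embedded by a random linear map
    (a stand-in for a learned encoder), scored against a query vector,
    and softmax-pooled jointly across all scales, so the most relevant
    subsequence scales dominate the resulting token.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    query = rng.standard_normal(d)
    embs = []
    for w in scales:
        W = rng.standard_normal((w, d)) / np.sqrt(w)  # stand-in encoder
        embs.append(sliding_windows(x, w) @ W)
    E = np.concatenate(embs)                  # (num_windows_total, d)
    scores = E @ query / np.sqrt(d)           # per-window relevance
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()
    return attn @ E                           # weighted sum -> class token

x = np.sin(np.linspace(0, 6 * np.pi, 128))
token = shape_aware_token(x)
print(token.shape)  # (16,)
```

In the actual model the encoder and query are learned end-to-end; the attention weights over windows are what make the selected scales inspectable for interpretability.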
The large-scale pretraining dataset is constructed from the UCR archive, UEA archive, and eight domain-diverse datasets (TFC-8).
Download Links:
- UCR Archive (128 datasets): https://www.cs.ucr.edu/~eamonn/time_series_data_2018/
- UEA Archive (30 datasets): http://www.timeseriesclassification.com/dataset.php
- TFC-8: Google Drive
You can generate the merged 1.89M-sample pretraining dataset using
datapre/pretrain_dataset_pre.py, or download it directly:
- Pretraining Dataset (1.89M samples): Google Drive
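If you want to build a merged corpus yourself, the repository's `datapre/pretrain_dataset_pre.py` is the reference; as a minimal sketch of the general idea (hypothetical code, assuming each archive yields variable-length univariate series), one can resample every series to a common length and z-normalize before stacking:

```python
import numpy as np

def znorm(x):
    # per-series z-normalization, guarding against constant series
    s = x.std()
    return (x - x.mean()) / (s if s > 1e-8 else 1.0)

def resample(x, length):
    # linear interpolation to a common length across source archives
    old = np.linspace(0.0, 1.0, num=len(x))
    new = np.linspace(0.0, 1.0, num=length)
    return np.interp(new, old, x)

def merge_archives(archives, length=512):
    """Stack series from multiple archives into one pretraining array."""
    rows = [znorm(resample(np.asarray(x, dtype=float), length))
            for series_list in archives for x in series_list]
    return np.stack(rows)

# toy stand-ins for UCR / UEA / TFC-8 series of mismatched lengths
ucr = [np.sin(np.linspace(0, 4, 100)), np.cos(np.linspace(0, 4, 300))]
tfc = [np.random.default_rng(0).standard_normal(178)]
data = merge_archives([ucr, tfc], length=512)
print(data.shape)  # (3, 512)
```

The common length, interpolation scheme, and normalization here are illustrative choices; the released script defines the exact preprocessing used for the 1.89M-sample dataset.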
For evaluation, UniShape is tested on 128 UCR datasets and 30 additional TSC datasets, following “Bake Off Redux: A Review and Experimental Evaluation of Recent Time Series Classification Algorithms.”
Download Links:
- UCR Archive (128 datasets): https://www.cs.ucr.edu/~eamonn/time_series_data_2018/
- 30 Additional Datasets: Google Drive
Please refer to page 13 of the PDF document for the password to access the zipped file of the UCR archive (128 datasets).
To pre-train UniShape on the large-scale dataset:
```
python unishape_pretrain.py --label_ratio 0.10 --your_args_here
```

See unishape_pretrain.py for all available arguments and detailed usage.
To fine-tune a pre-trained UniShape model for downstream classification:
```
python unishape_finetune.py --dataset CBF --your_args_here
```

Refer to unishape_finetune.py for fine-tuning options and configuration examples.
To perform zero-shot classification using a pre-trained UniShape model:
```
python unishape_zeroshot.py --dataset Tools --your_args_here
```

For full options, see unishape_zeroshot.py.
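Conceptually, zero-shot classification with a frozen pre-trained encoder is often done by nearest-prototype matching: embed the target dataset's labeled references once, average per class, and assign each test sample to the most similar prototype, with no gradient updates. The sketch below (hypothetical; the frozen encoder is mocked by a fixed linear map plus tanh) shows this pattern, not UniShape's exact procedure:

```python
import numpy as np

def embed(x, W):
    # stand-in for a frozen pre-trained encoder: linear map + tanh
    return np.tanh(x @ W)

def zero_shot_predict(ref_x, ref_y, test_x, W):
    """Nearest-prototype classification with a frozen encoder (no tuning)."""
    emb_ref = embed(ref_x, W)
    classes = np.unique(ref_y)
    protos = np.stack([emb_ref[ref_y == c].mean(axis=0) for c in classes])
    emb_te = embed(test_x, W)
    # cosine similarity of each test embedding to each class prototype
    protos /= np.linalg.norm(protos, axis=1, keepdims=True)
    emb_te /= np.linalg.norm(emb_te, axis=1, keepdims=True)
    return classes[(emb_te @ protos.T).argmax(axis=1)]

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 16))
ref_x = rng.standard_normal((20, 64)) + np.repeat([[0.0], [2.0]], 10, axis=0)
ref_y = np.repeat([0, 1], 10)
pred = zero_shot_predict(ref_x, ref_y, ref_x, W)
print(pred.shape)  # (20,)
```

The encoder weights would come from pretraining rather than random initialization; see unishape_zeroshot.py for the actual evaluation protocol.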
This codebase is inspired by the following repositories:
If you find this work useful, please cite our paper:
@inproceedings{liu2026unishape,
title={A Unified Shape-Aware Foundation Model for Time Series Classification},
author={Liu, Zhen and Wang, Yucheng and Li, Boyuan and Zheng, Junhao and Eldele, Emadeldeen and Wu, Min and Ma, Qianli},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
year={2026}
}