---
title: FinText-TSFM
emoji: 📈
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
---

[![SSRN](https://img.shields.io/badge/SSRN-5770562-1a5dab?logo=ssrn&logoColor=white)](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562)
[![arXiv](https://img.shields.io/badge/arXiv-2511.18578-b31b1b?logo=arxiv&logoColor=white)](https://www.arxiv.org/abs/2511.18578)
[![ResearchGate](https://img.shields.io/badge/ResearchGate-Paper-00CCBB?logo=researchgate&logoColor=white)](https://www.researchgate.net/publication/397872068_ReVisiting_Time_Series_Foundation_Models_in_Finance)
[![Website - FinText.ai](https://img.shields.io/badge/Website-FinText.ai-0A66C2?logo=google-chrome&logoColor=white)](https://fintext.ai)
[![GitHub - FinText.ai](https://img.shields.io/badge/GitHub-FinText.ai-181717?logo=github&logoColor=white)](https://github.com/DeepIntoStreams/TSFM_Finance)

## 🎤 Podcast

You can now listen to the accompanying podcast here:
https://soundcloud.com/eghbal-rahimikia/revisiting-time-series-foundation-models-in-finance

## 🆕 GitHub Model Loading Support

All models can now be loaded directly from GitHub. The repository includes loading utilities and setup instructions.

🔗 **https://github.com/DeepIntoStreams/TSFM_Finance**
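For a quick start, the sketch below shows one way to load a downloaded checkpoint and produce a zero-shot forecast. It is a minimal sketch, assuming the released checkpoints follow the standard Chronos format readable by the open-source `chronos-forecasting` package; the local path, checkpoint name, and dummy data are placeholders, so refer to the repository's own utilities and setup instructions for the supported workflow.

```python
# A minimal zero-shot sketch. Assumptions: a FinText-TSFM checkpoint has been
# downloaded from the GitHub repository, follows the standard Chronos format,
# and sits at the placeholder path below; dummy data stands in for real
# daily excess returns.
import numpy as np
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Load a locally downloaded checkpoint (path and naming are hypothetical).
pipeline = ChronosPipeline.from_pretrained(
    "./checkpoints/fintext-chronos-tiny-us-2015",
    device_map="cpu",
    torch_dtype=torch.float32,
)

# Condition on a 252-day window of daily excess returns (one of the paper's
# rolling-window lengths) and forecast the next day.
rng = np.random.default_rng(0)
context = torch.tensor(rng.normal(0.0, 0.01, size=252), dtype=torch.float32)

samples = pipeline.predict(context, prediction_length=1)  # [1, num_samples, 1]
point_forecast = float(np.median(samples[0].numpy()))     # median across samples
print(f"Next-day excess return forecast: {point_forecast:+.5f}")
```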
## 🚀 TSFMs Release

We are pleased to introduce **FinText-TSFM**, a comprehensive suite of **time series foundation models (TSFMs)** with 613 models pre-trained for quantitative finance. This release accompanies the paper **[*Re(Visiting) Time Series Foundation Models in Finance*](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562)** by *Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025)*.

### 💡 Key Highlights

- **Finance-Native Pre-training:** Models are pre-trained **from scratch** on large-scale financial time series datasets, including daily excess returns across **89 markets** and **over 2 billion observations**, to ensure full temporal and domain alignment.
- **Bias-Free Design:** Pre-training strictly follows a **chronological expanding-window setup**, avoiding any **look-ahead bias** or **information leakage**. Each model variant includes 24 separately pre-trained models, one for each year from **2000** to **2023**, with pre-training data starting in 1990 (a minimal selection sketch appears after the acknowledgments).
- **Model Families:** This release includes variants of the **Chronos** and **TimesFM** architectures adapted for financial time series:
  - Chronos-Tiny (8M) / Mini (20M) / Small (46M)
  - TimesFM-8M / 20M
- **Model Collections:**
  - U.S.: covers **U.S.** market-wide excess returns from 2000 to 2023, with one pre-trained model per year.
  - Global: covers excess returns across **94 global markets** from 2000 to 2023, with one pre-trained model per year.
  - Augmented: extends the global data with **augmented factors** from 2000 to 2023, with one pre-trained model per year.
  - The remaining **253 pre-trained models** are available for download via the [**FinText.ai Portal**](https://fintext.ai). These include models pre-trained with varying **hyperparameter configurations** for extended experimentation and performance comparison.
- **Performance Insights:** Our findings show that **off-the-shelf TSFMs** underperform in zero-shot forecasting, while **finance-pretrained models** achieve large gains in both predictive accuracy and portfolio performance.
- **Evaluation Scope:** Models are benchmarked on the **U.S. and seven international markets**, using rolling windows of **5, 21, 252, and 512 days**, with over **18 million out-of-sample forecasts** spanning **22 years (2001–2023)** of daily excess returns, evaluated on both **statistical** and **economic** performance.

### 🧠 Technical Overview

- **Architecture:** Transformer-based TSFMs (Chronos & TimesFM)
- **Compute:** 50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters

### 📚 Citation

Please cite the accompanying paper if you use these models:

> **Re(Visiting) Time Series Foundation Models in Finance.**
> **Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan.**
> SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562

### 🔋 Acknowledgments

This project was made possible through computational and institutional support from:

- **UK Research and Innovation (UKRI)**
- **Isambard-AI National AI Research Resource (AIRR)**
- **Alliance Manchester Business School (AMBS), University of Manchester**
- **N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)**
- **The University of Manchester** (Research IT & Computational Shared Facility)
- **University College London (UCL)**
- **The Alan Turing Institute**
- **Shanghai University**
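To make the expanding-window design concrete, here is a minimal sketch of bias-free checkpoint selection: a forecast for a given date may only use a model whose pre-training window ends before that date. The year-offset convention, directory layout, and file naming below are assumptions for illustration; consult the paper and the GitHub repository for the exact setup.

```python
# A minimal sketch of bias-free checkpoint selection under the chronological
# expanding-window design. Assumed (hypothetical) convention: the year-Y model
# is pre-trained on data through year Y and used for year Y+1 forecasts.
from datetime import date

FIRST_YEAR, LAST_YEAR = 2000, 2023  # one checkpoint per pre-training year


def checkpoint_for(forecast_date: date) -> str:
    """Return a placeholder path to the latest checkpoint whose pre-training
    window ends strictly before the forecast date (no look-ahead)."""
    model_year = min(max(forecast_date.year - 1, FIRST_YEAR), LAST_YEAR)
    return f"./checkpoints/fintext-chronos-tiny-us-{model_year}"  # hypothetical naming


print(checkpoint_for(date(2016, 3, 1)))  # ./checkpoints/fintext-chronos-tiny-us-2015
```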

---

**Developed by:**

- Alliance Manchester Business School, University of Manchester
- Department of Mathematics, University College London (UCL)

**Powered by:**

- Isambard-AI, Bristol Centre for Supercomputing (BriCS)
- The Bede Supercomputer (N8 CIR)