---
title: FinText-TSFM
emoji: π
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
---
# FinText-TSFM

**Time Series Foundation Models for Finance**
## TSFMs Release
We are pleased to introduce FinText-TSFM, a comprehensive suite of time series foundation models (TSFMs) developed for financial forecasting and quantitative research.
This release accompanies the paper
*Re(Visiting) Time Series Foundation Models in Finance*
by Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025).
## Key Highlights
- **Finance-Native Pre-training:** Models are pre-trained from scratch on large-scale financial time series datasets, including daily excess returns across 89 markets and over 2 billion observations, to ensure full temporal and domain alignment.
- **Bias-Free Design:** Training strictly follows a chronological expanding-window setup, avoiding any look-ahead bias or information leakage (see the sketch after this list).
- **Model Families:** This release includes variants of the Chronos and TimesFM architectures adapted for financial time series:
  - Chronos-Tiny / Mini / Small
  - TimesFM-8M / 20M
- **Performance Insights:** Our findings show that off-the-shelf TSFMs underperform in zero-shot forecasting, while finance-pretrained models achieve large gains in both predictive accuracy and portfolio Sharpe ratios.
- **Evaluation Scope:** Models are benchmarked across U.S. and international equities, using rolling windows of 5, 21, 252, and 512 days and more than 18 million out-of-sample forecasts.
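To make the chronological, leakage-free setup and the rolling context windows concrete, here is a minimal sketch in Python. The function names, cutoff handling, and the 21-day default window are illustrative assumptions, not the exact protocol used in the paper.

```python
import numpy as np
import pandas as pd

def expanding_window_splits(dates: pd.DatetimeIndex, first_cutoff: str, step_days: int):
    """Yield (train_idx, test_idx) pairs where each training set ends strictly
    before its test period, so no future information leaks into training."""
    cutoff = pd.Timestamp(first_cutoff)
    while cutoff < dates.max():
        train_idx = np.where(dates <= cutoff)[0]
        test_mask = (dates > cutoff) & (dates <= cutoff + pd.Timedelta(days=step_days))
        test_idx = np.where(test_mask)[0]
        if len(test_idx) > 0:
            yield train_idx, test_idx
        cutoff += pd.Timedelta(days=step_days)

def rolling_contexts(returns: np.ndarray, window: int = 21):
    """Build fixed-length context windows (e.g. 5/21/252/512 days); each target
    is the observation immediately following its window."""
    contexts = np.lib.stride_tricks.sliding_window_view(returns[:-1], window)
    targets = returns[window:]
    return contexts, targets
```

Each split trains only on data up to its cutoff and evaluates strictly afterwards, mirroring the expanding-window design described above.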
## Technical Overview
- **Architecture:** Transformer-based TSFMs (Chronos & TimesFM)
- **Training Regime:** Pre-training from scratch, fine-tuning, and zero-shot evaluation
- **Compute:** >50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters
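As a usage sketch, the snippet below shows how a Chronos-family checkpoint is typically queried for a zero-shot, one-step-ahead forecast via the open-source `chronos-forecasting` package. The repository ID is a placeholder, and compatibility with that package is an assumption based on the Chronos architecture; substitute the actual FinText-TSFM checkpoint name once located on the Hub.

```python
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Placeholder repository ID: replace with the actual FinText-TSFM checkpoint.
pipeline = ChronosPipeline.from_pretrained(
    "FinText/placeholder-chronos-small",
    device_map="cpu",
    torch_dtype=torch.float32,
)

# A 21-day context of daily excess returns (dummy values for illustration).
context = torch.randn(21)

# Sample-based probabilistic forecast for the next day:
# forecast has shape [num_series, num_samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=1)
point_forecast = forecast.median(dim=1).values  # [num_series, prediction_length]
```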
## Citation
Please cite the accompanying paper if you use these models:
> Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan.
> *Re(Visiting) Time Series Foundation Models in Finance.*
> University of Manchester, UCL, Shanghai University, November 2025.
> SSRN: https://ssrn.com/abstract=4963618
> DOI: 10.2139/ssrn.4963618
## Acknowledgments
This project was made possible through computational and institutional support from:
- Isambard-AI National AI Research Resource (AIRR)
- The University of Manchester (Research IT & Computational Shared Facility)
- Alliance Manchester Business School (AMBS), University of Manchester
- N8 Centre of Excellence in Computationally Intensive Research (N8 CIR), EPSRC Grant EP/T022167/1
- University College London and Shanghai University
- The Alan Turing Institute
**Developed by:**
Alliance Manchester Business School
**Update (November 2025):**
Public models for Stage 1 are available. Future stages will introduce larger-scale TSFMs, multivariate extensions, and diffusion-based financial forecasting models.