Eghbal committed on
Commit 38ec1a7 · verified · 1 Parent(s): 07f3442

Update README.md

Files changed (1)
  1. README.md +4 -8
README.md CHANGED
@@ -14,7 +14,7 @@ pinned: false
 
 <div style="padding: 12px; border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
 
- ## 🚀 Stage 1 Release
+ ## 🚀 TSFMs Release
 
 We are pleased to introduce **FinText-TSFM**, a comprehensive suite of **time series foundation models (TSFMs)** developed for financial forecasting and quantitative research.
 This release accompanies the paper
@@ -35,7 +35,6 @@ by *Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025)*.
 This release includes variants of **Chronos** and **TimesFM** architectures adapted for financial time series:
 - Chronos-Tiny / Mini / Small
 - TimesFM-8M / 20M
- - Parameter counts range from **8M to 200M+**.
 
 - **Performance Insights:**
 Our findings show that **off-the-shelf TSFMs** underperform in zero-shot forecasting, while **finance-pretrained models** achieve large gains in both predictive accuracy and portfolio Sharpe ratios.
@@ -43,8 +42,6 @@ by *Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025)*.
 - **Evaluation Scope:**
 Models are benchmarked across **U.S. and international equities**, using rolling windows (5, 21, 252, 512 days) and **18M+ out-of-sample forecasts**.
 
- - **Open Science Commitment:**
- All released models are available in **FP32** format for full transparency and reproducibility.
 
 ---
 
@@ -52,9 +49,7 @@ by *Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025)*.
 
 - **Architecture:** Transformer-based TSFMs (Chronos & TimesFM)
 - **Training Regime:** Pre-training from scratch, fine-tuning, and zero-shot evaluation
- - **Objective:** Mean squared error (MSE) for continuous returns; cross-entropy for tokenized sequences
 - **Compute:** >50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters
- - **Data Sources:** CRSP, Compustat Global, JKP factors, and proprietary merged panels (1990–2023)
 
 ---
 
@@ -74,11 +69,12 @@ Please cite the accompanying paper if you use these models:
 
 This project was made possible through computational and institutional support from:
 - **Isambard-AI National AI Research Resource (AIRR)**
- - **The University of Manchester** (Research IT & Computational Shared Facility)
+ - **The University of Manchester** (Research IT & Computational Shared Facility)
+ - **Alliance Manchester Business School (AMBS), University of Manchester**
 - **N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)** – EPSRC Grant EP/T022167/1
 - **University College London** and **Shanghai University**
 - **The Alan Turing Institute**
- - **Alliance Manchester Business School (AMBS)**
+
 
 ---
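For readers who want to try the released checkpoints, here is a minimal zero-shot forecasting sketch using the open-source `chronos-forecasting` package, the standard loader for Chronos-family models. This is an illustration under stated assumptions: the repository ID `FinText/chronos-small` is a hypothetical placeholder (substitute the actual FinText-TSFM checkpoint name), and the release itself does not prescribe this interface.

```python
# Minimal zero-shot forecasting sketch for a Chronos-family checkpoint.
# Requires the open-source package: pip install chronos-forecasting
# NOTE: "FinText/chronos-small" is a hypothetical repo ID used for illustration;
# replace it with the actual FinText-TSFM checkpoint name.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "FinText/chronos-small",    # hypothetical checkpoint name
    device_map="cpu",
    torch_dtype=torch.float32,  # full-precision inference
)

# Toy context of daily returns; in practice this would be one of the
# 5/21/252/512-day rolling windows used in the evaluation.
context = torch.tensor([0.012, -0.004, 0.007, 0.001, -0.009, 0.003])

# Sample-based forecast with shape [num_series, num_samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=5)
point_forecast = forecast[0].median(dim=0).values  # median across samples
print(point_forecast)
```

Taking the median across sampled trajectories is one common way to reduce Chronos's probabilistic output to a point forecast.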
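The evaluation scope mentions rolling windows of 5, 21, 252, and 512 days and 18M+ out-of-sample forecasts. As a rough illustration of where such counts come from (this is not the paper's pipeline), the sketch below enumerates rolling one-step-ahead contexts over a synthetic return series: each window length L yields one forecast per date after the first L observations, per series.

```python
# Illustrative rolling-window bookkeeping (not the authors' evaluation code).
# Each window length L turns a series of length T into T - L one-step-ahead
# out-of-sample forecasts: context returns[t-L:t] predicts target returns[t].
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=2000)  # synthetic daily returns

for L in [5, 21, 252, 512]:  # window lengths from the evaluation scope
    contexts = np.lib.stride_tricks.sliding_window_view(returns[:-1], L)
    targets = returns[L:]
    assert len(contexts) == len(targets)
    print(f"window={L:4d}  forecasts={len(targets)}")
```

Multiplied across thousands of U.S. and international equities, per-series counts of this kind compound into forecast totals on the order reported in the release.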