Eghbal committed · verified
Commit a81f137 · 1 Parent(s): 042d787

Update README.md

Files changed (1): README.md (+2 -1)
README.md CHANGED
@@ -35,7 +35,8 @@ We are pleased to introduce **FinText-TSFM**, a comprehensive suite of **time se
  - U.S.: Covers **U.S.** market-wide excess returns from 2000 to 2023, with one pre-trained model per year.
  - Global: Covers excess returns across **89 global markets** from 2000 to 2023, with one pre-trained model for each year.
  - Augmented: Extends the global data with **augmented factors** from 2000 to 2023, with one pre-trained model for each year.
- - The remaining TSFMs are available for download via the [**FinText.ai Portal**](https://fintext.ai).
+ - The remaining **220 pre-trained models** are available for download via the [**FinText.ai Portal**](https://fintext.ai). These include models fine-tuned with varying **hyperparameter configurations** for extended experimentation and performance comparison.
+

  - **Performance Insights:**
  Our findings show that **off-the-shelf TSFMs** underperform in zero-shot forecasting, while **finance-pretrained models** achieve large gains in both predictive accuracy and portfolio performance.
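For readers who want to try one of the released checkpoints, the sketch below shows one way to fetch a year-specific model with `huggingface_hub.snapshot_download`. It is a minimal sketch under assumptions: the repository id `FinText/FinText-TSFM` and the `*2023*` file pattern are placeholders not named in this commit, and the additional 220 portal-only models are distributed through https://fintext.ai rather than the Hub.

```python
# Minimal sketch, assuming the released checkpoints are hosted as a Hugging Face repo.
# The repo_id and filename pattern below are placeholder assumptions, not confirmed
# names; the 220 extra models are served via the FinText.ai Portal instead.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="FinText/FinText-TSFM",   # hypothetical repo id (assumption)
    allow_patterns=["*2023*"],        # e.g. fetch only the 2023-vintage model files
)
print(f"Downloaded checkpoint files to: {local_dir}")
```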