# EcomSalesTrendPredictor

## Overview
EcomSalesTrendPredictor is a Time Series Transformer model designed for multivariate forecasting of e-commerce sales performance metrics. Specifically, it is trained to predict future `Revenue_USD` and `UnitsSold` for multiple product SKUs across various regions (the two metrics are handled as separate series during training, or the model can focus on one primary series such as `Revenue_USD` with `input_size=1` for simplicity).
The model incorporates numerous real-world contextual features, including static (e.g., product category, region) and dynamic (e.g., promotion status, season, inventory level) variables, making it highly robust for complex supply chain and financial planning tasks.
## Model Architecture
This model utilizes the Time Series Transformer (TST) architecture, which is state-of-the-art for sequence modeling tasks due to its use of self-attention mechanisms.
- Model Type: `TimeSeriesTransformerModel` (from Hugging Face's `transformers`/PyTorch Forecasting implementation).
- Input: Multiple time series (each SKU/Region combination is a series) of historical data for the target metric (`Revenue_USD` or `UnitsSold`).
- Context/Prediction: Uses a `context_length` of 30 days of historical data to predict the next `prediction_length` of 7 days.
- Feature Integration:
  - Static Categorical: Region, ProductCategory, SKU (embedded).
  - Static Real: Log of average CustomerRating, Shipping Cost.
  - Dynamic Real: Inventory_Level, DaysSinceLastRestock, InventoryRiskScore.
  - Dynamic Categorical: PromotionApplied, Season.
- Output: The model outputs a set of quantiles (0.1, 0.5, 0.9) for the forecast, providing an uncertainty range rather than a single point estimate.
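As a sketch, the schema above could be expressed with Hugging Face's `TimeSeriesTransformerConfig`. The feature counts, cardinalities, and embedding sizes below are illustrative assumptions, not the model's actual values:

```python
from transformers import TimeSeriesTransformerConfig

# Hypothetical configuration mirroring the schema described above.
config = TimeSeriesTransformerConfig(
    prediction_length=7,                  # forecast horizon: 7 days
    context_length=30,                    # history window: 30 days
    input_size=1,                         # one target series, e.g. Revenue_USD
    num_static_categorical_features=3,    # Region, ProductCategory, SKU
    num_static_real_features=2,           # log CustomerRating, shipping cost
    num_dynamic_real_features=3,          # Inventory_Level, DaysSinceLastRestock, InventoryRiskScore
    cardinality=[4, 10, 500],             # assumed counts of regions/categories/SKUs
    embedding_dimension=[2, 4, 16],       # assumed embedding size per static categorical
)
print(config.prediction_length, config.context_length)
```

Note that a config built this way only describes the input schema; the trained checkpoint must have been created with matching values.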
## Intended Use
- Sales Forecasting: Predict weekly revenue and unit sales for better financial planning.
- Inventory Optimization: Use the 7-day forecast to trigger restocking orders, minimizing stockouts (high `InventoryRiskScore`) or excess inventory.
- Demand Planning: Analyze the impact of dynamic features (promotions, season) on future demand.
- Multi-Region Strategy: Compare and predict performance across different geographic regions simultaneously.
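For the inventory-optimization use case, the upper quantile of the forecast can drive a conservative restocking rule: reorder if current stock cannot cover the high-demand (0.9-quantile) scenario over the horizon. A minimal sketch, with a hypothetical helper and made-up forecast numbers:

```python
# Illustrative reorder rule driven by the model's 0.9-quantile unit-sales
# forecast. The function name, forecast values, and threshold logic are
# assumptions for demonstration, not part of the model's API.

def should_restock(inventory_level: int, q90_daily_units: list[float]) -> bool:
    """Reorder if current stock cannot cover the pessimistic (high-demand,
    0.9-quantile) cumulative forecast over the 7-day horizon."""
    return inventory_level < sum(q90_daily_units)

# Hypothetical 7-day unit-sales forecast for one SKU at the 0.9 quantile.
q90_forecast = [120, 115, 130, 140, 125, 110, 105]   # sums to 845 units
print(should_restock(300, q90_forecast))   # stock of 300 cannot cover 845 -> True
```

Using the 0.9 quantile rather than the median biases the rule toward avoiding stockouts at the cost of occasionally carrying excess inventory.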
## Limitations
- Data Density: Performance may degrade if the input time series contains large gaps or is highly irregular.
- External Shocks: Like all time-series models, it cannot predict sudden, unforeseen external events (e.g., pandemics, major news events) that significantly disrupt market patterns.
- Computational Cost: Transformer-based models are more computationally expensive than simpler models (like ARIMA) for both training and inference.
- SKU Limit: The model is implicitly limited by the SKU cardinality it was trained on; adding entirely new products requires retraining or fine-tuning.
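Because of the data-density limitation, it is worth checking each input series for calendar gaps before inference. A minimal sketch with pandas (the column names `date` and `target` are illustrative, not the model's schema):

```python
import pandas as pd

# Detect gaps in a daily series before feeding it to the model.
df = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-05"]),
    "target": [100.0, 110.0, 95.0],
})

full_range = pd.date_range(df["date"].min(), df["date"].max(), freq="D")
missing = full_range.difference(df["date"])
print(f"{len(missing)} missing days: {list(missing.strftime('%Y-%m-%d'))}")
# -> 2 missing days: ['2024-01-03', '2024-01-04']

# One common remedy: reindex to the full calendar and interpolate small gaps.
filled = df.set_index("date").reindex(full_range).interpolate(limit=2)
```

Interpolation is only reasonable for short gaps; long or irregular gaps are better handled by excluding the series or retraining with explicit missing-value handling.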
## Example Code
To use the model for forecasting (requires a compatible time series library like pytorch-forecasting):
```python
import pandas as pd
from transformers import AutoConfig, AutoModel

# NOTE: Actual inference with the TST requires a full PyTorch Forecasting setup
# (e.g. pytorch_forecasting.TimeSeriesDataSet). This example illustrates the
# data-preparation steps only.

model_name = "your-username/EcomSalesTrendPredictor"  # Replace with actual HuggingFace path
# model_config = AutoConfig.from_pretrained(model_name)
# model = AutoModel.from_pretrained(model_name)

# Example historical data for one series (truncated for simplicity)
data = {
    'time_idx': [1, 2, 3, 4, 5],
    'target': [34995.0, 3600.0, 937.5, 18750.0, 2700.0],
    'series': ['EL-LAP-001'] * 5,
    'Region': ['North America'] * 5,
    'ProductCategory': ['Electronics'] * 5,
    'UnitsSold': [45, 180, 75, 15, 90],
    'Inventory_Level': [120, 500, 90, 40, 300],
    'PromotionApplied': [0, 1, 0, 0, 1]
}
df = pd.DataFrame(data)

# The loaded model expects a TimeSeriesDataSet object for inference, and the
# TST is highly dependent on the correct feature schema defined in its config.
# print(f"Model configured for a prediction length of {model_config.prediction_length} days.")
print("Inference requires pre-processing the data into a TimeSeriesDataSet format.")
```