PatchTSMixer added to HuggingFace Transformers for lightweight time-series forecasting
AI Impact Summary
PatchTSMixer is integrated into HuggingFace Transformers, enabling a lightweight, patch-based MLP-Mixer approach for multivariate time-series forecasting. It supports pretraining and transfer learning via masked pretraining and multiple attention variants, with reported 2-3x reductions in memory usage and runtime and noteworthy accuracy gains over comparable Patch-Transformer baselines on tasks such as Electricity forecasting and ETTh2 transfer. Deployment uses the PatchTSMixerConfig and PatchTSMixerForPrediction classes and requires installing the tsfm package alongside transformers.
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info