Rankformer: Leveraging Rank Correlation for Transformer-based Time Series Forecasting

Conference paper, 2023

Abstract

The long-term time series forecasting problem has been actively studied in recent years, and preceding Transformer-based models have exploited various self-attention mechanisms to discover long-range dependencies. However, the hidden dependencies required by the forecasting task are not always appropriately extracted, especially the nonlinear serial dependencies present in some datasets. In this paper, we propose Rankformer, a novel Transformer-based model that leverages the rank correlation function and a decomposition architecture for long-term time series forecasting. Extensive experiments on several datasets show that Rankformer outperforms four state-of-the-art Transformer-based models and two RNN-based models across different forecasting horizons.
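
The abstract names the rank correlation function as the key ingredient but does not describe how it enters the attention computation; those details are given in the paper itself. As a point of reference only, the sketch below (Python, using NumPy and SciPy) computes a standard rank correlation, Spearman's rho, between a series and a lagged copy of itself — the kind of rank-based measure that stays informative when the serial dependence is monotonic but nonlinear. The function name, the lag-based setup, and the toy series are illustrative assumptions, not the authors' implementation.

    # Illustrative only: a standard Spearman rank correlation between a series
    # and its lagged copy. This is not the Rankformer mechanism itself, just
    # the textbook rank correlation the abstract refers to.
    import numpy as np
    from scipy.stats import spearmanr


    def lagged_rank_correlation(series: np.ndarray, lag: int) -> float:
        """Spearman rank correlation between a series and its lag-shifted copy."""
        x, y = series[:-lag], series[lag:]
        rho, _pvalue = spearmanr(x, y)
        return float(rho)


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy series with a nonlinear but monotonic dependence on its past value:
        # the rank correlation picks this up regardless of the functional form.
        t = np.zeros(500)
        for i in range(1, 500):
            t[i] = 0.9 * np.cbrt(t[i - 1]) + 0.1 * rng.standard_normal()
        print(lagged_rank_correlation(t, lag=1))

Because Spearman's rho depends only on the ranks of the values, it is insensitive to the particular functional form of the dependence, which is the property the abstract contrasts with purely linear dependence measures.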

Dates and versions

hal-04110209, version 1 (30-05-2023)

Identifiers

  • HAL Id: hal-04110209, version 1

Cite

Zuokun Ouyang, Meryem Jabloun, Philippe Ravier. Rankformer: Leveraging Rank Correlation for Transformer-based Time Series Forecasting. IEEE Statistical Signal Processing Workshop, Jul 2023, Hanoi, Vietnam. ⟨hal-04110209⟩