Rankformer: Leveraging Rank Correlation for Transformer-based Time Series Forecasting
Abstract
The long-term time series forecasting problem has been actively studied in recent years, and previous Transformer-based models have exploited various self-attention mechanisms to discover long-range dependencies. However, the hidden dependencies required by the forecasting task are not always appropriately extracted, especially the nonlinear serial dependencies present in some datasets. In this paper, we propose Rankformer, a novel Transformer-based model that leverages a rank correlation function and a decomposition architecture for long-term time series forecasting. In extensive experiments across multiple datasets and forecasting horizons, Rankformer outperforms four state-of-the-art Transformer-based models and two RNN-based models.
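The abstract refers to a rank correlation function for capturing nonlinear serial dependencies but does not spell out its form; a standard instance of rank correlation is Spearman's coefficient, i.e., the Pearson correlation of the ranks of two series. The sketch below is only an illustrative example of that standard definition (the function name and the lagged-sub-series usage are assumptions, not the paper's exact attention mechanism).

```python
import numpy as np

def spearman_rank_correlation(x, y):
    """Spearman rank correlation between two equal-length 1-D series."""
    # Ordinal ranks via argsort-of-argsort (sufficient when values are distinct;
    # tied values would need average ranks, e.g. scipy.stats.rankdata).
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    # Pearson correlation computed on the ranks.
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))

# Hypothetical usage: rank correlation between a series and a lagged copy,
# which stays high under monotonic (possibly nonlinear) relationships.
t = np.arange(200)
series = np.sin(0.1 * t) + 0.05 * np.random.randn(200)
print(spearman_rank_correlation(series[:-5], series[5:]))
```

Because it depends only on the ordering of values, such a measure remains informative for monotonic but nonlinear dependencies where a plain linear (Pearson) correlation can understate the relationship.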