Volume 18, No. 2
A Memory Guided Transformer for Time Series Forecasting
Abstract
Accurate long-term forecasting from multivariate time series has important real-world applications. However, achieving this is challenging. Analyses reveal that time series that span long durations often exhibit dynamic and disrupted correlations. State-of-the-art methods employ attention mechanisms to capture dynamic correlations, but they often do not contend well with disrupted correlations, which reduces prediction accuracy. We introduce local and global information concepts and then leverage these in a Memory Guided Transformer, called the Memformer. By integrating patch-wise recurrent graph learning and global attention, the Memformer aims to capture dynamic correlations and take disrupted correlations into account. We also integrate a so-called Alternating Memory Enhancer into the Memformer to capture correlations between local and global information. We report on experiments that offer insight into the effectiveness of the Memformer at capturing dynamic correlations and its robustness to disrupted correlations. The experiments offer evidence that the new method is capable of advancing the state-of-the-art in forecasting accuracy on real-world datasets.
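To make the two-branch design the abstract describes more concrete, the following is a minimal PyTorch sketch of that general idea: a local branch that applies learned-graph message passing and a recurrence over patches, a global attention branch over all patch tokens, and a simple gated fusion standing in for the Alternating Memory Enhancer. All names and choices here (MemformerSketch, PatchGraphRecurrentBlock, the gating layer, the dimensions) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class PatchGraphRecurrentBlock(nn.Module):
    """Local branch (assumed form): per-patch graph message passing over the
    series, using an adjacency learned from node embeddings, followed by a
    GRU recurrence along the patch dimension."""

    def __init__(self, n_series: int, d_model: int):
        super().__init__()
        self.node_emb = nn.Parameter(torch.randn(n_series, d_model))
        self.gru = nn.GRU(d_model, d_model, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_patches, n_series, d_model)
        b, p, n, d = x.shape
        # Learned, row-normalized adjacency over the series.
        adj = torch.softmax(self.node_emb @ self.node_emb.T, dim=-1)  # (n, n)
        x = torch.einsum("ij,bpjd->bpid", adj, x)  # graph message passing
        # Recurrence over patches, one GRU pass per series.
        x = x.permute(0, 2, 1, 3).reshape(b * n, p, d)
        x, _ = self.gru(x)
        return x.reshape(b, n, p, d).permute(0, 2, 1, 3)


class MemformerSketch(nn.Module):
    """Toy forecaster: local graph-recurrent branch plus global attention
    branch, fused by a gate that stands in for the memory enhancer."""

    def __init__(self, n_series: int, patch_len: int, n_patches: int,
                 d_model: int = 64, horizon: int = 24):
        super().__init__()
        self.patch_len, self.n_patches = patch_len, n_patches
        self.embed = nn.Linear(patch_len, d_model)
        self.local = PatchGraphRecurrentBlock(n_series, d_model)
        self.global_attn = nn.MultiheadAttention(d_model, num_heads=4,
                                                 batch_first=True)
        self.gate = nn.Linear(2 * d_model, d_model)
        self.head = nn.Linear(n_patches * d_model, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_series, lookback), lookback = n_patches * patch_len
        b, n, _ = x.shape
        patches = x.reshape(b, n, self.n_patches, self.patch_len)
        h = self.embed(patches).permute(0, 2, 1, 3)  # (b, p, n, d)
        local = self.local(h)
        # Global branch: attention across all patch tokens of all series.
        tokens = h.reshape(b, self.n_patches * n, -1)
        glob, _ = self.global_attn(tokens, tokens, tokens)
        glob = glob.reshape(b, self.n_patches, n, -1)
        # Gated fusion of the local and global views.
        g = torch.sigmoid(self.gate(torch.cat([local, glob], dim=-1)))
        fused = g * local + (1 - g) * glob
        out = fused.permute(0, 2, 1, 3).reshape(b, n, -1)
        return self.head(out)  # (batch, n_series, horizon)


if __name__ == "__main__":
    model = MemformerSketch(n_series=7, patch_len=16, n_patches=6)
    x = torch.randn(2, 7, 96)  # 96-step lookback window
    print(model(x).shape)      # torch.Size([2, 7, 24])
```

The intuition behind the gated fusion in this sketch is that the local branch can track correlations that shift within short patches, while the global branch retains context across the full lookback, so a learned gate can lean on whichever view remains reliable when correlations are disrupted.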