Paper Title
W-Transformers : A Wavelet-based Transformer Framework for Univariate Time Series Forecasting
Paper Authors
Paper Abstract
Deep learning with transformers has recently achieved considerable success in many vital areas, such as natural language processing, computer vision, anomaly detection, and recommendation systems, among others. Among the several merits of transformers, the ability to capture long-range temporal dependencies and interactions is desirable for time series forecasting and has driven progress in various time series applications. In this paper, we build a transformer model for non-stationary time series, a challenging yet crucially important problem. We present a novel framework for univariate time series representation learning based on a wavelet-based transformer encoder architecture and call it W-Transformer. The proposed W-Transformer applies a maximal overlap discrete wavelet transform (MODWT) to the time series data and builds local transformers on the decomposed series to capture the nonstationarity and long-range nonlinear dependencies in the time series. Evaluating our framework on several publicly available benchmark time series datasets from various domains and with diverse characteristics, we demonstrate that, on average, it performs significantly better than the baseline forecasters for both short-term and long-term forecasting, even on datasets that consist of only a few hundred training samples.
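To make the decompose-forecast-recombine idea concrete, below is a minimal sketch, not the authors' implementation. Everything specific in it is an assumption for illustration: a one-level Haar MODWT multiresolution analysis (with circular boundary handling) stands in for the deeper MODWT decomposition used in the paper, and a naive drift forecaster stands in for each per-component local transformer; only the overall pipeline shape follows the abstract.

```python
# Sketch of a W-Transformer-style pipeline: decompose the series with an
# (undecimated) wavelet transform, forecast each component separately, and
# sum the component forecasts. Hypothetical simplifications: one-level Haar
# MODWT MRA instead of a multi-level MODWT, and a drift forecast instead of
# a local transformer per component.
import numpy as np

def haar_modwt_mra(x: np.ndarray):
    """One-level Haar MODWT multiresolution analysis (circular boundary).
    The returned components are exactly additive: detail + smooth == x."""
    w = (x - np.roll(x, 1)) / 2.0        # level-1 wavelet coefficients
    v = (x + np.roll(x, 1)) / 2.0        # level-1 scaling coefficients
    detail = (w - np.roll(w, -1)) / 2.0  # synthesis with time-reversed filter
    smooth = (v + np.roll(v, -1)) / 2.0
    return detail, smooth

def forecast_component(c: np.ndarray, horizon: int) -> np.ndarray:
    """Placeholder for a per-component ('local') transformer forecaster.
    Here: a naive drift extrapolation, purely for illustration."""
    drift = (c[-1] - c[0]) / max(len(c) - 1, 1)
    return c[-1] + drift * np.arange(1, horizon + 1)

def w_forecast(series: np.ndarray, horizon: int) -> np.ndarray:
    # Forecast each wavelet component and sum; summation is meaningful
    # because the MRA components add back to the original series.
    detail, smooth = haar_modwt_mra(series)
    return forecast_component(detail, horizon) + forecast_component(smooth, horizon)

if __name__ == "__main__":
    t = np.arange(200, dtype=float)
    y = np.sin(2 * np.pi * t / 25) + 0.02 * t          # toy nonstationary series
    print(np.allclose(sum(haar_modwt_mra(y)), y))      # True: additivity check
    print(w_forecast(y, horizon=5))
```

The design choice the sketch preserves is that recombination is a simple sum: because the MODWT multiresolution analysis is additive, per-component forecasts can be aggregated without an explicit inverse transform step.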