Long time series forecasting is an important problem with applications in many fields, such as weather forecasting, stock prediction, petroleum production prediction, and heating load forecasting. In recent years, the most popular methods for long time series forecasting focus on extracting local information at a single scale with a Convolutional Neural Network (CNN). Moreover, these methods use a basic attention mechanism to select relevant information from previous time steps in order to generate better outputs. However, long time series contain rich information at different scales, and the basic attention mechanism is not well suited to directly predicting a future sequence. In this paper, we propose a long time series forecasting method, named MS-LSTM, that applies multi-scale feature extraction and a sequence-to-sequence (seq2seq) attention mechanism to the hidden states of a Long Short-Term Memory (LSTM) network. Concretely, MS-LSTM is inspired by the seq2seq attention mechanism and generates the output sequence by making full use of previous information. The multi-scale feature extraction uses CNNs with different convolution kernel sizes to extract short-term features at different scales. Comparison and ablation experiments on the exchange rate dataset show that the proposed method achieves significant improvements over several state-of-the-art methods. We also apply the proposed model to a time series dataset of industrial equipment indexes provided by Shanghai Zhoubang Information Technology Company, and achieve state-of-the-art results in all cases.
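
To make the described architecture concrete, the following is a minimal PyTorch sketch of the idea: parallel 1-D convolutions with different kernel sizes provide multi-scale short-term features, an LSTM encoder summarizes them, and a seq2seq decoder with attention over the encoder hidden states produces the multi-step forecast. The kernel sizes, hidden size, additive attention form, and decoding scheme here are illustrative assumptions, not the paper's exact configuration.

    # Minimal sketch of the MS-LSTM idea (assumed configuration, for illustration only).
    import torch
    import torch.nn as nn

    class MSLSTMSketch(nn.Module):
        def __init__(self, n_vars, hidden=64, kernel_sizes=(3, 5, 7), horizon=24):
            super().__init__()
            # Multi-scale feature extraction: parallel 1-D convolutions with different
            # kernel sizes capture short-term patterns at different scales.
            self.convs = nn.ModuleList(
                nn.Conv1d(n_vars, hidden, k, padding=k // 2) for k in kernel_sizes
            )
            # Encoder LSTM runs over the concatenated multi-scale features.
            self.encoder = nn.LSTM(hidden * len(kernel_sizes), hidden, batch_first=True)
            # Decoder LSTM cell generates the output sequence step by step (seq2seq).
            self.decoder = nn.LSTMCell(n_vars, hidden)
            # Additive attention over the encoder hidden states.
            self.attn = nn.Linear(hidden * 2, 1)
            self.out = nn.Linear(hidden * 2, n_vars)
            self.horizon = horizon

        def forward(self, x):
            # x: (batch, seq_len, n_vars)
            feats = torch.cat(
                [torch.relu(conv(x.transpose(1, 2))) for conv in self.convs], dim=1
            )
            enc_out, (h, c) = self.encoder(feats.transpose(1, 2))
            h, c = h.squeeze(0), c.squeeze(0)
            dec_in = x[:, -1, :]                      # last observed step seeds the decoder
            preds = []
            for _ in range(self.horizon):
                h, c = self.decoder(dec_in, (h, c))
                # Score each encoder state against the current decoder state.
                scores = self.attn(
                    torch.cat([enc_out, h.unsqueeze(1).expand_as(enc_out)], dim=-1)
                )
                weights = torch.softmax(scores, dim=1)        # (batch, seq_len, 1)
                context = (weights * enc_out).sum(dim=1)      # (batch, hidden)
                dec_in = self.out(torch.cat([h, context], dim=-1))
                preds.append(dec_in)
            return torch.stack(preds, dim=1)          # (batch, horizon, n_vars)

    # Usage: 96 past steps of 8 variables -> 24-step forecast.
    model = MSLSTMSketch(n_vars=8)
    y = model(torch.randn(4, 96, 8))                  # y.shape == (4, 24, 8)

The parallel convolutions are what make the front end multi-scale, while the attention step lets each decoded time step weight all encoder states rather than relying only on the final encoder state.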