Abstract
Long sequence time-series forecasting (LSTF) has been widely applied in various fields, such as electricity usage planning and long-term financial strategy. However, LSTF faces challenges in capturing two different types of information: the temporal dependencies of individual features and the interdependencies among multiple features in multivariate time series forecasting. Graph neural networks (GNNs) are commonly used to reveal the correlations among feature variables through graph structures in multivariate forecasting. However, in LSTF, the interdependencies among variables are often dynamic and evolving. Therefore, in this paper, we propose a Multi-scale Attention and Evolutionary Graph Structure (MAGNet) framework to address these challenges. To capture the dynamic changes in interdependencies among variables, we design an evolutionary graph learning layer that constructs an adjacency matrix for each time step and uses gated recurrent units to model the changing correlations, thus learning dynamic feature graph structures. We also utilize graph convolutional modules to capture the dependencies in the learned feature graph structure. Furthermore, to jointly capture the temporal dependencies of individual features and the interdependencies among multiple features, we propose a multi-scale temporal capturing module that incorporates channel attention and spatial attention. Finally, we compare and analyze our proposed method against several high-performance models on six real-world datasets. Experimental results demonstrate the effectiveness of the proposed method. Code is available at this repository: https://github.com/Masterleia/MAGNet.
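The core idea of the evolutionary graph learning layer, building a fresh adjacency matrix at each time step from node embeddings that a GRU evolves over time, can be sketched roughly as follows. This is a minimal NumPy illustration of the general technique, not the authors' implementation: the simplified GRU, the embedding-product adjacency, and all names and dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
N, D, T = 4, 8, 5                          # feature nodes, embedding dim, time steps
E = rng.standard_normal((N, D))            # initial node embeddings
Wz, Wr, Wh = (rng.standard_normal((2 * D, D)) * 0.1 for _ in range(3))

def gru_step(E, X):
    """One simplified GRU update of node embeddings E given per-step input X (both N x D)."""
    cat = np.concatenate([X, E], axis=1)
    z = 1 / (1 + np.exp(-cat @ Wz))                          # update gate
    r = 1 / (1 + np.exp(-cat @ Wr))                          # reset gate
    h = np.tanh(np.concatenate([X, r * E], axis=1) @ Wh)     # candidate state
    return (1 - z) * E + z * h

adjs = []
for t in range(T):
    X_t = rng.standard_normal((N, D))      # stand-in for per-step node features
    E = gru_step(E, X_t)                   # embeddings evolve across time steps
    A_t = softmax(np.maximum(E @ E.T, 0))  # row-normalized adjacency at step t
    adjs.append(A_t)

# one graph-convolution step with the last learned adjacency: H = A X W
W = rng.standard_normal((D, D)) * 0.1
H = adjs[-1] @ X_t @ W
print(H.shape)  # (4, 8)
```

Each `A_t` is a dense, row-stochastic graph over the feature variables, so the interdependencies the graph convolution aggregates can differ from step to step, which is the property the paper's evolutionary layer is designed to model.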
Supported by the National Natural Science Foundation of China (Grant Nos. 62202395, 62176221), the Natural Science Foundation of Sichuan Province (Grant No. 2022NSFSC0930), the Fundamental Research Funds for the Central Universities (Grant Nos. 2682022CX067, 2682023CF012), the Science and Technology Project of Sichuan Province (No. 2022JDRC0067), and the Natural Science Foundation of Hebei Province (Grant No. F2022105033).
Notes
- 1.
- 2.
- 3.
- 4. Traffic: http://pems.dot.ca.gov.
- 5. Weather: https://www.bgc-jena.mpg.de/wetter/.
- 6.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Chen, Z., Zhang, F., Li, T., Li, C. (2023). MAGNet: Multi-scale Attention and Evolutionary Graph Structure for Long Sequence Time-Series Forecasting. In: Iliadis, L., Papaleonidas, A., Angelov, P., Jayne, C. (eds) Artificial Neural Networks and Machine Learning – ICANN 2023. ICANN 2023. Lecture Notes in Computer Science, vol 14259. Springer, Cham. https://doi.org/10.1007/978-3-031-44223-0_18
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-44222-3
Online ISBN: 978-3-031-44223-0
eBook Packages: Computer Science (R0)