MAGNet: Muti-scale Attention and Evolutionary Graph Structure for Long Sequence Time-Series Forecasting

  • Conference paper
  • In: Artificial Neural Networks and Machine Learning – ICANN 2023 (ICANN 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14259)

Abstract

Long sequence time-series forecasting (LSTF) has been widely applied in various fields, such as electricity usage planning and long-term financial strategic guidance. In multivariate forecasting, however, LSTF faces the challenge of capturing two different types of information: the temporal dependencies of individual features and the interdependencies among multiple features. Graph neural networks (GNNs) are commonly used to reveal the correlations among feature variables through graph structures in multivariate forecasting, but in LSTF the interdependencies among variables are often dynamic and evolving. Therefore, in this paper, we propose a Multi-scale Attention and Evolutionary Graph Structure (MAGNet) framework to address these challenges. To capture the dynamic changes in interdependencies among variables, we design an evolutionary graph learning layer that constructs an adjacency matrix for each time step and uses gated recurrent units to model the changing correlations, thereby learning dynamic feature graph structures. We also employ graph convolutional modules to capture the dependencies in the learned feature graph structures. Furthermore, to capture both types of information, namely the temporal dependencies of individual features and the interdependencies among multiple features, we propose a multi-scale temporal capturing module that incorporates channel attention and spatial attention. Finally, we compare and analyze our proposed method against several high-performance models on six real-world datasets. Experimental results demonstrate the effectiveness of the proposed method. Code is available at this repository: https://github.com/Masterleia/MAGNet.
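The evolutionary graph learning idea described in the abstract can be illustrated with a minimal PyTorch sketch: learnable node embeddings are evolved step by step with a GRU cell, each step's hidden states define that step's adjacency matrix, and a simple graph convolution consumes the resulting per-step graph. This is only an illustrative reading of the abstract, not the authors' implementation; all names (EvolvingGraphLearner, GraphConv, node_emb) and hyperparameters are assumptions, and details such as normalization follow common GNN practice rather than the paper.

# Minimal PyTorch sketch (assumption, not the authors' code) of an evolving
# graph-learning layer in the spirit of MAGNet's description.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvolvingGraphLearner(nn.Module):
    def __init__(self, num_nodes: int, emb_dim: int):
        super().__init__()
        # Learnable static node embeddings that seed the per-step evolution.
        self.node_emb = nn.Parameter(torch.randn(num_nodes, emb_dim))
        # GRU cell that updates every node's state from the current input step.
        self.gru = nn.GRUCell(input_size=emb_dim, hidden_size=emb_dim)
        # Projects each node's scalar observation at a step into embedding space.
        self.input_proj = nn.Linear(1, emb_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, num_nodes)
        # returns: (batch, seq_len, num_nodes, num_nodes), one adjacency per step
        b, t, n = x.shape
        h = self.node_emb.unsqueeze(0).expand(b, -1, -1).reshape(b * n, -1)
        adjs = []
        for step in range(t):
            inp = self.input_proj(x[:, step, :].reshape(b * n, 1))
            h = self.gru(inp, h)                      # evolve node states
            e = h.reshape(b, n, -1)
            # Pairwise similarity -> non-negative, row-normalized adjacency.
            scores = F.relu(torch.bmm(e, e.transpose(1, 2)))
            adjs.append(F.softmax(scores, dim=-1))
        return torch.stack(adjs, dim=1)

class GraphConv(nn.Module):
    # One-hop graph convolution applied with a (possibly per-step) adjacency.
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (batch, num_nodes, in_dim); adj: (batch, num_nodes, num_nodes)
        return F.relu(self.lin(torch.bmm(adj, h)))

if __name__ == "__main__":
    x = torch.randn(8, 24, 7)                 # e.g. 7 variables, 24 time steps
    learner = EvolvingGraphLearner(num_nodes=7, emb_dim=16)
    adjs = learner(x)                         # one learned graph per time step
    gc = GraphConv(in_dim=1, out_dim=16)
    out = gc(x[:, 0, :].unsqueeze(-1), adjs[:, 0])
    print(adjs.shape, out.shape)              # (8, 24, 7, 7) and (8, 7, 16)

The channel- and spatial-attention components of the multi-scale temporal capturing module are omitted from this sketch; the repository linked in the abstract contains the actual implementation.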

Supported by the National Natural Science Foundation of China (Grant Nos. 62202395, 62176221), Natural Science Foundation of Sichuan Province (Grant No. 2022NSFSC0930), Fundamental Research Funds for the Central Universities (Grant No. 2682022CX067, 2682023CF012), Science and Technology Project of Sichuan Province (No. 2022JDRC0067), and Natural Science Foundation of Hebei Province (Grant No. F2022105033).


Notes

  1. ETTh1: https://github.com/zhouhaoyi/ETDataset.
  2. Electricity: https://archive.ics.uci.edu/ml/datasets/ElectricityLoadDiagrams20112014.
  3. Exchange: https://drive.google.com/drive/folders/1ZOYpTUa82_jCcxIdTmyr0LXQfvaM9vIy.
  4. Traffic: http://pems.dot.ca.gov.
  5. Weather: https://www.bgc-jena.mpg.de/wetter/.
  6. ILI: https://gis.cdc.gov/grasp/fluview/fluportaldashboard.html.


Author information

Corresponding author

Correspondence to Chongshou Li.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Chen, Z., Zhang, F., Li, T., Li, C. (2023). MAGNet: Muti-scale Attention and Evolutionary Graph Structure for Long Sequence Time-Series Forecasting. In: Iliadis, L., Papaleonidas, A., Angelov, P., Jayne, C. (eds) Artificial Neural Networks and Machine Learning – ICANN 2023. ICANN 2023. Lecture Notes in Computer Science, vol 14259. Springer, Cham. https://doi.org/10.1007/978-3-031-44223-0_18

  • DOI: https://doi.org/10.1007/978-3-031-44223-0_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-44222-3

  • Online ISBN: 978-3-031-44223-0

  • eBook Packages: Computer Science, Computer Science (R0)
