FAN Hangzhou, MEI Hongyan, ZHAO Qin, et al. Multivariate time series forecasting with a graph neural network and dual attention mechanism[J]. CAAI Transactions on Intelligent Systems, 2024, 19(5): 1277-1286. doi: 10.11992/tis.202305020.

Multivariate time series forecasting with a graph neural network and dual attention mechanism

References:
[1] GASPARIN A, LUKOVIC S, ALIPPI C. Deep learning for time series forecasting: the electric load case[J]. CAAI transactions on intelligence technology, 2022, 7(1): 1-25.
[2] 王锋华, 成敬周, 文凡. 快速双非凸回归算法及其电力数据预测应用[J]. 智能系统学报, 2018, 13(4): 665-672.
WANG Fenghua, CHENG Jingzhou, WEN Fan. Fast double nonconvex regression algorithm for forecast of electric power data[J]. CAAI transactions on intelligent systems, 2018, 13(4): 665-672.
[3] 程鹏超, 杜军平, 薛哲. 基于多路交叉的用户金融行为预测[J]. 智能系统学报, 2021, 16(2): 378-384.
CHENG Pengchao, DU Junping, XUE Zhe. Prediction of user financial behavior based on multi-way crossing[J]. CAAI transactions on intelligent systems, 2021, 16(2): 378-384.
[4] FANG Weiwei, ZHUO Wenhao, YAN Jingwen, et al. Attention meets long short-term memory: a deep learning network for traffic flow forecasting[J]. Physica A: statistical mechanics and its applications, 2022, 587: 126485.
[5] 李伯涵, 郭茂祖, 赵玲玲. 基于分割注意力机制残差网络的城市区域客流量预测[J]. 智能系统学报, 2022, 17(4): 839-848.
LI Bohan, GUO Maozu, ZHAO Lingling. Passenger flow prediction in urban areas based on residual networks with split attention mechanism[J]. CAAI transactions on intelligent systems, 2022, 17(4): 839-848.
[6] YULE G U. On a method of investigating periodicities in disturbed series, with special reference to Wolfer’s sunspot numbers[J]. Philosophical transactions of the royal society of London series A, containing papers of a mathematical or physical character, 1927, 226(1): 267-298.
[7] WALKER G T. On periodicity in series of related terms[J]. Proceedings of the royal society of London series A, containing papers of a mathematical and physical character, 1931, 131(818): 518-532.
[8] BOX G E P, JENKINS G M. Time series analysis: forecasting and control[M]. San Francisco: Holden-Day, 1976.
[9] 杨海民, 潘志松, 白玮. 时间序列预测方法综述[J]. 计算机科学, 2019, 46(1): 21-28.
YANG Haimin, PAN Zhisong, BAI Wei. Review of time series prediction methods[J]. Computer science, 2019, 46(1): 21-28.
[10] 陈艳, 王子健, 赵泽, 等. 传感器网络环境监测时间序列数据的高斯过程建模与多步预测[J]. 通信学报, 2015, 36(10): 252-262.
CHEN Yan, WANG Zijian, ZHAO Ze, et al. Gaussian process modeling and multi-step prediction for time series data in wireless sensor network environmental monitoring[J]. Journal on communications, 2015, 36(10): 252-262.
[11] LI Peixian, TAN Zhixiang, YAN Lili, et al. Time series prediction of mining subsidence based on a SVM[J]. Mining science and technology (China), 2011, 21(4): 557-562.
[12] GOSSÉ J B, GUILLAUMIN C. L’apport de la représentation VAR de Christopher A. Sims à la science économique[J]. L’actualité économique, 2014, 89(4): 305-319.
[13] ELMAN J L. Finding structure in time[J]. Cognitive science, 1990, 14(2): 179-211.
[14] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural computation, 1997, 9(8): 1735-1780.
[15] CHUNG J, GULCEHRE C, CHO K, et al. Empirical evaluation of gated recurrent neural networks on sequence modeling[EB/OL]. (2014-12-11)[2023-05-20]. https://arxiv.org/abs/1412.3555.
[16] QIN Yao, SONG Dongjin, CHEN Haifeng, et al. A dual-stage attention-based recurrent neural network for time series prediction[EB/OL]. (2017-04-07)[2023-05-20]. https://arxiv.org/abs/1704.02971.
[17] LAI Guokun, CHANG Weicheng, YANG Yiming, et al. Modeling long and short-term temporal patterns with deep neural networks[C]//The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. Ann Arbor: ACM, 2018: 95-104.
[18] HUANG Siteng, WANG Donglin, WU Xuehan, et al. DSANet: dual self-attention network for multivariate time series forecasting[C]//Proceedings of the 28th ACM International Conference on Information and Knowledge Management. Beijing: ACM, 2019: 2129-2132.
[19] BAI Shaojie, KOLTER J Z, KOLTUN V. An empirical evaluation of generic convolutional and recurrent networks for sequence modeling[EB/OL]. (2018-03-04)[2023-05-20]. https://arxiv.org/abs/1803.01271.
[20] LIU Minhao, ZENG Ailing, CHEN Muxi, et al. SCINet: time series modeling and forecasting with sample convolution and interaction[J]. Advances in neural information processing systems, 2022, 35: 5816-5828.
[21] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. California: ACM, 2017: 6000-6010.
[22] WU N, GREEN B, BEN Xue, et al. Deep transformer models for time series forecasting: the influenza prevalence case[EB/OL]. (2020-01-23)[2023-05-20]. https://arxiv.org/abs/2001.08317.
[23] LI Shiyang, JIN Xiaoyong, XUAN Yao, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[EB/OL]. (2019-07-29)[2023-05-20]. https://arxiv.org/abs/1907.00235.
[24] ZHOU Haoyi, ZHANG Shanghang, PENG Jieqi, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[J]. Proceedings of the AAAI conference on artificial intelligence, 2021, 35(12): 11106-11115.
[25] WU Haixu, XU Jiehui, WANG Jianmin, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[J]. Advances in neural information processing systems, 2021, 34: 22419-22430.
[26] ZHOU Tian, MA Ziqing, WEN Qingsong, et al. FEDformer: frequency enhanced decomposed transformer for long-term series forecasting[EB/OL]. (2022-01-30)[2023-05-20]. https://arxiv.org/abs/2201.12740.
[27] 吴博, 梁循, 张树森, 等. 图神经网络前沿进展与应用[J]. 计算机学报, 2022, 45(1): 35-68.
WU Bo, LIANG Xun, ZHANG Shusen, et al. Advances and applications in graph neural network[J]. Chinese journal of computers, 2022, 45(1): 35-68.
[28] WU Zonghan, PAN Shirui, LONG Guodong, et al. Graph WaveNet for deep spatial-temporal graph modeling[EB/OL]. (2019-05-31)[2023-05-20]. https://arxiv.org/abs/1906.00121.
[29] SATORRAS V G, RANGAPURAM S S, JANUSCHOWSKI T. Multivariate time series forecasting with latent graph inference[EB/OL]. (2022-03-07)[2023-05-20]. https://arxiv.org/abs/2203.03423.
[30] WU Zonghan, PAN Shirui, LONG Guodong, et al. Connecting the dots: multivariate time series forecasting with graph neural networks[C]//Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. Virtual Event: ACM, 2020: 753-763.
[31] WOO S, PARK J, LEE J Y, et al. CBAM: convolutional block attention module[C]//Computer Vision-ECCV 2018, Lecture Notes in Computer Science. Cham: Springer, 2018: 3-19.
[32] ZHANG G P. Time series forecasting using a hybrid ARIMA and neural network model[J]. Neurocomputing, 2003, 50: 159-175.
[33] JIA Pengtao, LIU Hangduo, WANG Sujian, et al. Research on a mine gas concentration forecasting model based on a GRU network[J]. IEEE access, 2020, 8: 38023-38031.
[34] YOO J, KANG U. Attention-based autoregression for accurate and efficient multivariate time series forecasting[C]//Proceedings of the 2021 SIAM International Conference on Data Mining. Philadelphia: Society for Industrial and Applied Mathematics, 2021: 531-539.