Chinese core journal: Journal of East China Normal University (Natural Science) ›› 2025, Vol. 2025 ›› Issue (1): 82-96. doi: 10.3969/j.issn.1000-5641.2025.01.007
• Computer Science •
Chaojie MEN1, Jing ZHAO1,*, Nan ZHANG2
Received: 2023-12-27
Online: 2025-01-25
Published: 2025-01-20
Contact: Jing ZHAO, E-mail: jzhao@cs.ecnu.edu.cn
Chaojie MEN, Jing ZHAO, Nan ZHANG. Time series uncertainty forecasting based on graph augmentation and attention mechanism [J]. Journal of East China Normal University (Natural Science), 2025, 2025(1): 82-96.
Table 1 Statistical information of datasets

| Dataset | Sampling rate | Dimensions | Data size | Training set | Validation set | Test set | Prediction horizons |
| --- | --- | --- | --- | --- | --- | --- | --- |
| ETTh1 | 1/hour | 7 | | | | | {24, 48, 168, 336, 720} |
| ETTm1 | 1/15 min | 7 | | | | | {24, 48, 168, 288, 672} |
| WTH | 1/hour | 12 | | | | | {24, 48, 168, 336, 720} |
| ECL | 1/hour | 321 | | | | | {24, 48, 168, 336, 720} |
| ILI | 1/week | 7 | 845 | 601 | 74 | 170 | {24, 36, 48, 60} |
| Traffic | 1/hour | 862 | | | | | {24, 48, 168, 336, 720} |
Table 2 Comparison with transformer-based models (left six columns: MSE; right six columns: MAE)

| Dataset | Informer | Autoformer | Pyraformer | FEDformer | Crossformer | TSUF | Informer | Autoformer | Pyraformer | FEDformer | Crossformer | TSUF |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ETTh1 | 0.907 | 0.482 | 0.747 | 0.410 | 0.405 | 0.387 | 0.739 | 0.478 | 0.653 | 0.444 | 0.437 | 0.411 |
| ETTm1 | 0.749 | 0.522 | 0.601 | 0.370 | 0.361 | 0.315 | 0.640 | 0.484 | 0.535 | 0.414 | 0.395 | 0.354 |
| WTH | 0.574 | 0.516 | 0.456 | 0.489 | 0.432 | 0.475 | 0.552 | 0.509 | 0.479 | 0.501 | 0.461 | 0.484 |
| ECL | 0.392 | 0.332 | 0.485 | 0.312 | 0.309 | 0.168 | 0.448 | 0.405 | 0.466 | 0.394 | 0.359 | 0.259 |
| ILI | 4.878 | 3.116 | 4.591 | 2.795 | 3.387 | 1.861 | 1.513 | 1.228 | 1.460 | 1.156 | 1.236 | 0.924 |
| Traffic | 0.690 | 0.618 | 0.634 | 0.597 | 0.525 | 0.383 | 0.384 | 0.390 | 0.348 | 0.380 | 0.294 | 0.263 |
Table 3 Comparison with non-transformer-based models (left six columns: MSE; right six columns: MAE)

| Dataset | LSTMa | LSTNet | MTGNN | DLinear | MICN | TSUF | LSTMa | LSTNet | MTGNN | DLinear | MICN | TSUF |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ETTh1 | 1.193 | 1.909 | 0.568 | 0.403 | 0.504 | 0.387 | 0.896 | 1.165 | 0.538 | 0.424 | 0.497 | 0.411 |
| ETTm1 | 1.566 | 1.981 | 0.433 | 0.318 | 0.346 | 0.315 | 1.032 | 1.789 | 0.447 | 0.355 | 0.386 | 0.354 |
| WTH | 1.121 | 0.731 | 0.442 | 0.483 | 0.462 | 0.475 | 0.850 | 0.644 | 0.468 | 0.492 | 0.474 | 0.484 |
| ECL | 1.043 | 0.469 | 0.326 | 0.232 | 0.170 | 0.168 | 0.838 | 0.512 | 0.367 | 0.315 | 0.279 | 0.259 |
| ILI | 4.778 | 5.300 | 4.861 | 2.864 | 2.664 | 1.861 | 1.432 | 1.657 | 1.507 | 1.197 | 1.086 | 0.924 |
| Traffic | 0.961 | 0.716 | 0.527 | 0.397 | 0.520 | 0.383 | 0.537 | 0.438 | 0.315 | 0.282 | 0.304 | 0.263 |
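The MSE and MAE columns in Tables 2 and 3 are the standard pointwise error metrics averaged over all forecast steps and variables. A minimal NumPy sketch of how they are computed (the function names here are illustrative, not taken from the paper):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: average of squared pointwise deviations."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean((y_true - y_pred) ** 2))

def mae(y_true, y_pred):
    """Mean absolute error: average of absolute pointwise deviations."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(np.abs(y_true - y_pred)))
```

Lower is better for both; MSE penalizes large errors more heavily, while MAE is less sensitive to outliers, which is why the two rankings in the tables occasionally differ.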
[1] LIU C H, HOI S C H, ZHAO P L, et al. Online ARIMA algorithms for time series prediction [C]// Proceedings of the 30th AAAI Conference on Artificial Intelligence. AAAI, 2016: 1867-1873.
[2] SHUMWAY R H, STOFFER D S. Time Series Analysis and Its Applications: With R Examples [M]. Berlin: Springer, 2017: 75-146.
[3] CLEVELAND R B, CLEVELAND W S, MCRAE J E, et al. STL: A seasonal-trend decomposition procedure based on loess [J]. Journal of Official Statistics, 1990, 6(1): 3-73.
[4] HAN Z Y, LIU Y, ZHAO J, et al. Real time prediction for converter gas tank levels based on multi-output least square support vector regressor [J]. Control Engineering Practice, 2012, 20(12): 1400-1409.
[5] GIRARD A, RASMUSSEN C E, CANDELA J Q, et al. Gaussian process priors with uncertain inputs: Application to multiple-step ahead time series forecasting [C]// Proceedings of the 15th International Conference on Neural Information Processing Systems. Cambridge, MA: MIT Press, 2002: 545-552.
[6] HAN Z Y, ZHAO J, LEUNG H, et al. A review of deep learning models for time series prediction [J]. IEEE Sensors Journal, 2021, 21(6): 7833-7848.
[7] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need [C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 6000-6010.
[8] WEN Q S, ZHOU T, ZHANG C, et al. Transformers in time series: A survey [C]// Proceedings of the 32nd International Joint Conference on Artificial Intelligence. 2023: 6778-6786.
[9] LI S Y, JIN X Y, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of Transformer on time series forecasting [C]// Proceedings of the 33rd International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2019: 5243-5253.
[10] WU S F, XIAO X, DING Q G, et al. Adversarial sparse transformer for time series forecasting [C]// Proceedings of the 34th International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2020: 17105-17115.
[11] ZHOU H Y, ZHANG S H, PENG J Q, et al. Informer: Beyond efficient transformer for long sequence time-series forecasting [C]// Proceedings of the 35th AAAI Conference on Artificial Intelligence. AAAI, 2021: 11106-11115.
[12] WU H X, XU J H, WANG J M, et al. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting [C]// Proceedings of the 35th International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2021: 22419-22430.
[13] LIU S Z, YU H, LIAO C, et al. Pyraformer: Low-complexity pyramidal attention for long-range time series modeling and forecasting [C]// International Conference on Learning Representations (ICLR 2022). 2022. https://openreview.net/pdf?id=0EXmFzUn5I.
[14] ZHOU T, MA Z Q, WEN Q S, et al. FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting [C]// Proceedings of the 39th International Conference on Machine Learning. 2022: 27268-27286.
[15] ZHANG Y H, YAN J C. Crossformer: Transformer utilizing cross-dimension dependency for multivariate time series forecasting [C]// The 11th International Conference on Learning Representations (ICLR 2023). 2023. https://openreview.net/forum?id=vSVLM2j9eie.
[16] ZENG A L, CHEN M X, ZHANG L, et al. Are transformers effective for time series forecasting? [C]// Proceedings of the 37th AAAI Conference on Artificial Intelligence. AAAI, 2023: 11121-11128.
[17] CHALLU C, OLIVARES K G, ORESHKIN B N, et al. N-HiTS: Neural hierarchical interpolation for time series forecasting [C]// Proceedings of the 37th AAAI Conference on Artificial Intelligence. AAAI, 2023: 6989-6997.
[18] VIJAY E, JATI A, NGUYEN N, et al. TSMixer: Lightweight MLP-mixer model for multivariate time series forecasting [C]// Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, 2023: 459-469.
[19] LAI G K, CHANG W C, YANG Y M, et al. Modeling long- and short-term temporal patterns with deep neural networks [C]// The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. ACM, 2018: 95-104.
[20] WU Z H, PAN S R, LONG G D, et al. Connecting the dots: Multivariate time series forecasting with graph neural networks [C]// Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, 2020: 753-763.
[21] LIU M H, ZENG A L, CHEN M X, et al. SCINet: Time series modeling and forecasting with sample convolution and interaction [C]// Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS 2022). 2022. https://openreview.net/pdf?id=AyajSjTAzmg.
[22] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks [C]// International Conference on Learning Representations (ICLR 2018). 2018. https://openreview.net/pdf?id=rJXMpikCZ.
[23] KIUREGHIAN A D, DITLEVSEN O. Aleatory or epistemic? Does it matter? [J]. Structural Safety, 2009, 31(2): 105-112.
[24] LIU J Z, PADHY S, REN J, et al. A simple approach to improve single-model deep uncertainty via distance-awareness [J]. Journal of Machine Learning Research, 2023, 24(42): 1667-1729.
[25] BLUNDELL C, CORNEBISE J, KAVUKCUOGLU K, et al. Weight uncertainty in neural network [C]// Proceedings of the 32nd International Conference on Machine Learning - Volume 37. 2015: 1613-1622.
[26] GAL Y, GHAHRAMANI Z. Dropout as a Bayesian approximation: Representing model uncertainty in deep learning [C]// Proceedings of the 33rd International Conference on Machine Learning. 2016: 1050-1059.
[27] KENDALL A, GAL Y. What uncertainties do we need in Bayesian deep learning for computer vision? [C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 5580-5590.
[28] LAKSHMINARAYANAN B, PRITZEL A, BLUNDELL C. Simple and scalable predictive uncertainty estimation using deep ensembles [C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 6405-6416.
[29] PENG Z L, GUO Z H, HUANG W, et al. Conformer: Local features coupling global representations for recognition and detection [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(8): 9454-9468.
[30] KRIZHEVSKY A, SUTSKEVER I, HINTON G E. ImageNet classification with deep convolutional neural networks [J]. Communications of the ACM, 2017, 60(6): 84-90.
[31] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: Transformers for image recognition at scale [C]// International Conference on Learning Representations (ICLR 2021). 2021. https://openreview.net/forum?id=YicbFdNTTy.
[32] DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding [C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). ACL, 2019: 4171-4186.
[33] GUO C, PLEISS G, SUN Y, et al. On calibration of modern neural networks [C]// Proceedings of the 34th International Conference on Machine Learning - Volume 70. 2017: 1321-1330.
[34] KULESHOV V, FENNER N, ERMON S. Accurate uncertainties for deep learning using calibrated regression [C]// Proceedings of the 35th International Conference on Machine Learning. 2018: 2796-2804.
[35] BAHDANAU D, CHO K H, BENGIO Y. Neural machine translation by jointly learning to align and translate [C]// International Conference on Learning Representations (ICLR 2015). 2015. https://iclr.cc/archive/www/lib/exe/fetch.php%3Fmedia=iclr2015:bahdanau-iclr2015.pdf.
[36] WANG H Q, PENG J, HUANG F H, et al. MICN: Multi-scale local and global context modeling for long-term series forecasting [C]// International Conference on Learning Representations (ICLR 2023). 2023. https://openreview.net/pdf?id=zt53IDUR1U.