37th Conference of CIGRE Srbija (2025): SECURITY, STABILITY, RELIABILITY AND RESILIENCE OF THE POWER SYSTEM – MULTISECTORAL INTERCONNECTION IN THE ENERGY SECTOR AND THE ECONOMY – D2-04
AUTOR(I) / AUTHOR(S): Matija Rogić, Mileta Žarković
DOI: 10.46793/CIGRE37.D2.04
SAŽETAK / ABSTRACT:
Variable ambient conditions have a dominant influence on electricity consumption. Renewable energy sources account for an increasing share of electricity generation, and the weather conditions on which they depend are prone to sudden changes and can be forecast only with limited accuracy.
Since both generation and consumption depend on many factors, neural networks are well suited to forecasting because they can capture the contribution of each factor. Several architectures have proven successful, each with different strengths. The multilayer perceptron (MLP) is attractive for its simplicity. Recurrent networks such as Long Short-Term Memory (LSTM) are a natural fit for time-series prediction. Convolutional networks and autoencoders stand out for their computational efficiency. Transformers, thanks to the attention mechanism and their large capacity, can extract insights from vast amounts of data. Hybrid architectures composed of these building blocks are also possible.
This paper presents the application of the TiDE (Time-series Dense Encoder) neural network to forecasting electricity generation and consumption in Serbia, based on real data from the first half of 2019, supplemented with meteorological data from the Republic Hydrometeorological Service of Serbia (RHMZ) for the same period. Forecasts were produced seven days ahead at an hourly resolution. The paper describes the algorithm for applying the TiDE network and analyses the resulting time-series forecast errors.
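The core idea of the TiDE approach described above can be sketched at the level of a single forward pass: the past load and the known past-and-future covariates are flattened, passed through a dense encoder, and the hidden state is decoded into the entire forecast horizon at once. The following is a minimal illustration with random, untrained weights; the window length, hidden size, and covariate names are assumptions for the sketch, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: look-back window L and a forecast horizon of
# H = 7 days x 24 h = 168 steps, matching the paper's weekly hourly forecast.
L, H = 336, 168        # hours of history / hours to forecast
n_cov = 4              # e.g. temperature, wind speed, hour-of-day, day-of-week
hidden = 64

def dense(x, w, b):
    """A single dense layer with ReLU, the basic building block of TiDE."""
    return np.maximum(0.0, x @ w + b)

# Randomly initialised weights stand in for a trained model.
w_enc = rng.normal(0.0, 0.1, (L + (L + H) * n_cov, hidden))
b_enc = np.zeros(hidden)
w_dec = rng.normal(0.0, 0.1, (hidden, H))
b_dec = np.zeros(H)

def tide_style_forecast(past_load, covariates):
    """Encode the flattened past series together with past and future
    covariates, then decode the hidden state into all H steps jointly."""
    x = np.concatenate([past_load, covariates.ravel()])
    z = dense(x, w_enc, b_enc)      # dense encoding of history + covariates
    return z @ w_dec + b_dec        # dense decoding: all 168 hours at once

past_load = rng.normal(size=L)                 # last 336 hourly load values
covariates = rng.normal(size=(L + H, n_cov))   # known weather/calendar features
forecast = tide_style_forecast(past_load, covariates)
print(forecast.shape)   # (168,) -> one week ahead at hourly resolution
```

In practice such a model would be trained rather than hand-initialised; the Darts library cited in the references provides a ready-made TiDE implementation.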
KLJUČNE REČI / KEYWORDS:
power system, forecasting, generation, neural networks, autoencoder
PROJEKAT / ACKNOWLEDGEMENT:
LITERATURA / REFERENCES:
- Abhimanyu Das, Weihao Kong, Andrew Leach, Shaan Mathur, Rajat Sen, Rose Yu (2023) Long-term Forecasting with TiDE: Time-series Dense Encoder. URL https://arxiv.org/abs/2304.08424
- Taesung Kim, Jinhee Kim, Yunwon Tae, Cheonbok Park, Jang-Ho Choi, Jaegul Choo (2022) Reversible Instance Normalization for Accurate Time-Series Forecasting against Distribution Shift. URL https://openreview.net/pdf?id=cGDAkQo1C0p
- Haoyi Zhou, Shanghang Zhang, Jieqi Peng, Shuai Zhang, Jianxin Li, Hui Xiong, Wancai Zhang (2021) Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence. URL https://arxiv.org/abs/2012.07436
- Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin (2022) FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting. In International Conference on Machine Learning, pages 27268–27286. PMLR. URL https://arxiv.org/abs/2201.12740
- Haixu Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long (2021) Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. Advances in Neural Information Processing Systems, 34:22419–22430. URL https://arxiv.org/abs/2106.13008
- Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam (2022) A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. International Conference on Learning Representations. URL https://arxiv.org/abs/2211.14730
- Bryan Lim, Sercan O. Arik, Nicolas Loeff, Tomas Pfister (2019) Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting. URL https://arxiv.org/abs/1912.09363
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin (2017) Attention Is All You Need. Advances in Neural Information Processing Systems, 30. URL https://arxiv.org/abs/1706.03762
- David Salinas, Valentin Flunkert, Jan Gasthaus (2019) DeepAR: Probabilistic Forecasting with Autoregressive Recurrent Networks. URL https://arxiv.org/pdf/1704.04110
- Ailing Zeng, Muxi Chen, Lei Zhang, Qiang Xu (2022) Are Transformers Effective for Time Series Forecasting? URL https://arxiv.org/abs/2205.13504
- Boris N. Oreshkin, Dmitri Carpov, Nicolas Chapados, Yoshua Bengio (2020) N-BEATS: Neural basis expansion analysis for interpretable time series forecasting. URL https://arxiv.org/abs/1905.10437
- Cristian Challu, Kin G. Olivares, Boris N. Oreshkin, Federico Garza, Max Mergenthaler Canseco, Artur Dubrawski (2022) N-HiTS: Neural Hierarchical Interpolation for Time Series Forecasting. URL https://arxiv.org/abs/2201.12886
- Si-An Chen, Chun-Liang Li, Nate Yoder, Sercan O. Arik, Tomas Pfister (2023) TSMixer: An All-MLP Architecture for Time Series Forecasting. URL https://arxiv.org/abs/2303.06053
- Julien Herzen, Francesco Lässig, Samuele Giuliano Piazzetta, Thomas Neuer, Léo Tafti, Guillaume Raille, Tomas Van Pottelbergh, Marek Pasieka, Andrzej Skrodzki, Nicolas Huguenin, Maxime Dumonal, Jan Kościsz, Dennis Bader, Frédérick Gusset, Mounir Benheddi, Camila Williamson, Michal Kosinski, Matej Petrik, Gaël Grosch (2022) Darts: User-Friendly Modern Machine Learning for Time Series. URL https://jmlr.org/papers/v23/21-1177.htm
- Olaf Ronneberger, Philipp Fischer, Thomas Brox (2015) U-Net: Convolutional Networks for Biomedical Image Segmentation. URL https://arxiv.org/abs/1505.04597