Abstract
Long-term power load forecasting is critical for power system planning but is constrained by intricate temporal patterns. Transformer-based models emphasize modeling long- and short-term dependencies yet encounter limitations from complexity and parameter overhead. This paper introduces a novel Multi-Granularity Autoformer (MG-Autoformer) for long-term load forecasting. The model leverages a Multi-Granularity Auto-Correlation Attention Mechanism (MG-ACAM) to effectively capture fine-grained and coarse-grained temporal dependencies, enabling accurate modeling of short-term fluctuations and long-term trends. To enhance efficiency, a shared query–key (Q–K) mechanism is utilized to identify key temporal patterns across multiple resolutions and reduce model complexity. To address uncertainty in power load forecasting, the model incorporates a quantile loss function, enabling probabilistic predictions while quantifying uncertainty. Extensive experiments on benchmark datasets from Portugal, Australia, America, and ISO New England demonstrate the superior performance of the proposed MG-Autoformer in long-term power load point and probabilistic forecasting tasks.
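The quantile loss used for probabilistic forecasting in the abstract is commonly implemented as the pinball loss. A minimal sketch follows; the function name, array shapes, and numeric values are illustrative and not taken from the paper.

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Quantile (pinball) loss for a single quantile level q in (0, 1).

    Under-prediction (y_true > y_pred) is weighted by q and
    over-prediction by (1 - q), so minimizing this loss drives
    y_pred toward the q-th conditional quantile of y_true.
    """
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

# Illustrative load values (not from the paper's datasets)
y_true = np.array([10.0, 12.0, 11.0])
y_pred = np.array([9.0, 11.0, 11.0])

loss_median = pinball_loss(y_true, y_pred, 0.5)  # q=0.5 recovers half of MAE
loss_upper = pinball_loss(y_true, y_pred, 0.9)   # penalizes under-prediction more
```

Training one forecaster per quantile level (or a multi-output head with one loss term per level) yields a set of prediction intervals, which is how such models typically quantify forecast uncertainty.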
Original language | English |
---|---|
Article number | 107493 |
Journal | Neural Networks |
Volume | 188 |
DOI | |
Publication status | Published - 2025/08 |
ASJC Scopus subject areas
- Cognitive Neuroscience
- Artificial Intelligence