Multi-Granularity Autoformer for long-term deterministic and probabilistic power load forecasting

Yang Yang, Yuchao Gao, Hu Zhou, Jinran Wu, Shangce Gao, You Gan Wang*

*Corresponding author of this paper

Research output: Contribution to journal › Article › peer-review

Abstract

Long-term power load forecasting is critical for power system planning but is constrained by intricate temporal patterns. Transformer-based models emphasize modeling long- and short-term dependencies yet encounter limitations from complexity and parameter overhead. This paper introduces a novel Multi-Granularity Autoformer (MG-Autoformer) for long-term load forecasting. The model leverages a Multi-Granularity Auto-Correlation Attention Mechanism (MG-ACAM) to effectively capture fine-grained and coarse-grained temporal dependencies, enabling accurate modeling of short-term fluctuations and long-term trends. To enhance efficiency, a shared query–key (Q–K) mechanism is utilized to identify key temporal patterns across multiple resolutions and reduce model complexity. To address uncertainty in power load forecasting, the model incorporates a quantile loss function, enabling probabilistic predictions while quantifying uncertainty. Extensive experiments on benchmark datasets from Portugal, Australia, America, and ISO New England demonstrate the superior performance of the proposed MG-Autoformer in long-term power load point and probabilistic forecasting tasks.

Original language: English
Article number: 107493
Journal: Neural Networks
Volume: 188
DOI
Publication status: Published - Aug 2025

ASJC Scopus Subject Areas

  • Cognitive Neuroscience
  • Artificial Intelligence
