Multi-Granularity Autoformer for long-term deterministic and probabilistic power load forecasting

Yang Yang, Yuchao Gao, Hu Zhou, Jinran Wu, Shangce Gao, You Gan Wang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Long-term power load forecasting is critical for power system planning but is constrained by intricate temporal patterns. Transformer-based models emphasize modeling long- and short-term dependencies yet are limited by computational complexity and parameter overhead. This paper introduces a novel Multi-Granularity Autoformer (MG-Autoformer) for long-term load forecasting. The model leverages a Multi-Granularity Auto-Correlation Attention Mechanism (MG-ACAM) to capture both fine-grained and coarse-grained temporal dependencies, enabling accurate modeling of short-term fluctuations and long-term trends. To improve efficiency, a shared query–key (Q–K) mechanism identifies key temporal patterns across multiple resolutions while reducing model complexity. To quantify forecast uncertainty, the model incorporates a quantile loss function that yields probabilistic predictions. Extensive experiments on benchmark datasets from Portugal, Australia, America, and ISO New England demonstrate the superior performance of the proposed MG-Autoformer on long-term point and probabilistic power load forecasting tasks.
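The paper's exact MG-ACAM formulation is not reproduced here. As context, the sketch below shows a minimal PyTorch implementation of Autoformer-style auto-correlation attention (Wu et al., 2021) with a single shared Q–K projection, as the abstract describes: correlations are computed in the frequency domain, and the value series is aggregated at the top-k most correlated lags. The function name `shared_qk_auto_correlation`, the layers `proj_qk` and `proj_v`, and the parameter `top_k` are illustrative assumptions; the sketch also operates at a single granularity, whereas the paper applies the mechanism across multiple resolutions.

```python
import torch
import torch.nn as nn

def shared_qk_auto_correlation(x, proj_qk, proj_v, top_k=3):
    """Sketch of auto-correlation attention with a shared Q-K projection.

    x: (batch, length, d_model); proj_qk and proj_v are nn.Linear layers.
    A single projection serves as both query and key, halving that part
    of the parameter budget relative to separate Q and K projections.
    """
    B, L, _ = x.shape
    qk = proj_qk(x)                      # shared Q-K representation
    v = proj_v(x)
    # Wiener-Khinchin: autocorrelation via FFT of the shared projection
    qk_fft = torch.fft.rfft(qk, dim=1)
    corr = torch.fft.irfft(qk_fft * torch.conj(qk_fft), n=L, dim=1)
    corr = corr.mean(dim=-1)             # (B, L): average over channels
    # Select the top-k dominant lags (candidate periods)
    weights, lags = torch.topk(corr, top_k, dim=1)
    weights = torch.softmax(weights, dim=1)
    out = torch.zeros_like(v)
    for i in range(top_k):
        # Align the value series to each dominant lag and aggregate
        rolled = torch.stack([torch.roll(v[b], -int(lags[b, i]), dims=0)
                              for b in range(B)])
        out = out + weights[:, i].view(-1, 1, 1) * rolled
    return out

# Illustrative usage on a toy batch
proj_qk, proj_v = nn.Linear(64, 64), nn.Linear(64, 64)
x = torch.randn(8, 96, 64)               # (batch, length, d_model)
out = shared_qk_auto_correlation(x, proj_qk, proj_v)
```

For the probabilistic component, the standard choice is the pinball (quantile) loss; the paper's exact formulation may differ. A minimal sketch, assuming the network emits one output per quantile level:

```python
def quantile_loss(pred, target, quantiles=(0.1, 0.5, 0.9)):
    """Pinball loss averaged over quantile levels.

    pred: (batch, horizon, n_quantiles); target: (batch, horizon).
    Under-prediction is penalized by q, over-prediction by (1 - q),
    so each output head learns its corresponding conditional quantile.
    """
    losses = []
    for i, q in enumerate(quantiles):
        err = target - pred[..., i]
        losses.append(torch.maximum(q * err, (q - 1.0) * err))
    return torch.mean(torch.stack(losses))
```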

Original language: English
Article number: 107493
Journal: Neural Networks
Volume: 188
DOIs
State: Published - 2025/08

Keywords

  • Autoformer
  • Long-term forecasting
  • Probabilistic forecasting
  • Self-attention mechanism

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence
