Abstract
Short-term power load forecasting plays an important role in the management and development of power systems, with a focus on reducing power waste and economic losses. In this paper, we construct a novel short-term power load forecasting method by improving the bidirectional long short-term memory (Bi-LSTM) model with Extreme Gradient Boosting (XGBoost) and an Attention mechanism. Our model differs from existing methods in the following three aspects. First, we use the weighted grey relational projection algorithm to distinguish holidays from non-holidays in the data preprocessing. Secondly, we add the Attention mechanism to the Bi-LSTM model to improve the validity and accuracy of prediction. Thirdly, XGBoost is a newly-developed, well-performing prediction model, which is used together with the Attention mechanism to optimize the Bi-LSTM model. We thus develop a novel combined power load prediction model, “Attention-Bi-LSTM + XGBoost”, using the error reciprocal method from weight determination theory. Using two power market datasets, we evaluate our prediction method by comparing it with two benchmark models and four other models. With our prediction method, the MAPE, MAE, and RMSE for the Singapore power market are 0.387, 43.206, and 54.357, respectively, and those for the Norway power market are 0.682, 96.278, and 125.343, respectively. These errors are smaller than those of the six other models, indicating that our prediction method outperforms LSTM, Bi-LSTM, Attention-RNN, Attention-LSTM, Attention-Bi-LSTM, and XGBoost in effectiveness, accuracy, and practicability.
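The abstract names the main building blocks of the combined model: an Attention-augmented Bi-LSTM, an XGBoost regressor, and an error reciprocal weighting of the two forecasts. The following is a minimal, illustrative sketch (not the authors' code): it assumes a Keras Bi-LSTM pooled by a simple additive attention layer, XGBoost trained on flattened input windows, and weights inversely proportional to each sub-model's hold-out error. All layer sizes, hyperparameters, and the synthetic data are assumptions for demonstration only.

```python
# Hypothetical sketch of "Attention-Bi-LSTM + XGBoost" combined with error-reciprocal weights.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model
from xgboost import XGBRegressor


def build_attention_bilstm(n_steps: int, n_features: int) -> Model:
    """Bi-LSTM whose time steps are pooled by a simple additive attention layer (assumed form)."""
    inputs = layers.Input(shape=(n_steps, n_features))
    h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(inputs)
    scores = layers.Dense(1, activation="tanh")(h)            # one score per time step
    alpha = layers.Softmax(axis=1)(scores)                    # attention weights over time steps
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, alpha])
    outputs = layers.Dense(1)(context)                        # next-step load forecast
    model = Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model


def error_reciprocal_weights(errors) -> np.ndarray:
    """Weight each sub-model in inverse proportion to its validation error."""
    inv = 1.0 / np.asarray(errors, dtype=float)
    return inv / inv.sum()


def mape(y_true, y_pred):
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))


# --- illustrative usage with synthetic data ---------------------------------
rng = np.random.default_rng(0)
n_steps, n_features = 24, 1
X_seq = rng.normal(size=(500, n_steps, n_features))           # input windows for the Bi-LSTM
X_tab = X_seq.reshape(500, -1)                                # flattened windows for XGBoost
y = rng.normal(loc=5000.0, scale=300.0, size=500)             # synthetic load values

nn = build_attention_bilstm(n_steps, n_features)
nn.fit(X_seq[:400], y[:400], epochs=2, batch_size=32, verbose=0)
gb = XGBRegressor(n_estimators=200, max_depth=4).fit(X_tab[:400], y[:400])

pred_nn = nn.predict(X_seq[400:], verbose=0).ravel()
pred_gb = gb.predict(X_tab[400:])

# Error-reciprocal weighting on hold-out errors, then the combined forecast.
w = error_reciprocal_weights([mape(y[400:], pred_nn), mape(y[400:], pred_gb)])
pred = w[0] * pred_nn + w[1] * pred_gb
rmse = np.sqrt(np.mean((y[400:] - pred) ** 2))
mae = np.mean(np.abs(y[400:] - pred))
print(f"MAPE={mape(y[400:], pred):.3f}  MAE={mae:.3f}  RMSE={rmse:.3f}")
```

The error reciprocal rule simply gives the more accurate sub-model the larger share of the combined forecast; the paper's exact weighting, preprocessing (including the weighted grey relational projection step for holidays), and hyperparameters may differ.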
| Original language | English |
|---|---|
| Article number | 109632 |
| Journal | Applied Soft Computing |
| Volume | 130 |
| Early online date | 10 Oct 2022 |
| DOIs | |
| Publication status | Published - 1 Nov 2022 |
Bibliographical note
Publisher Copyright: © 2022 Elsevier B.V.
Funding
This work is supported by the National Natural Science Foundation of China (No. 72171126), the Ministry of Education Project of Humanities and Social Science, China (No. 20YJA630009), and the Social Science Planning Project of Shandong Province, China (No. 20CSDJ15). This work is also financially supported by the Faculty Research Grant (FRG) of Lingnan University, China (No. DB21B1).
Keywords
- Attention mechanism
- Bidirectional long short-term memory network
- Extreme gradient boosting
- Power load forecasting
- Weighted grey relational projection algorithm
Fingerprint
Dive into the research topics of 'Improving the Bi-LSTM model with XGBoost and attention mechanism: A combined approach for short-term power load prediction'. Together they form a unique fingerprint.
Projects
- 1 Finished
- A New Hybrid Forecasting Approach and Its Application in the Power Load Prediction
LENG, M. (PI)
1/06/21 → 31/05/23
Project: Grant Research