
Table 15 Performance comparison of the proposed models with models from prior studies

From: Wind speed prediction for site selection and reliable operation of wind power plants in coastal regions using machine learning algorithm variants

| Ref | Year | Region/country | Data resolution | Methods | Best performer | Performance |
|---|---|---|---|---|---|---|
| Shawon et al. (2021) | 2021 | NA | Hourly | ARMA, ARIMA, SVR, and ANN | Polynomial SVR | RMSE = 0.552; MAPE = 5% |
| Mohsin et al. (2021) | 2021 | NA | 3-h interval | BNN and Lasso | BNN | MAPE = 19.01%; NMAE = 0.003 |
| Hanoon et al. (2022) | 2022 | 14 regions in Malaysia | Daily | GPR, SVR, and BTs | GPR | RMSE = 0.18144; MSE = 0.03292; NSE = 0.26957; MAE = 0.13498; R2 = 0.38115 |
| S. Kumar P (2019) | 2019 | Waterloo, Canada | 15-min interval | BPN, BPN with MIFS, RBF, RBF with MIFS, NARX, and NARX with MIFS | NARX with MIFS | RMSE = 0.5814; MAE = 0.4381 |
| Elsaraiti & Merabet (2021) | 2021 | Halifax, Canada | Hourly | ARIMA and LSTM | LSTM | RMSE = 3.124; MAE = 2.457 |
| Liu & Chen (2019) | 2022 | East Jerusalem, Palestine | 3-h interval | MLR, ridge, lasso, RF, SVR, and LSTM | RF | MAE = 0.894; MSE = 1.345; MAD = 0.715; R2 = 0.435 |
| Xie et al. (2021) | 2021 | Yanqing and Zhaitan, Beijing, China | Hourly | ARMA, single-variable LSTM, and MV-LSTM | MV-LSTM | RMSE = 1.1460; MAE = 0.8468; MBE = 0.0276; MAPE = 0.6412 |
| Malakout (2023) | 2023 | Turkey | Monthly | LightGBM, GBR, AdaBoost, Elastic net, lasso, and an ensemble of LightGBM and AdaBoost | Ensemble method | RMSE = 0.2080; MAE = 0.1410; MAPE = 0.0292; R2 = 0.997 |
| Krishnaveni et al. (2021) | 2021 | Las Vegas, USA | Hourly | MLR, Lasso, SVR, and MPFFNN | SVR | MSE = 0.011217; MAE = 0.080115 |

| This study | 2023 | Kutubdia and Cox's Bazar, Bangladesh | 3-h interval | MLR, Ridge, Lasso, Elastic Net, KNN, DT, RF, GBR, AdaBoost, XGBoost, CatBoost, LightGBM, LSTM, and GRU | CatBoost | MSE = 0.3744; MAE = 0.4415; R2 = 0.8670 |

NREL: National Renewable Energy Laboratory; MLR: multiple linear regression; LR: linear regression; SVR: support vector regression; ARMA: autoregressive moving average; ARIMA: autoregressive integrated moving average; ANN: artificial neural network; GPR: Gaussian process regression; BTs: bagged regression trees; BNN: Bayesian neural network; BPN: back propagation network; NARX: nonlinear autoregressive model with exogenous inputs; MIFS: mutual information feature selection; MPFFNN: multiple perceptron feed-forward neural network; RBF: radial basis function; RF: random forest; LSTM: long short-term memory; GRU: gated recurrent unit; MV-LSTM: multivariate long short-term memory; GBR: gradient boosting regression; XGBoost: extreme gradient boosting; CatBoost: category boosting; AdaBoost: adaptive boosting; KNN: K-nearest neighbors; DT: decision tree; SVM: support vector machine; RVM: relevance vector machine; LightGBM: light gradient boosting machine; R2: coefficient of determination; MSE: mean square error; RMSE: root mean square error; MAE: mean absolute error; MAD: mean absolute deviation; MAPE: mean absolute percentage error; MBE: mean bias error; MRE: mean relative error; NMAE: normalized mean absolute error; NSE: Nash–Sutcliffe efficiency
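For reference, the sketch below shows one common way to compute the error metrics reported in this table (MSE, RMSE, MAE, MAPE, NMAE, MBE, MAD, R2, NSE) from observed and predicted wind speeds. It is a minimal NumPy implementation under standard textbook definitions; the exact formulas for MAD, NMAE, and MAPE vary between the cited studies, so the versions used here are assumptions rather than the conventions of any particular reference.

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """Compute common wind speed forecasting error metrics
    for observed (y_true) and predicted (y_pred) series."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true

    mse = np.mean(err ** 2)                      # mean square error
    rmse = np.sqrt(mse)                          # root mean square error
    mae = np.mean(np.abs(err))                   # mean absolute error
    mbe = np.mean(err)                           # mean bias error
    mad = np.mean(np.abs(err - np.mean(err)))    # mean absolute deviation of the errors (one convention)
    mape = np.mean(np.abs(err / y_true)) * 100   # mean absolute percentage error (%); assumes no zero observations
    nmae = mae / np.mean(y_true)                 # MAE normalized by the mean observed value (one convention)
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    r2 = 1.0 - ss_res / ss_tot                   # coefficient of determination
    nse = 1.0 - ss_res / ss_tot                  # Nash-Sutcliffe efficiency (same form against the observed mean)

    return {"MSE": mse, "RMSE": rmse, "MAE": mae, "MBE": mbe, "MAD": mad,
            "MAPE": mape, "NMAE": nmae, "R2": r2, "NSE": nse}

# Illustrative (made-up) 3-hourly wind speeds in m/s
obs = [4.1, 5.3, 6.0, 5.5, 4.8]
pred = [4.3, 5.0, 6.2, 5.4, 5.1]
print(evaluation_metrics(obs, pred))
```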