Hyperparameter optimization of support vector regression using the black hole algorithm

Document Type : Research Paper

Authors

1 Department of Operations Research and Intelligent Technologies, University of Mosul, Mosul, Iraq

2 Department of Statistics and Informatics, University of Mosul, Mosul, Iraq

Abstract

The support vector regression (SVR) technique is considered one of the most promising and widely used methods for prediction, and raising its predictive power and improving its generalization ability depend strongly on tuning its hyperparameters. Nature-inspired algorithms are an important and effective tool for optimizing or tuning the hyperparameters of SVR models. In this research, the black hole algorithm (BHA), a nature-inspired algorithm, is adapted to optimize the hyperparameters of SVR. The experimental results obtained on two data sets show that the proposed algorithm finds a better combination of hyperparameters than the grid search (GS) algorithm in terms of both prediction accuracy and running time. This demonstrates the ability of the BHA to find the best combination of hyperparameters.
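
As an illustration of the kind of search the abstract describes, the sketch below tunes the C, epsilon, and gamma hyperparameters of an RBF-kernel SVR with a basic black hole algorithm, scoring each candidate by cross-validated mean squared error. This is a minimal sketch, not the authors' implementation; the synthetic data set, search bounds, population size, and iteration count are illustrative assumptions.

# Minimal sketch (assumed setup, not the paper's code): black hole algorithm
# for tuning SVR hyperparameters C, epsilon, gamma in log10-space.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_regression

rng = np.random.default_rng(0)
# Synthetic regression data used only for illustration.
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

# Search bounds in log10-space: [log10 C, log10 epsilon, log10 gamma] (assumed ranges).
LOW = np.array([-2.0, -3.0, -3.0])
HIGH = np.array([3.0, 0.0, 1.0])

def fitness(star):
    """Cross-validated MSE of an RBF-kernel SVR for one candidate (lower is better)."""
    c, eps, gam = 10.0 ** star
    model = SVR(kernel="rbf", C=c, epsilon=eps, gamma=gam)
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    return -scores.mean()

def bha_tune(n_stars=15, n_iter=50):
    stars = rng.uniform(LOW, HIGH, size=(n_stars, 3))   # random initial "stars"
    costs = np.array([fitness(s) for s in stars])
    bh_idx = costs.argmin()                             # best star becomes the black hole
    for _ in range(n_iter):
        bh = stars[bh_idx].copy()
        for i in range(n_stars):
            if i == bh_idx:
                continue
            # Move each star toward the current black hole.
            stars[i] += rng.random(3) * (bh - stars[i])
            stars[i] = np.clip(stars[i], LOW, HIGH)
            costs[i] = fitness(stars[i])
        # A star that now beats the black hole becomes the new black hole.
        bh_idx = costs.argmin()
        # Event horizon: stars too close to the black hole are absorbed and
        # replaced by fresh random stars, which keeps the search exploring.
        radius = costs[bh_idx] / costs.sum()
        for i in range(n_stars):
            if i == bh_idx:
                continue
            if np.linalg.norm(stars[i] - stars[bh_idx]) < radius:
                stars[i] = rng.uniform(LOW, HIGH, size=3)
                costs[i] = fitness(stars[i])
        bh_idx = costs.argmin()
    return 10.0 ** stars[bh_idx], costs[bh_idx]

best_params, best_mse = bha_tune()
print("best (C, epsilon, gamma):", best_params, "CV MSE:", best_mse)

In this sketch the event horizon radius follows the usual formulation (black hole fitness divided by the sum of all fitness values); a grid search baseline over the same log-spaced ranges would evaluate every combination exhaustively, which is the running-time comparison the abstract refers to.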

Keywords

Volume 13, Issue 1
March 2022
Pages 3441-3450
  • Receive Date: 12 November 2021
  • Revise Date: 20 December 2021
  • Accept Date: 08 January 2022