[1] P. Ahmadi, H. Asaadian, S. Kord, and A. Khadivi, Investigation of the simultaneous chemicals influences to promote oil-in-water emulsions stability during enhanced oil recovery applications, J. Molecul. Liquids 275 (2019), 57–70.
[2] H.S. Al-Hadhrami and M.J. Blunt, Thermally induced wettability alteration to improve oil recovery in fractured reservoirs, SPE Reserv. Eval. Engin. 4 (2001), no. 3, 179–186.
[3] E.W. Al-Shalabi and K. Sepehrnoori, A comprehensive review of low salinity/engineered water injections and their applications in sandstone and carbonate rocks, J. Petrol. Sci. Engin. 139 (2016), 137–161.
[4] M.B. Alotaibi and H.A. Nasr-El-Din, Chemistry of injection water and its impact on oil recovery in carbonate and clastics formations, SPE Int. Conf. Oilfield Chem., SPE, 2009, Paper SPE 121565.
[5] N.S. Altman, An introduction to kernel and nearest-neighbor nonparametric regression, Amer. Statist. 46 (1992), no. 3, 175–185.
[6] T. Austad and D.C. Standnes, Spontaneous imbibition of water into oil-wet carbonates, J. Petrol. Sci. Engin. 39 (2003), 363–376.
[7] E. Bahonar, Y. Ghalenoei, M. Chahardowli, and M. Simjoo, New correlations to predict oil viscosity using data mining techniques, J. Petrol. Sci. Engin. 208 (2022), 109736.
[8] M. Belgiu and L. Drăguţ, Random forest in remote sensing: A review of applications and future directions, ISPRS J. Photogram. Remote Sens. 114 (2016), 24–31.
[9] G. Biau, Analysis of a random forests model, J. Mach. Learn. Res. 13 (2012), 1063–1095.
[10] G. Biau and E. Scornet, A random forest guided tour, Test 25 (2016), 197–227.
[11] B. Boehmke and B.M. Greenwell, Hands-on Machine Learning with R, Chapman and Hall/CRC, 2019.
[12] H. Borchani, G. Varando, C. Bielza, and P. Larrañaga, A survey on multi-output regression, Wiley Interdiscip. Rev.: Data Min. Knowledge Discov. 5 (2015), no. 5, 216–233.
[13] A.L. Boulesteix, S. Janitza, J. Kruppa, and I.R. König, Overview of random forest methodology and practical guidance with emphasis on computational biology and bioinformatics, Wiley Interdiscip. Rev.: Data Min. Knowledge Disc. 2 (2012), no. 6, 493–507.
[14] L. Breiman, Random forests, Machine Learn. 45 (2001), 5–32.
[15] L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone, Classification and Regression Trees, 1st Edition, Chapman and Hall/CRC, 2017.
[16] M.Y. Chen, Predicting corporate financial distress based on integration of decision tree classification and logistic regression, Expert Syst. Appl. 38 (2011), 11261–11272.
[17] X. Chen and H. Ishwaran, Random forests for genomic data analysis, Genomics 99 (2012), 323–329.
[18] G.V. Chilingar and T.F. Yen, Some notes on wettability and relative permeabilities of carbonate reservoir rocks, II, Energy Sources 7 (1983), 67–75.
[19] H.E. Copeland, K.E. Doherty, D.E. Naugle, A. Pocewicz, and J.M. Kiesecker, Mapping oil and gas development potential in the US intermountain west and estimating impacts to species, PLoS ONE 4 (2009), no. 10, e7400.
[20] A. Criminisi, Decision forests: A unified framework for classification, regression, density estimation, manifold learning and semi-supervised learning, Found. Trends Comput. Graph. Vis. 7 (2011), 81–227.
[21] A. Criminisi, J. Shotton, and E. Konukoglu, Decision forests: A unified framework for classification, regression, density estimation, manifold learning and semi-supervised learning, Found. Trends® Comput. Graph. Vis. 7 (2012), no. 2–3, 81–227.
[22] C. Dang, L. Nghiem, E. Fedutenko, E. Gorucu, C. Yang, A. Mirzabozorg, N. Nguyen, and Z. Chen, AI based mechanistic modeling and probabilistic forecasting of hybrid low salinity chemical flooding, Fuel 261 (2020), 116445.
[23] B. Efron and T. Hastie, Computer Age Statistical Inference: Algorithms, Evidence, and Data Science, Cambridge University Press, 2016.
[24] S.J. Fathi, T. Austad, and S. Strand, Smart water as a wettability modifier in chalk: the effect of salinity and ionic composition, Energy Fuels 24 (2010), 2514–2519.
[25] S.J. Fathi, T. Austad, and S. Strand, Water-based enhanced oil recovery (EOR) by smart water: Optimal ionic composition for EOR in carbonates, Energy Fuels 25 (2011), 5173–5179.
[26] J. Gareth, W. Daniela, H. Trevor, and T. Robert, An Introduction to Statistical Learning, Springer, New York, 2013.
[27] J. Ghosn and Y. Bengio, Multi-task learning for stock selection, Adv. Neural Inf. Process. Syst. 9 (1996), 946–952.
[28] J. Gillberg, P. Marttinen, M. Pirinen, A.J. Kangas, P. Soininen, M. Ali, A.S. Havulinna, M.-R. Järvelin, M. Ala-Korpela, and S. Kaski, Multiple output regression with latent noise, J. Machine Learn. Res. 17 (2016), no. 122, 1–35.
[29] P.O. Gislason, J.A. Benediktsson, and J.R. Sveinsson, Random forest classification of multisource remote sensing and geographic data, IEEE Int. Geosci. Remote Sens. Symp., 2004, pp. 1049–1052.
[30] P.O. Gislason, J.A. Benediktsson, and J.R. Sveinsson, Random forests for land cover classification, Pattern Recogn. Lett. 27 (2006), 294–300.
[31] E. Goel and E. Abhilasha, Random forest: A review, Int. J. Adv. Res. Comput. Sci. Software Engin. 7 (2017), 251–257.
[32] P. Hall, B.U. Park, and R.J. Samworth, Choice of neighbor order in nearest-neighbor classification, Ann. Statist. 36 (2008), 2135–2152.
[33] L.D. Hallenbeck, J.E. Sylte, D.J. Ebbs, and L.K. Thomas, Implementation of the Ekofisk field waterflood, SPE Form. Eval. 6 (1991), 284–290.
[34] T. Hastie and R. Tibshirani, Discriminant adaptive nearest neighbor classification and regression, Adv. Neural Inf. Process. Syst. 8 (1995), 409–415.
[35] G. Hirasaki and D.L. Zhang, Surface chemistry of oil recovery from fractured, oil-wet, carbonate formations, SPE J. 9 (2004), 151–162.
[36] P.A. Hopkins, I. Omland, F. Layti, S. Strand, T. Puntervold, and T. Austad, Crude oil quantity and its effect on chalk surface wetting, Energy Fuels 31 (2017), 4663–4669.
[37] G.F. Hughes, On the mean accuracy of statistical pattern recognizers, IEEE Trans. Inf. Theory 14 (1968), 55–63.
[38] Z. Ibrahim and D. Rusli, Predicting students’ academic performance: Comparing artificial neural network, decision tree and linear regression, 21st Ann. SAS Malaysia Forum, 2007, pp. 1–6.
[39] S.B. Imandoust and M. Bolandraftar, Application of k-nearest neighbor (kNN) approach for predicting economic events: Theoretical background, J. Engin. Res. Appl. 3 (2013), no. 5, 605–610.
[40] J.H. Jeong, J.P. Resop, N.D. Mueller, D.H. Fleisher, K. Yun, E.E. Butler, D.J. Timlin, K.M. Shim, J.S. Gerber, V.R. Reddy, and S.H. Kim, Random forests for global and regional crop yield predictions, PLoS ONE 11 (2016), 1–15.
[41] R.J. Lewis, An introduction to classification and regression tree (CART) analysis, Ann. Meet. Soc. Acad. Emergency Med., San Francisco, CA, 2000.
[42] C.D. Manning and H. Schütze, Foundations of Statistical Natural Language Processing, MIT Press, 1999.
[43] G. Melki, A. Cano, V. Kecman, and S. Ventura, Multi-target support vector regression via correlation regressor chains, Inf. Sci. 415 (2017), 53–69.
[44] S. Mohammadi, S. Kord, and J. Moghadasi, An experimental investigation into the spontaneous imbibition of surfactant assisted low salinity water in carbonate rocks, Fuel 243 (2019), 142–154.
[45] N.R. Morrow, Wettability and its effect on oil recovery, J. Petrol. Technol. 42 (1990), no. 12, 1476–1484.
[46] C. Nguyen, Y. Wang, and H.N. Nguyen, Random forest classifier combined with feature selection for breast cancer diagnosis and prognostic, J. Biomed. Sci. Engin. 6 (2013), 551–560.
[47] O. Okun and H. Priisalu, Random forest for gene expression based cancer classification: overlooked issues, Iberian Conf. Pattern Recog. Image Anal., Springer, Berlin, Heidelberg, 2007, pp. 483–490.
[48] M. Pal, Random forests for land cover classification, IEEE Int. Geosci. Remote Sens. Symp., Vol. 6, IEEE, 2003, pp. 3510–3512.
[49] D.S. Palmer, N.M. O’Boyle, R.C. Glen, and J.B.O. Mitchell, Random forest models to predict aqueous solubility, J. Chem. Inf. Model. 47 (2007), 150–158.
[50] L.E. Peterson, K-nearest neighbor, Scholarpedia 4 (2009), no. 2, 1883.
[51] T. Puntervold, S. Strand, R. Ellouz, and T. Austad, Modified seawater as a smart EOR fluid in chalk, J. Petrol. Sci. Engin. 133 (2015), 440–443.
[52] J. Romanuka, J. Hofman, D.J. Ligthelm, B.M. Suijkerbuijk, A.H. Marcelis, S. Oedai, N.J. Brussee, A. van der Linde, H. Aksulu, and T. Austad, Low salinity EOR in carbonates, SPE Improved Oil Recovery Conf., SPE, 2012.
[53] P. Royston and D.G. Altman, Risk stratification for in-hospital mortality in acutely decompensated heart failure, JAMA 293 (2005), no. 20, 2467–2468.
[54] S.F. Shariatpanahi, P. Hopkins, H. Aksulu, S. Strand, T. Puntervold, and T. Austad, Water based EOR by wettability alteration in dolomite, Energy Fuels 30 (2016), no. 1, 180–187.
[55] E. Scornet, On the asymptotics of random forests, J. Multivar. Anal. 146 (2016), 72–83.
[56] E. Scornet, G. Biau, and J.P. Vert, Consistency of random forests, Ann. Statist. 43 (2015), 1716–1741.
[57] Y.Y. Song and Y. Lu, Decision tree methods: Applications for classification and prediction, Shanghai Arch. Psychiatry 27 (2015), no. 2, 130–135.
[58] D.C. Standnes and T. Austad, Wettability alteration in carbonates: Interaction between cationic surfactant and carboxylates as a key factor in wettability alteration from oil-wet to water-wet conditions, Colloids Surfaces A: Physicochem. Engin. Aspects 216 (2003), 243–259.
[59] S. Strand, T. Puntervold, and T. Austad, Effect of temperature on enhanced oil recovery from mixed-wet chalk cores by spontaneous imbibition and forced displacement using seawater, Energy Fuels 22 (2008), no. 5, 3222–3225.
[60] V. Svetnik, A. Liaw, C. Tong, J.C. Culberson, R.P. Sheridan, and B.P. Feuston, Random forest: A classification and regression tool for compound classification and QSAR modeling, J. Chem. Inf. Comput. Sci. 43 (2003), no. 6, 1947–1958.
[61] K. Tatsumi, Y. Yamashiki, M.A.C. Torres, and C.L.R. Taipe, Crop classification of upland fields using Random forest of time-series Landsat 7 ETM+ data, Comput. Electron. Agricul. 115 (2015), 171–179.
[62] G.K.F. Tso and K.K.W. Yau, Predicting electricity energy consumption: A comparison of regression analysis, decision tree and neural networks, Energy 32 (2007), 1761–1768.
[63] H. Tyralis, G. Papacharalampous, and A. Langousis, A brief review of random forests for water scientists and practitioners and their recent history in water resources, Water 11 (2019), no. 5, 910.
[64] X. Xi, V.S. Sheng, B. Sun, L. Wang, and F. Hu, An empirical comparison on multi-target regression learning, Comput. Mater. Continua 56 (2018), no. 2, 185–198.
[65] M. Xu, P. Watanachaturaporn, P.K. Varshney, and M.K. Arora, Decision tree regression for soft classification of remote sensing data, Remote Sens. Envir. 97 (2005), 322–336.
[66] Z. Yao and W.L. Ruzzo, A regression-based K nearest neighbor algorithm for gene function prediction from heterogeneous data, BMC Bioinf. 7 (2006), 1–11.
[67] Z. Yu, F. Haghighat, B.C.M. Fung, and H. Yoshino, A decision tree method for building energy demand modelling, Energy Build. 42 (2010), 1637–1646.
[68] H.M. Zawbaa, M. Hazman, M. Abbass, and A.E. Hassanien, Automatic fruit classification using random forest algorithm, 14th Int. Conf. Hybrid Intell. Syst., 2014, pp. 164–168.
[69] P. Zhang and T. Austad, Wettability and oil recovery from carbonates: Effects of temperature and potential determining ions, Colloids Surfaces A: Physicochem. Engin. Aspects 279 (2006), 179–187.
[70] S. Zhang, X. Li, M. Zong, X. Zhu, and D. Cheng, Learning k for kNN classification, ACM Trans. Intell. Syst. Technol. 8 (2017), no. 3, 1–19.
[71] Y. Zhang and N.R. Morrow, Comparison of secondary and tertiary recovery with change in injection brine composition for crude oil/sandstone combinations, SPE Improved Oil Recovery Conf., SPE, 2006.
[72] X. Zhen, M. Yu, X. He, and S. Li, Multi-target regression via robust low-rank learning, IEEE Trans. Pattern Anal. Machine Intell. 40 (2018), 497–504.
[73] X. Zhen, M. Yu, F. Zheng, I.B. Nachum, M. Bhaduri, D. Laidley, and S. Li, Multitarget sparse latent regression, IEEE Trans. Neural Networks Learn. Syst. 29 (2017), no. 5, 1575–1586.
[74] A. Ziegler and I.R. König, Mining data with random forests: Current options for real-world applications, Wiley Interdiscip. Rev.: Data Min. Knowledge Disc. 4 (2014), 55–63.