[1] H. Chun, S. Keleş, Sparse partial least squares regression for simultaneous dimension reduction and variable selection, J. R. Stat. Soc. Ser. B 72 (2010) 3–25.
[2] R.D. Cook, Regression Graphics: Ideas for Studying Regressions Through Graphics, Wiley, New York, 1998.
[3] B. Efron, T. Hastie, I. Johnstone, R. Tibshirani, Least angle regression, Ann. Stat. 32 (2004) 407–499.
[4] J. Fan, R. Li, Variable selection via nonconcave penalized likelihood and its oracle properties, J. Amer. Stat. Assoc. 96 (2001) 1348–1360.
[5] H. Zou, The adaptive Lasso and its oracle properties, J. Amer. Stat. Assoc. 101 (2006) 1418–1429.
[6] L. Breiman, Better subset regression using the nonnegative garrote, Technometrics 37 (1995) 373–384.
[7] K.-C. Li, Sliced inverse regression for dimension reduction (with discussion), J. Amer. Stat. Assoc. 86 (1991) 316–342.
[8] L. Li, B. Li, L.-X. Zhu, Groupwise dimension reduction, J. Amer. Stat. Assoc. 105 (2010) 1188–1201.
[9] R. Luo, X. Qi, Signal extraction approach for sparse multivariate response regression, J. Multivar. Anal. 153 (2017) 83–97.
[10] Q. Wang, X. Yin, A nonlinear multi-dimensional variable selection method for high dimensional data: Sparse MAVE, Comput. Stat. Data Anal. 52 (2008) 4512–4520.
[11] R. Tibshirani, Regression shrinkage and selection via the Lasso, J. R. Stat. Soc. Ser. B 58 (1996) 267–288.
[12] T. Wang, P.-R. Xu, L.-X. Zhu, Penalized minimum average variance estimation, Stat. Sinica 23 (2013) 543–569.
[13] T. Wang, P.-R. Xu, L.-X. Zhu, Variable selection and estimation for semiparametric multiple-index models, Bernoulli 21 (2015) 242–275.
[14] Y. Xia, H. Tong, W.K. Li, L.-X. Zhu, An adaptive estimation of dimension reduction space, J. R. Stat. Soc. Ser. B 64 (2002) 363–410.
[15] K. Yu, A. Alkenani, Sparse MAVE with oracle penalties, Adv. Appl. Stat. (2013) 85–105.
[16] H. Zou, T. Hastie, Regularization and variable selection via the elastic net, J. R. Stat. Soc. Ser. B 67 (2005) 301–320.