In many applications, indexing and analysis of high-dimensional data have become increasingly important. Such data sets can involve thousands, if not millions, of dimensions, and classical methods cannot handle them: when the number of predictors exceeds the sample size, it is generally impossible to fit an efficient model by classical means, so appropriate alternative methods are required. A popular strategy for combating the curse of high dimensionality is to solve a penalized least squares optimization problem, which combines the residual sum of squares, a loss function measuring how well the fitted model matches the data, with penalty terms that promote the underlying structure. Penalized methods can therefore analyse high-dimensional data sets and provide a good fit. In this paper, we describe several such approaches and apply them to the eye data set in order to investigate their computational performance.
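As a minimal sketch of the penalized least squares idea, the following Python snippet fits an L1-penalized (lasso) regression on synthetic data with far more predictors than observations. The synthetic data, the penalty level, and the use of scikit-learn are illustrative assumptions, not the paper's actual methods or the eye data set.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic p >> n setting: 50 observations, 500 predictors,
# of which only 5 are truly active (a sparse underlying structure).
rng = np.random.default_rng(0)
n, p = 50, 500
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, -1.0, 2.5]
y = X @ beta + 0.1 * rng.standard_normal(n)

# Penalized least squares: minimize
#   (1 / (2n)) * ||y - X b||_2^2 + alpha * ||b||_1.
# The L1 penalty drives most coefficients to exactly zero,
# so a meaningful model can be fitted even though p > n.
model = Lasso(alpha=0.1).fit(X, y)
n_selected = int(np.count_nonzero(model.coef_))
print(f"nonzero coefficients: {n_selected} of {p}")
```

Ordinary least squares has no unique solution here (the design matrix has rank at most n), whereas the penalty term both regularizes the problem and performs variable selection.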