A differential geometric approach to generalize the least angle regression method

Speaker: Luigi Augugliaro (Department of Statistics, University of Palermo)


Nowadays, high-dimensional variable selection plays an important role in regression models applied in many areas of modern scientific research, such as microarray analysis, genomics, or proteomics. In these settings the number of variables is much larger than the sample size, so traditional statistical methods cannot be used, because they were not developed to deal with these new aspects of the problem (Donoho, 2000). Many modern variable selection methods for high-dimensional statistical models are based on a penalized likelihood approach. The lasso estimator proposed by Tibshirani (1996), the smoothly clipped absolute deviation (SCAD) method developed by Fan and Li (2001), the L1-regularization path-following algorithm for generalized linear models proposed by Park and Hastie (2007), and the Dantzig selector proposed by Candès and Tao (2007) are only some of the most popular methods used to select relevant variables in a regression model. In a recent paper, Augugliaro et al. (2013) proposed an extension of the least angle regression method (Efron et al., 2004), called differential geometric least angle regression (dgLARS), based on the relationship between a generalized linear model (McCullagh and Nelder, 1989) and differential geometry. In this talk I will present the main ideas underlying this method, the computational aspects on which the dglars package (Augugliaro et al., 2014) is based, and an extension to generalized linear models with grouped predictors (Augugliaro et al., 2016).
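To make the p ≫ n setting concrete, the following is a minimal sketch of penalized-likelihood variable selection using the lasso via scikit-learn. This illustrates the general idea discussed above, not the dgLARS method or the dglars R package itself; the data, penalty value, and variable names are illustrative assumptions.

```python
# Illustrative sketch (not the dgLARS method): lasso variable selection
# when the number of predictors far exceeds the sample size.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 200                      # many more predictors than observations
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0                      # only the first 5 predictors are relevant
y = X @ beta + 0.1 * rng.standard_normal(n)

# The L1 penalty shrinks most coefficients exactly to zero, so the fit
# performs variable selection and estimation simultaneously.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)
print("selected predictors:", selected)
```

Varying the penalty `alpha` traces out a path of increasingly sparse models, which is the kind of solution path that path-following algorithms such as LARS and its dgLARS generalization compute efficiently.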