## Abstract

Sparsity is an essential feature of many contemporary data problems. Remote sensing, various forms of automated screening and other high throughput measurement devices collect a large amount of information, typically about a few independent statistical subjects or units. In certain cases it is reasonable to assume that the underlying process generating the data is itself sparse, in the sense that only a few of the measured variables are involved in the process. We propose an explicit method of monotonically decreasing sparsity for outcomes that can be modelled by an exponential family. In our approach we generalize the equiangular condition in a generalized linear model. Although the geometry involves the Fisher information in a way that is not obvious in the simple regression setting, the equiangular condition turns out to be equivalent to an intuitive condition imposed on the Rao score test statistics. In certain special cases the method can be tweaked to obtain L1-penalized generalized linear model solution paths, but the method itself defines sparsity more directly. Although the computation of the solution paths is not trivial, the method compares favourably with other path following algorithms.
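The equivalence between the equiangular condition and a condition on the Rao score statistics can be illustrated in the Gaussian special case, where the score for variable j reduces to the inner product of x_j with the residual. The sketch below (a hypothetical illustration, not the paper's algorithm; the paper treats general exponential families, where the Fisher information enters the geometry) fits an L1-penalized least-squares solution by cyclic coordinate descent and then verifies the stationarity condition: active variables have score statistics equal in absolute value to the penalty level, while inactive ones lie strictly inside.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 6
X = rng.standard_normal((n, p))
X -= X.mean(axis=0)
X /= np.linalg.norm(X, axis=0)          # unit-norm columns
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.1 * rng.standard_normal(n)
y -= y.mean()

lam = 0.5                               # L1 penalty level
b = np.zeros(p)
for _ in range(500):                    # cyclic coordinate descent
    for j in range(p):
        r_j = y - X @ b + X[:, j] * b[j]        # partial residual
        z = X[:, j] @ r_j                       # coordinate-wise score
        b[j] = np.sign(z) * max(abs(z) - lam, 0.0)  # soft-threshold

score = X.T @ (y - X @ b)               # Gaussian Rao score statistics
active = np.abs(b) > 1e-8
print(np.abs(score[active]))            # each equals lam at the optimum
print(np.abs(score[~active]))           # each lies inside [0, lam]
```

At the solution the active variables are "equiangular" with the residual in exactly the sense that their absolute score statistics are tied at the penalty level; in the exponential-family setting of the paper this tie is measured in the Fisher-information geometry rather than the Euclidean one.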

| Original language | English |
|---|---|
| Pages (from-to) | 471-498 |
| Number of pages | 28 |
| Journal | Journal of the Royal Statistical Society. Series B: Statistical Methodology |
| Volume | 75 |
| DOIs | |
| Publication status | Published - 2013 |

## Keywords

- Covariance penalty theory
- Differential geometry
- Generalized degrees of freedom
- Generalized linear models
- Information geometry
- Least angle regression
- Path following algorithm
- Sparse models
- Variable selection
- Dantzig selector
- Coordinate descent
- Lasso
- Shrinkage
- Error