Elementary Estimators for Sparse Covariance Matrices and other Structured Moments
Eunho Yang, Aurelie C. Lozano, et al.
ICML 2014
We consider the problem of estimating expectations of vector-valued feature functions, a special case of which is estimating the covariance matrix of a random vector. We are interested in recovery under high-dimensional settings, where the number of features p is potentially larger than the number of samples n, and where we need to impose structural constraints. In a natural distributional setting for this problem, the feature functions comprise the sufficient statistics of an exponential family, so that the problem would entail estimating structured moments of exponential family distributions. For instance, in the special case of covariance estimation, the natural distributional setting would correspond to the multivariate Gaussian distribution. Unlike the inverse covariance estimation case, we show that the regularized MLEs for covariance estimation, as well as natural Dantzig variants, are non-convex, even when the regularization functions themselves are convex; the same holds for the general structured moment case. We propose a class of elementary convex estimators, which are in many cases available in closed form, for estimating general structured moments. We then provide a unified statistical analysis of our class of estimators. Finally, we demonstrate the applicability of our class of estimators via simulation and on real-world climatology and biology datasets.
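As a rough illustration of the closed-form flavor of such elementary estimators, the sketch below works out the sparse-covariance special case, where entrywise soft-thresholding of the sample covariance solves the elementwise program min |t| subject to |t - s| <= lam. The function name, the threshold level lam, and the choice to leave the diagonal unthresholded are illustrative assumptions, not prescriptions from the paper.

import numpy as np

def soft_threshold_covariance(X, lam):
    """Illustrative closed-form sparse covariance estimate (assumed form).

    Entrywise, soft-thresholding solves min |t| subject to |t - s| <= lam,
    applied to every entry s of the sample covariance S.
    """
    S = np.cov(X, rowvar=False)                         # p x p sample covariance
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)   # entrywise soft-thresholding
    np.fill_diagonal(T, np.diag(S))                     # keep variances unthresholded (illustrative choice)
    return T

# Toy high-dimensional usage: p = 200 features, n = 50 samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
Sigma_hat = soft_threshold_covariance(X, lam=np.sqrt(np.log(200) / 50))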