University of Minnesota
Parsimonious Modeling of the Inverse Regression
Abstract: In the first part of the talk, we propose a new class of estimators of the multivariate response linear regression coefficient matrix that exploits the assumption that the response and predictors have a joint multivariate Normal distribution. This allows us to estimate the regression coefficient matrix indirectly, through shrinkage estimation of the parameters of the inverse regression, i.e., the conditional distribution of the predictors given the responses. We establish a convergence rate bound for estimators in our class. These estimators do not require the popular assumption that the forward regression coefficient matrix is sparse or has small Frobenius norm.

In the second part of the talk, we propose a penalized likelihood method to fit the linear discriminant analysis model when the predictor is matrix valued. We assume the precision matrix has a Kronecker product decomposition. Our penalties encourage pairs of response-category mean matrix estimators to have equal entries and also encourage zeros in the precision matrix estimator. To compute our estimators, we use a blockwise coordinate descent algorithm. To update the optimization variables corresponding to the response-category mean matrices, we use an alternating minimization algorithm that exploits the Kronecker structure of the precision matrix.
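The first part rests on a population identity: when (X, Y) are jointly Normal, the forward coefficient matrix can be written entirely in terms of the inverse-regression parameters. A minimal numerical sketch (the notation below is our own, not necessarily the speaker's): with forward coefficients B = Sigma_X^{-1} Sigma_XY, inverse-regression slopes A = Sigma_XY Sigma_Y^{-1}, and inverse-regression residual covariance Delta = Sigma_X - A Sigma_Y A^T, one recovers B = (Delta + A Sigma_Y A^T)^{-1} A Sigma_Y.

```python
import numpy as np

# Sketch of the inverse-regression identity (our notation, for illustration):
# for jointly Normal (X, Y), the forward coefficient matrix can be recovered
# from the parameters of the regression of X on Y.
rng = np.random.default_rng(0)
p, q = 5, 3  # number of predictors, number of responses

# Random positive-definite joint covariance of (X, Y).
M = rng.standard_normal((p + q, p + q))
Sigma = M @ M.T + (p + q) * np.eye(p + q)
Sigma_X = Sigma[:p, :p]
Sigma_XY = Sigma[:p, p:]
Sigma_Y = Sigma[p:, p:]

# Forward regression coefficient matrix: B = Sigma_X^{-1} Sigma_XY.
B_forward = np.linalg.solve(Sigma_X, Sigma_XY)

# Inverse regression of X on Y: slope A and residual covariance Delta.
A = Sigma_XY @ np.linalg.inv(Sigma_Y)
Delta = Sigma_X - A @ Sigma_Y @ A.T

# Recover B indirectly from the inverse-regression parameters.
B_indirect = np.linalg.solve(Delta + A @ Sigma_Y @ A.T, A @ Sigma_Y)

print(np.allclose(B_forward, B_indirect))  # True
```

Shrinking the inverse-regression parameters (A, Delta) before plugging into this identity is what lets the resulting estimator of B avoid direct sparsity or small-norm assumptions on B itself.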
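For the second part, the computational payoff of the Kronecker product decomposition is that the (rc) x (rc) precision matrix of a vectorized r x c predictor never has to be formed: the Gaussian quadratic form factors through the small row and column precision pieces. A brief sketch of that identity (variable names here are illustrative assumptions):

```python
import numpy as np

# With column-stacking vec(X) and precision Omega = Omega_c "kron" Omega_r,
# the quadratic form satisfies vec(X)' Omega vec(X) = tr(X' Omega_r X Omega_c),
# so algorithms can work with the small r x r and c x c factors.
rng = np.random.default_rng(1)
r, c = 4, 3  # rows and columns of the matrix-valued predictor

# Random symmetric positive-definite row and column precision factors.
Mr = rng.standard_normal((r, r)); Omega_r = Mr @ Mr.T + r * np.eye(r)
Mc = rng.standard_normal((c, c)); Omega_c = Mc @ Mc.T + c * np.eye(c)

X = rng.standard_normal((r, c))
v = X.flatten('F')  # column-stacking vectorization vec(X)

# Full (rc) x (rc) precision under the Kronecker decomposition.
Omega = np.kron(Omega_c, Omega_r)

full = v @ Omega @ v
factored = np.trace(X.T @ Omega_r @ X @ Omega_c)
print(np.isclose(full, factored))  # True
```

This factorization is what the alternating minimization updates for the response-category mean matrices exploit: each update touches only one small precision factor at a time.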
Tuesday January 17, 2017 at 3:00 PM in SEO 636