Bayesian sparse multiple regression for simultaneous rank reduction and variable selection. Academic Article

abstract

  • We develop a Bayesian methodology aimed at simultaneously estimating low-rank and row-sparse matrices in a high-dimensional multiple-response linear regression model. We consider a carefully devised shrinkage prior on the matrix of regression coefficients, which obviates the need to specify a prior on the rank and shrinks the regression matrix towards low-rank and row-sparse structures. We provide theoretical support for the proposed methodology by proving minimax optimality of the posterior mean under the prediction risk in ultra-high-dimensional settings where the number of predictors can grow sub-exponentially relative to the sample size. A one-step post-processing scheme induced by group lasso penalties on the rows of the estimated coefficient matrix is proposed for variable selection, with default choices of tuning parameters. We additionally provide an estimate of the rank using a novel optimization function, achieving dimension reduction in the covariate space. We demonstrate the performance of the proposed methodology in an extensive simulation study and a real-data example.
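
    The sketch below is a minimal, hedged illustration of the modelling setup described in the abstract, not the authors' Bayesian shrinkage-prior procedure or their one-step post-processing: it simulates a multiple-response regression Y = XB + E with a low-rank, row-sparse coefficient matrix B, and uses scikit-learn's MultiTaskLasso (whose penalty is the sum of l2 norms of coefficient rows, i.e. a group lasso over rows) as a stand-in for group-lasso-based variable selection. The dimensions, tuning value, and thresholds are illustrative assumptions.

    ```python
    # Illustrative sketch only: row-sparse, low-rank multi-response regression
    # with a group-lasso (row-norm) fit for joint variable selection.
    import numpy as np
    from sklearn.linear_model import MultiTaskLasso

    rng = np.random.default_rng(0)
    n, p, q, r, s = 100, 50, 10, 3, 8   # samples, predictors, responses, rank, active rows

    # Coefficient matrix with only s nonzero rows and rank at most r.
    B = np.zeros((p, q))
    B[:s, :] = rng.normal(size=(s, r)) @ rng.normal(size=(r, q))

    X = rng.normal(size=(n, p))
    Y = X @ B + 0.5 * rng.normal(size=(n, q))

    # Group-lasso-type fit: penalising row norms zeroes out entire rows,
    # so predictors are selected jointly across all responses.
    fit = MultiTaskLasso(alpha=0.1, max_iter=5000).fit(X, Y)
    B_hat = fit.coef_.T                 # sklearn stores coefficients as (q, p)

    selected = np.flatnonzero(np.linalg.norm(B_hat, axis=1) > 1e-8)
    print("selected rows:", selected)   # ideally close to the true active rows 0..s-1

    # Crude rank summary of the fitted matrix from its singular values.
    sv = np.linalg.svd(B_hat, compute_uv=False)
    print("approximate rank (singular values > 1% of max):", int(np.sum(sv > 0.01 * sv[0])))
    ```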

published proceedings

  • Biometrika

author list (cited authors)

  • Chakraborty, A., Bhattacharya, A., & Mallick, B. K.

citation count

  • 5

complete list of authors

  • Chakraborty, Antik; Bhattacharya, Anirban; Mallick, Bani K.

publication date

  • March 2020