Oracle inequalities for high-dimensional prediction (Academic Article)

abstract

  • © 2019 ISI/BS. The abundance of high-dimensional data in the modern sciences has generated tremendous interest in penalized estimators such as the lasso, scaled lasso, square-root lasso, elastic net, and many others. In this paper, we establish a general oracle inequality for prediction in high-dimensional linear regression with such methods. Since the proof relies only on convexity and continuity arguments, the result holds irrespective of the design matrix and applies to a wide range of penalized estimators. Overall, the bound demonstrates that generic estimators can provide consistent prediction with any design matrix. From a practical point of view, the bound can help to identify the potential of specific estimators, and it can give a sense of the prediction accuracy in a given application.

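illustrative sketch

  • A minimal LaTeX sketch, assuming the standard linear model y = Xβ* + ε with n observations and p predictors: the penalized estimators named in the abstract (lasso, square-root lasso, elastic net, and so on) fit the generic template below, and an oracle inequality for prediction bounds the in-sample prediction error by an oracle trade-off term. The symbols λ (tuning parameter) and h (convex penalty, e.g. the ℓ1-norm for the lasso) and the constants are illustrative, not quoted from the paper, which states its bound in full generality.

      % generic penalized least-squares estimator (lasso: h(\beta) = \lVert\beta\rVert_1)
      \hat{\beta}_{\lambda} \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^{p}}
        \left\{ \frac{1}{n}\,\lVert y - X\beta \rVert_2^2 + \lambda\, h(\beta) \right\}

      % schematic form of an oracle inequality for prediction: for \lambda chosen
      % large enough relative to the noise, the prediction error is bounded by the
      % best trade-off between approximation error and penalty, for any design X
      \frac{1}{n}\,\lVert X\hat{\beta}_{\lambda} - X\beta^{*} \rVert_2^2
        \;\leq\; \inf_{\beta \in \mathbb{R}^{p}}
        \left\{ \frac{1}{n}\,\lVert X\beta - X\beta^{*} \rVert_2^2 + 2\lambda\, h(\beta) \right\}
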
published proceedings

  • Bernoulli

altmetric score

  • 0.25

author list (cited authors)

  • Lederer, J., Yu, L., & Gaynanova, I.

citation count

  • 20

complete list of authors

  • Lederer, Johannes||Yu, Lu||Gaynanova, Irina

publication date

  • May 2019