Learning from Non-Random Data in Hilbert Spaces: An Optimal Recovery Perspective Academic Article

abstract

  • The notion of generalization in classical Statistical Learning is often attached to the postulate that data points are independent and identically distributed (IID) random variables. While relevant in many applications, this postulate may not hold in general, encouraging the development of learning frameworks that are robust to non-IID data. In this work, we consider the regression problem from an Optimal Recovery perspective. Relying on a model assumption comparable to choosing a hypothesis class, a learner aims to minimize the worst-case error, without recourse to any probabilistic assumption on the data. We first develop a semidefinite program for computing the worst-case error of any recovery map in finite-dimensional Hilbert spaces. Then, for any Hilbert space, we show that Optimal Recovery provides a formula that is user-friendly from an algorithmic point of view, as long as the hypothesis class is linear. Interestingly, this formula coincides with kernel ridgeless regression in some cases, showing that minimizing the average error and the worst-case error can yield the same solution. We provide numerical experiments in support of our theoretical findings.
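The abstract notes that the Optimal Recovery formula coincides in some cases with kernel ridgeless regression, i.e., the minimum-RKHS-norm interpolant obtained by solving the kernel system without a ridge penalty. A minimal sketch of that estimator, assuming a Gaussian kernel (the function names `gaussian_kernel` and `ridgeless_fit` are illustrative, not from the paper):

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def ridgeless_fit(X, y, gamma=1.0):
    # Kernel "ridgeless" regression: solve K @ alpha = y with no
    # regularization, giving the interpolant of minimal RKHS norm.
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K, y)
    return lambda X_new: gaussian_kernel(X_new, X, gamma) @ alpha

# Toy data on a grid (well-conditioned Gram matrix for this sketch).
X = np.linspace(-1.0, 1.0, 8).reshape(-1, 1)
y = np.sin(3.0 * X[:, 0])
f = ridgeless_fit(X, y, gamma=10.0)

# With no ridge penalty, the fit interpolates the training data.
max_err = np.max(np.abs(f(X) - y))
print(max_err)
```

Because there is no ridge term, the predictor passes through every training point; the worst-case optimality discussed in the abstract concerns its behavior away from the data under a linear model assumption.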

author list (cited authors)

  • Foucart, S., Liao, C., Shahrampour, S., & Wang, Y.

complete list of authors

  • Foucart, Simon||Liao, Chunyang||Shahrampour, Shahin||Wang, Yinsong

publication date

  • January 2020