On the Improved Rates of Convergence for Matérn-Type Kernel Ridge Regression with Application to Calibration of Computer Models (Academic Article)

abstract

  • Kernel ridge regression is an important nonparametric method for estimating smooth functions. We introduce a new set of conditions under which the actual rates of convergence of the kernel ridge regression estimator, under both the L_2 norm and the norm of the reproducing kernel Hilbert space, exceed the standard minimax rates. An application of this theory leads to a new understanding of the Kennedy-O'Hagan approach for calibrating the model parameters of computer simulations. We prove that, under certain conditions, the Kennedy-O'Hagan calibration estimator with a known covariance function converges to the minimizer of the norm of the residual function in the reproducing kernel Hilbert space.

published proceedings

  • SIAM/ASA Journal on Uncertainty Quantification

author list (cited authors)

  • Tuo, R., Wang, Y., & Wu, C. F. J.

citation count

  • 11

complete list of authors

  • Tuo, Rui||Wang, Yan||Wu, CF Jeff

publication date

  • January 2020