Exact Sample Conditioned MSE Performance of the Bayesian MMSE Estimator for Classification Error-Part I: Representation Academic Article

abstract

  • In recent years, biomedicine has been faced with difficult high-throughput small-sample classification problems. In such settings, classifier error estimation becomes a critical issue because training and testing must be done on the same data. A recently proposed error estimator places the problem in a signal estimation framework in the presence of uncertainty, permitting a rigorous solution that is optimal in a minimum-mean-square error sense. The uncertainty in this model is relative to the parameters of the feature-label distributions, resulting in a Bayesian approach to error estimation. Closed-form solutions are available for two important problems: discrete classification with Dirichlet priors and linear classification of Gaussian distributions with normal-inverse-Wishart priors. In this work, Part I of a two-part study, we introduce the theoretical mean-square error (MSE) conditioned on the observed sample of any estimate of the classifier error, including the Bayesian error estimator, for both Bayesian models. Thus, Bayesian error estimation has a unique advantage in that its mathematical framework naturally gives rise to a practical expected measure of performance given an observed sample. In Part II of the study, we examine consistency of the error estimator, demonstrate various MSE properties, and apply the conditional MSE to censored sampling.
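  For intuition, the sketch below illustrates the kind of closed-form computation the abstract refers to for the discrete model. It assumes a two-class discrete feature space with a known class-0 prior probability c, independent Dirichlet priors on the per-class bin probabilities, and a fixed trained classifier; the function name, parameterization, and notation are illustrative assumptions rather than the paper's own, and the paper's full treatment is more general. Under these assumptions, the Bayesian MMSE error estimate is the posterior mean of the true error, and its sample-conditioned MSE is the posterior variance of the true error.

```python
import numpy as np

def bayesian_mmse_error_and_conditional_mse(classifier, c, alpha, beta, U, V):
    """Bayesian MMSE error estimate and its sample-conditioned MSE.

    Assumed model (illustrative, not the paper's exact notation):
    two classes, known class-0 prior probability c, Dirichlet(alpha) and
    Dirichlet(beta) priors on the bin probabilities of class 0 and class 1,
    observed bin counts U (class 0) and V (class 1), and a fixed trained
    classifier given as an array of 0/1 labels, one per bin.
    """
    a = alpha + U                  # posterior Dirichlet parameters, class 0
    b = beta + V                   # posterior Dirichlet parameters, class 1
    a0, b0 = a.sum(), b.sum()

    # True error: eps = c * A + (1 - c) * B, where
    # A = total class-0 probability on bins the classifier labels 1,
    # B = total class-1 probability on bins the classifier labels 0.
    aS = a[classifier == 1].sum()
    bS = b[classifier == 0].sum()

    # Aggregation property of the Dirichlet: A ~ Beta(aS, a0 - aS) a posteriori,
    # and similarly for B, so posterior means and variances are in closed form.
    EA, EB = aS / a0, bS / b0
    eps_hat = c * EA + (1 - c) * EB            # Bayesian MMSE error estimate

    varA = aS * (a0 - aS) / (a0 ** 2 * (a0 + 1))
    varB = bS * (b0 - bS) / (b0 ** 2 * (b0 + 1))
    var_eps = c ** 2 * varA + (1 - c) ** 2 * varB

    # For any other error estimate e, the sample-conditioned MSE is
    # var_eps + (e - eps_hat)**2; for the Bayesian estimate it is var_eps.
    return eps_hat, var_eps

# Example usage with 4 bins, uniform priors, and a known prior c = 0.5.
psi = np.array([0, 0, 1, 1])                   # trained classifier labels per bin
est, mse = bayesian_mmse_error_and_conditional_mse(
    psi, 0.5,
    alpha=np.ones(4), beta=np.ones(4),
    U=np.array([5, 3, 1, 0]), V=np.array([0, 1, 4, 6]))
```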

published proceedings

  • IEEE TRANSACTIONS ON SIGNAL PROCESSING

author list (cited authors)

  • Dalton, L. A., & Dougherty, E. R.

citation count

  • 31

publication date

  • May 2012