Learning by canonical smooth estimation. I. Simultaneous estimation (Academic Article)

abstract

  • This paper examines the problem of learning from examples in a framework that is based on, but more general than, Valiant's probably approximately correct (PAC) model for learning. In our framework, the learner observes examples that consist of sample points drawn and labeled according to a fixed, unknown probability distribution. Based on this empirical data, the learner must select, from a set of candidate functions, a particular function or "hypothesis" that will accurately predict the labels of future sample points. The expected mismatch between a hypothesis' prediction and the label of a new sample point is called the hypothesis' "generalization error." Following the pioneering work of Vapnik and Chervonenkis, others have attacked this sort of learning problem by finding hypotheses that minimize the relative frequency-based empirical error estimate. We generalize this approach by examining the "simultaneous estimation" problem: When does some procedure exist for estimating the generalization error of all of the candidate hypotheses, simultaneously, from the same labeled sample? We demonstrate how one can learn from such a simultaneous error estimate and propose a new class of estimators called "smooth estimators" that, in many cases of interest, contains the empirical estimator. We characterize the class of simultaneous estimation problems solvable by a smooth estimator and give a canonical form for the smooth simultaneous estimator. © 1996 IEEE.
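    A minimal sketch of the baseline approach the abstract refers to, namely estimating each candidate's generalization error by its relative frequency of mismatches on the labeled sample (the empirical error estimate) and selecting the hypothesis that minimizes it. This is illustrative only: the threshold classifiers, noise level, and helper names below are hypothetical and are not the smooth or canonical estimators developed in the paper.

    ```python
    import random

    def empirical_error(hypothesis, sample):
        """Relative frequency of mismatches between predictions and labels."""
        mismatches = sum(1 for x, y in sample if hypothesis(x) != y)
        return mismatches / len(sample)

    def select_hypothesis(candidates, sample):
        """Pick the candidate whose estimated error on the sample is smallest."""
        return min(candidates, key=lambda h: empirical_error(h, sample))

    # Hypothetical setup: points on [0, 1] labeled by an unknown threshold rule
    # with some label noise, mimicking a fixed, unknown distribution.
    random.seed(0)
    true_threshold = 0.6
    sample = []
    for _ in range(200):
        x = random.random()
        y = 1 if x >= true_threshold else 0
        if random.random() < 0.1:  # label noise
            y = 1 - y
        sample.append((x, y))

    # Candidate hypotheses: threshold classifiers at 0.00, 0.05, ..., 1.00.
    candidates = [lambda x, t=t / 20: int(x >= t) for t in range(21)]

    best = select_hypothesis(candidates, sample)
    print("empirical error of selected hypothesis:",
          empirical_error(best, sample))
    ```

    The paper's "simultaneous estimation" question asks when a single procedure can estimate the error of every candidate at once from the same sample in this way; the empirical estimator above is one member of the broader class of smooth estimators the authors characterize.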

published proceedings

  • IEEE Transactions on Automatic Control

author list (cited authors)

  • Buescher, K. L., & Kumar, P. R.

citation count

  • 17

complete list of authors

  • Buescher, K. L.; Kumar, P. R.

publication date

  • April 1996