Learning stochastic functions by smooth simultaneous estimation (Conference Paper)

abstract

  • To learn, it suffices to estimate the error of all candidate hypotheses simultaneously. We study the problem of when this 'simultaneous estimation' is possible and show that it leads to new learning procedures and weaker sufficient conditions for a broad class of learning problems. We modify the standard Probably Approximately Correct (PAC) setup to allow concepts that are 'stochastic functions.' A deterministic function maps a set X into a set Y, whereas a stochastic function is a probability distribution on X×Y. We approach the simultaneous estimation problem by concentrating on a subset of all estimators, those that satisfy a natural 'smoothness' constraint. The common empirical estimator falls within this class. We show that smooth simultaneous estimability can be characterized by a sampling-based criterion. Also, we describe a canonical estimator for this class of problems. This canonical estimator has a distinctive form: it uses part of the samples to select a finite subset of hypotheses that approximates the class of candidate hypotheses, and then it uses the rest of the samples to estimate the error of each hypothesis in the subset. Finally, we show that a learning procedure based on the canonical estimator will work in every case where empirical error minimization does.
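The two-phase canonical estimator described in the abstract can be illustrated with a small sketch. This is not the authors' construction (which selects an approximating finite subset of the hypothesis class, not necessarily by empirical performance); here, as a simplifying assumption, the first phase keeps the `n_select` hypotheses with lowest empirical error on the first half of the samples, and the second phase estimates each survivor's error on the held-out half. All names (`canonical_estimator`, `emp_error`, the threshold-classifier example) are hypothetical.

```python
import random

def canonical_estimator(hypotheses, samples, n_select=5):
    """Illustrative two-phase estimator (a sketch, not the paper's exact
    construction).

    Phase 1: use the first half of the samples to pick a small finite
    subset of hypotheses (here, those with lowest empirical error, as a
    stand-in for an approximating subset of the class).
    Phase 2: estimate each selected hypothesis's error on the remaining
    fresh samples, and return (best_hypothesis, estimated_error).
    """
    half = len(samples) // 2
    first, second = samples[:half], samples[half:]

    def emp_error(h, data):
        # Empirical misclassification rate of hypothesis h on data.
        return sum(1 for x, y in data if h(x) != y) / len(data)

    # Phase 1: select a finite approximating subset of hypotheses.
    selected = sorted(hypotheses, key=lambda h: emp_error(h, first))[:n_select]

    # Phase 2: estimate errors on samples not used for selection.
    estimates = [(h, emp_error(h, second)) for h in selected]
    return min(estimates, key=lambda t: t[1])


# Example stochastic function on X = [0, 1], Y = {0, 1}: labels are
# noisy indicators of x > 0.5 (flipped with probability 0.1), so the
# concept is a distribution on X×Y rather than a deterministic map.
rng = random.Random(42)
samples = []
for _ in range(400):
    x = rng.random()
    y = (x > 0.5) if rng.random() < 0.9 else (x <= 0.5)
    samples.append((x, int(y)))

# Candidate class: threshold classifiers h_t(x) = 1[x > t].
hypotheses = [lambda x, t=t / 20: int(x > t) for t in range(21)]
best_h, est_err = canonical_estimator(hypotheses, samples)
```

Because the error estimate in phase 2 is computed on samples disjoint from those used for selection, it is an unbiased estimate of each surviving hypothesis's true error, which is the point of the sample-splitting form.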

name of conference

  • the fifth annual workshop on Computational Learning Theory (COLT '92)

published proceedings

  • Proceedings of the fifth annual workshop on Computational learning theory - COLT '92

author list (cited authors)

  • Buescher, K. L., & Kumar, P. R.

citation count

  • 4

complete list of authors

  • Buescher, Kevin L.; Kumar, P. R.

publication date

  • December 1992