Worst case and mean squared performance of imaging systems: A feature-based approach
Conference Paper

Abstract

In this paper, we quantify the effect of random noise on the probability of misclassifying images. We consider two metrics for the noise corrupting the image: the mean squared error (MSE) and the worst case error (WCE). We show that these metrics are consistent with the goal of image classification in that, as the MSE or WCE tends to zero, the probability of misclassifying an image also tends to zero. Given a feature map, i.e., a real-valued function of an image variable, we assume that classification is performed by applying a threshold to the feature map. For this feature-based classification, we derive bounds on the MSE and the WCE that guarantee the probability of misclassifying the image is less than a pre-specified value. We illustrate the theory with an example in which the banded appearance of a planet's image is detected. We also show that, in the special case of a linear feature map, finding an estimate that minimizes the probability of misclassification reduces to finding a minimum weighted-MSE estimate. The results of this paper could be used for the reliable characterization of extrasolar planets and similar astronomical studies.
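The central claim of the abstract, that the misclassification probability vanishes as the noise MSE vanishes under feature-based threshold classification, can be illustrated numerically. The following is a minimal sketch, not the paper's method: the feature map (mean pixel intensity), the threshold, the image, and all noise levels are hypothetical choices made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear feature map: the mean pixel intensity of the image.
def feature(image):
    return image.mean()

# Feature-based classification: threshold the scalar feature value.
def classify(image, threshold=0.5):
    return int(feature(image) > threshold)

# A noise-free image whose feature (0.8) lies safely above the threshold.
true_image = np.full((16, 16), 0.8)
true_class = classify(true_image)

def misclassification_rate(noise_std, trials=2000):
    """Empirical probability of misclassification under i.i.d. Gaussian noise.

    The per-pixel noise variance noise_std**2 equals the expected MSE of
    the noisy image relative to the noise-free image.
    """
    errors = 0
    for _ in range(trials):
        noisy = true_image + noise_std * rng.normal(size=true_image.shape)
        if classify(noisy) != true_class:
            errors += 1
    return errors / trials

# As the noise MSE shrinks, the misclassification probability falls toward zero.
rates = [misclassification_rate(s) for s in (8.0, 2.0, 0.4)]
```

Averaging over the 256 pixels attenuates the noise seen by the feature by a factor of 16, so the empirical error rate drops sharply between the three noise levels, consistent with the abstract's consistency claim for the MSE metric.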