Data assimilation and sampling in Banach spaces
© 2017, Springer-Verlag Italia. This paper studies the problem of approximating a function f in a Banach space X from measurements ℓ_j(f), j = 1, …, m, where the ℓ_j are linear functionals on X. Quantitative results for such recovery problems require additional information about the sought-after function f. These additional assumptions take the form of assuming that f lies in a certain model class K ⊂ X. Since there are generally infinitely many functions in K which share the same measurements, the best approximation is the center of the smallest ball B, called the Chebyshev ball, which contains the set of all functions in K with these measurements. Therefore, the problem is reduced to analytically or numerically approximating this Chebyshev ball. Most results study this problem for classical Banach spaces X, such as the L_p spaces, 1 ≤ p ≤ ∞, and for K the unit ball of a smoothness space in X.

Our interest in this paper is in the model classes K = K(ε, V), with ε > 0 and V a finite-dimensional subspace of X, which consist of all f ∈ X such that dist(f, V)_X ≤ ε. These model classes, called approximation sets, arise naturally in application domains such as parametric partial differential equations, uncertainty quantification, and signal processing. A general theory for the recovery of approximation sets in a Banach space is given. This theory includes tight a priori bounds on optimal performance and algorithms for finding near-optimal approximations. It builds on the initial analysis given in Maday et al. (Int J Numer Methods Eng 102:933–965, 2015) for the case when X is a Hilbert space, further studied in Binev et al. (SIAM UQ, 2015). It is shown how the recovery problem for approximation sets is connected with well-studied concepts in Banach space theory, such as liftings and the angle between spaces. Examples are given that show how this theory can be used to recover several recent results on sampling and data assimilation.
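For concreteness, the central objects described above can be written out as follows. The notation for the set of measurement-compatible functions (K_w) and for the Chebyshev radius is introduced here for exposition and may differ from the paper's own symbols:

```latex
% Approximation set: all f within distance eps of a finite-dimensional space V
K(\varepsilon, V) := \{\, f \in X : \operatorname{dist}(f, V)_X \le \varepsilon \,\}.

% Given measurements w_j = \ell_j(f), the set of functions in K consistent
% with the data is
K_w := \{\, g \in K(\varepsilon, V) : \ell_j(g) = w_j,\ j = 1, \dots, m \,\}.

% The best estimate is the center of the smallest ball containing K_w
% (the Chebyshev ball); the worst-case error is its radius
\operatorname{rad}(K_w) := \inf_{a \in X} \ \sup_{g \in K_w} \| g - a \|_X .
```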
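In the Hilbert-space setting treated by Maday et al., recovery over an approximation set reduces to a least-squares problem followed by a measurement-matching correction. The following is a minimal numpy sketch of that idea in X = R^n with the Euclidean norm, where the measurements are inner products against an orthonormal basis of a measurement space W; the function name and interface are illustrative, not the paper's:

```python
import numpy as np

def recover(W, V, w_meas):
    """One-space recovery sketch in X = R^n (Euclidean norm).

    W      : (n, m) matrix whose columns are an orthonormal basis of the
             measurement space, so the data are w_j = <W[:, j], f>.
    V      : (n, k) basis of the approximation space, k <= m.
    w_meas : (m,) observed measurements of the unknown f.
    Returns an estimate u of f that reproduces the measurements exactly.
    """
    # Images of the basis of V under the projection onto W, expressed in
    # the orthonormal basis of W: P_W v has coordinates W.T @ v.
    PWV = W.T @ V                                   # shape (m, k)
    # Best fit of the data from the model space V (least squares).
    c, *_ = np.linalg.lstsq(PWV, w_meas, rcond=None)
    v_star = V @ c
    # Correct v_star inside W so the estimate matches the measurements.
    return v_star + W @ (w_meas - W.T @ v_star)
```

The correction term lies in W, so the estimate agrees with every measurement while staying as close to V as the data allow; when f is genuinely within ε of V, this is the kind of near-optimal estimator the a priori bounds address.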