In this paper, we consider a broad class of interpolation problems, for both scalar- and vector-valued multivariate functions subject to linear side conditions, such as being divergence-free, where the data are generated via integration against compactly supported distributions. We show that, by using certain families of matrix-valued conditionally positive definite functions, such interpolation problems are well poised; that is, the interpolation matrices are invertible. As a sample result, we show that a divergence-free vector field can be interpolated by a linear combination of convolutions of the data-generating distributions with a divergence-free, 3 × 3 matrix-valued conditionally positive definite function. In addition, we obtain norm estimates for inverses of interpolation matrices that arise in a class of multivariate Hermite interpolation problems.

©1994 American Mathematical Society.