As an extension of hierarchical linear models (HLMs), cross-classified random effects models (CCREMs) are used for analyzing multilevel data that do not have strictly hierarchical structures. Proportional reduction in prediction error, a multilevel analogue of the R2 in ordinary multiple regression, measures the predictive ability of a model and is useful in model selection. However, no such measure is yet available for CCREMs. Using a two-level random-intercept CCREM, the authors investigated how the estimated variance components change when predictors are added and extended the measures of proportional reduction in prediction error from HLMs to CCREMs. The extended measures are generally unbiased for both balanced and unbalanced designs. An example is provided to illustrate the computation and interpretation of these measures in CCREMs.
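The core idea behind a proportional-reduction measure can be sketched numerically: compare the estimated variance components of a null (unconditional) model against those of a model with predictors added. The variance-component values, the two crossed classifications ("row" and "column"), and the specific decompositions below are illustrative assumptions, not results from the article.

```python
# Hypothetical variance components for a two-level random-intercept CCREM
# (e.g., pupils cross-classified by neighborhood and school).
# All numbers are made up for illustration only.

def proportional_reduction(baseline, conditional):
    """(v0 - v1) / v0: the multilevel analogue of R-squared,
    i.e., the proportional reduction in prediction error."""
    return (baseline - conditional) / baseline

# Unconditional (null) model: residual plus two crossed random-intercept
# variance components.
null_model = {"residual": 8.1, "row_intercept": 1.2, "col_intercept": 0.9}

# Same model after adding a level-1 predictor; each component shrinks.
full_model = {"residual": 6.3, "row_intercept": 1.0, "col_intercept": 0.8}

# Level-1 measure: reduction in the within-cell (residual) variance.
r2_level1 = proportional_reduction(null_model["residual"],
                                   full_model["residual"])

# Total-variance measure: reduction in the sum of all components.
r2_total = proportional_reduction(sum(null_model.values()),
                                  sum(full_model.values()))

print(f"level-1 R2: {r2_level1:.3f}, total R2: {r2_total:.3f}")
```

In practice the variance components would come from fitting the two models by (restricted) maximum likelihood; the point here is only that each measure is a ratio of the drop in an estimated variance to its null-model value.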