Modeling the Cholesky factors of covariance matrices of multivariate longitudinal data Academic Article

abstract

  • © 2015 Elsevier Inc. Modeling the covariance matrix of multivariate longitudinal data is more challenging than its univariate counterpart because of the correlations among the multiple responses. The modified Cholesky block decomposition reduces the task of covariance modeling to parsimonious modeling of its two matrix factors: the regression coefficient matrices and the innovation covariance matrices. These parameters are statistically interpretable; however, ensuring the positive-definiteness of several (innovation) covariance matrices presents a new challenge. We address this problem using a subclass of Anderson's (1973) linear covariance models and model the covariance matrices as linear combinations of known positive-definite basis matrices with unknown non-negative scalar coefficients. A novelty of this approach is that positive-definiteness is guaranteed by construction; this removes a drawback of Anderson's model and makes linear covariance models more realistic and viable in practice. Maximum likelihood estimates are computed using a simple iterative majorization-minimization algorithm. The estimators are shown to be consistent and asymptotically normal. A simulation study and a data example illustrate the applicability of the proposed method in providing good models for the covariance structure of multivariate longitudinal data.
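  The sketch below is a minimal NumPy illustration (not the authors' code) of the two ideas summarized in the abstract: (i) the modified Cholesky block decomposition of the covariance matrix of stacked multivariate longitudinal responses into regression coefficient matrices and innovation covariance matrices, and (ii) modeling an innovation covariance as a non-negative combination of known positive-definite basis matrices, which is positive-definite by construction. All names (p, T_pts, Sigma, basis, alpha) and the choice of basis matrices are illustrative assumptions.

    # Minimal sketch, assuming a small simulated covariance; not the authors' implementation.
    import numpy as np

    rng = np.random.default_rng(0)
    p, T_pts = 2, 3                      # responses per occasion, number of time points
    n = p * T_pts

    # Hypothetical covariance of the stacked response Y = (y_1', ..., y_T')'.
    A = rng.standard_normal((n, n))
    Sigma = A @ A.T + n * np.eye(n)

    # (i) Modified Cholesky block decomposition: block lower-triangular T with
    # identity diagonal blocks such that T @ Sigma @ T.T is block diagonal.
    T_mat = np.eye(n)
    D = np.zeros((n, n))
    for t in range(T_pts):
        cur = slice(t * p, (t + 1) * p)   # block for occasion t
        past = slice(0, t * p)            # blocks for occasions 1, ..., t-1
        if t == 0:
            D[cur, cur] = Sigma[cur, cur]
            continue
        # Regression coefficient matrices of y_t on y_1, ..., y_{t-1}.
        Phi = Sigma[cur, past] @ np.linalg.inv(Sigma[past, past])
        T_mat[cur, past] = -Phi
        # Innovation covariance for occasion t (Schur complement).
        D[cur, cur] = Sigma[cur, cur] - Phi @ Sigma[past, cur]

    # The transformed covariance is block diagonal with the innovation covariances.
    assert np.allclose(T_mat @ Sigma @ T_mat.T, D, atol=1e-8)

    # (ii) Linear covariance model for an innovation covariance: a non-negative
    # combination of known positive-definite basis matrices (here identity and an
    # equicorrelation matrix) is positive-definite whenever some coefficient > 0.
    basis = [np.eye(p), 0.5 * np.eye(p) + 0.5 * np.ones((p, p))]
    alpha = np.array([0.8, 0.3])          # unknown non-negative scalar coefficients
    D_model = sum(a * U for a, U in zip(alpha, basis))
    assert np.all(np.linalg.eigvalsh(D_model) > 0)

  In this parameterization the positivity constraint on the coefficients replaces an explicit positive-definiteness constraint on the matrices, which is what makes the construction convenient for estimation; the actual fitting in the paper is done by maximum likelihood with an iterative majorization-minimization algorithm.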

published proceedings

  • Journal of Multivariate Analysis

author list (cited authors)

  • Kohli, P., Garcia, T. P., & Pourahmadi, M.

citation count

  • 9

complete list of authors

  • Kohli, Priya; Garcia, Tanya P.; Pourahmadi, Mohsen

publication date

  • January 2016