S4 conference

Keynote speakers

Mixture Models for Longitudinal Data with Small Samples – Daniel McNeish

Growth mixture models (GMMs) are a popular method for uncovering latent subgroups in longitudinal data. Harnessing the power of GMMs in applications can be difficult, however, because convergence problems are common when fitting these models to empirical data. Such problems often lead researchers to constrain their intended model to ease estimation. Despite multiple methodological studies noting the drawbacks of this approach, the computational complexity of GMMs often leaves researchers with few alternatives when combating convergence issues. Bayesian estimation with informative priors is a common recommendation in these circumstances, but deriving priors from the existing substantive literature can be difficult because a large portion of that literature is based on models that prioritized convergence above all else. Instead, this talk discusses extending covariance pattern models to mixture contexts to improve convergence and performance. Covariance pattern models enjoyed success decades ago, when random effects models pushed computational limits; this talk proposes resurrecting them to similarly address convergence issues in GMMs when sample sizes are small and dropout is extensive. Simulation results show that, with N = 100 and 45% dropout, convergence rates increase from below 10% with traditional methods to above 95% with the proposed method, while estimation of class proportions and class-specific growth trajectories also improves.
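For orientation, the following is a minimal sketch of the contrast between the two parameterizations for a linear growth model with K latent classes; the notation is illustrative and not taken from the talk. A conventional GMM induces the within-class covariance through class-specific random effects, whereas a covariance pattern mixture model specifies the marginal within-class covariance directly with a parsimonious structure (here, an AR(1) pattern is assumed as an example):

\begin{aligned}
\text{GMM:} \quad & \mathbf{y}_i \mid c_i = k \;\sim\; \mathcal{N}\!\left(\mathbf{X}_i \boldsymbol{\beta}_k,\; \mathbf{Z}_i \boldsymbol{\Sigma}_k \mathbf{Z}_i^{\top} + \sigma_k^2 \mathbf{I}\right), \\
\text{Covariance pattern:} \quad & \mathbf{y}_i \mid c_i = k \;\sim\; \mathcal{N}\!\left(\mathbf{X}_i \boldsymbol{\beta}_k,\; \boldsymbol{\Omega}_k\right), \qquad [\boldsymbol{\Omega}_k]_{st} = \sigma_k^2 \, \rho_k^{\,|s-t|},
\end{aligned}

with class membership probabilities P(c_i = k) = \pi_k. Under this sketch, the covariance pattern version avoids estimating class-specific random-effect covariance matrices \boldsymbol{\Sigma}_k, which is a major source of nonconvergence when samples are small.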

Daniel McNeish
Arizona State University