Mixed Effects Mixture of Experts: Modeling Double Heterogeneous Trajectories

📅 2026-03-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of modeling double heterogeneity in longitudinal trajectories, arising from both group-specific fixed effects and subject-specific random effects, which traditional linear mixed-effects models (LMMs) struggle to capture. To this end, the authors propose the Mixed Effects Mixture of Experts model (MEMoE), which integrates the mixture-of-experts architecture with LMMs by assigning each expert a full LMM representing a distinct latent subgroup. A gating function, driven by baseline covariates, probabilistically allocates individuals to these subgroups. Parameter estimation is performed via a Laplace-approximated EM algorithm, with standard errors adjusted by a robust sandwich estimator to improve the reliability of inference. Empirical evaluations demonstrate that MEMoE outperforms conventional single-population LMMs and standard mixture-of-experts models in terms of parameter recovery, classification accuracy, and overall model fit.
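
To make the architecture concrete, the following is a minimal sketch of the per-subject marginal log-likelihood under K Gaussian LMM experts with a softmax gate on baseline covariates. This is not the authors' implementation; every name and signature here (memoe_loglik, gate_W, experts) is hypothetical, and the Gaussian marginal is written in its closed form for clarity.

import numpy as np
from scipy.special import softmax, logsumexp
from scipy.stats import multivariate_normal

def memoe_loglik(y_i, X_i, Z_i, w_i, gate_W, experts):
    # Marginal log-likelihood of one subject's trajectory:
    #   log sum_k pi_k(w_i) * N(y_i; X_i beta_k, Z_i D_k Z_i' + s2_k I),
    # where each expert k is an LMM whose random effects b_i ~ N(0, D_k)
    # have been integrated out. experts = [(beta_k, D_k, s2_k), ...].
    log_pi = np.log(softmax(gate_W @ w_i))       # gating probabilities pi_k(w_i)
    log_f = np.empty(len(experts))
    for k, (beta, D, s2) in enumerate(experts):
        mean = X_i @ beta                        # subgroup-specific fixed effects
        cov = Z_i @ D @ Z_i.T + s2 * np.eye(len(y_i))
        log_f[k] = multivariate_normal.logpdf(y_i, mean, cov)
    return logsumexp(log_pi + log_f)             # log of the gated mixture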

📝 Abstract
The linear mixed-effects model (LMM) is a cornerstone of longitudinal data analysis, but it is ill-suited to capturing heterogeneity that arises simultaneously from group-specific fixed effects and subject-specific random effects. To address this challenge, we propose a novel statistical framework built on a prototype from large models: the mixed effects mixture of experts model (MEMoE). This framework integrates the divide-and-conquer paradigm of mixture-of-experts models with classical mixed-effects modeling. In the proposed MEMoE, each expert is a full LMM dedicated to capturing the longitudinal trajectory of a specific latent subpopulation, while a gating function learns to route subjects to the most appropriate expert in a data-driven manner based on baseline covariates. We develop a robust inferential procedure for parameter estimation based on a Laplace expectation-maximization (EM) algorithm, with standard errors calibrated using robust sandwich estimators to account for potential model misspecification. Extensive simulation studies and an empirical application demonstrate that MEMoE outperforms both the traditional single-population LMM and conventional mixture-of-experts models in terms of parameter recovery, classification accuracy, and overall model fit.
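
For intuition about estimation, here is a hedged sketch of the E-step, continuing the hypothetical code above: each subject's posterior responsibility for subgroup k is the gate probability times that expert's marginal density, normalized over subgroups. In non-Gaussian extensions the closed-form marginal would be replaced by a Laplace approximation, as in the paper's Laplace EM procedure.

def e_step(subjects, gate_W, experts):
    # E-step: posterior responsibilities
    #   r_ik = pi_k(w_i) f_k(y_i) / sum_j pi_j(w_i) f_j(y_i).
    # The M-step would then refit the gate and each expert LMM with
    # subjects weighted by these responsibilities.
    R = []
    for (y_i, X_i, Z_i, w_i) in subjects:
        log_pi = np.log(softmax(gate_W @ w_i))
        log_f = np.array([
            multivariate_normal.logpdf(
                y_i, X_i @ beta, Z_i @ D @ Z_i.T + s2 * np.eye(len(y_i)))
            for (beta, D, s2) in experts
        ])
        log_post = log_pi + log_f
        R.append(np.exp(log_post - logsumexp(log_post)))  # normalize over k
    return np.vstack(R)

At convergence, robust sandwich standard errors of the usual form A^{-1} B A^{-1}, with A the average negative Hessian and B the average outer product of per-subject scores, guard against the model misspecification the abstract mentions.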
Problem

Research questions and friction points this paper is trying to address.

mixed-effects model
heterogeneous trajectories
longitudinal data
mixture of experts
double heterogeneity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixed Effects Mixture of Experts
Longitudinal Data Analysis
Latent Subpopulation
Laplace EM Algorithm
Robust Sandwich Estimator
Authors
Xinkai Yue
School of Mathematics and Statistics, Xi’an Jiaotong University
Xiaodong Yan
Unknown affiliation
Statistics, Machine Learning
Haohui Han
School of Mathematics and Statistics, Xi’an Jiaotong University
Liya Fu
School of Mathematics and Statistics, Xi’an Jiaotong University