🤖 AI Summary
This paper addresses the challenge of extending identifiability results from classical kernel distributions to new kernels in continuous mixture models. It proposes a simple criterion, "generating-function accessibility," which provides a unified identifiability test grounded in moment-generating functions and Laplace transforms and applies to continuous mixtures of both discrete and continuous random variables. The approach avoids kernel-specific structural arguments, allowing classical identifiability (and unidentifiability) guarantees, such as those for Gaussian and exponential kernels, to be transferred to broad classes of nonstandard kernels, including generalized Gamma, Weibull, and log-normal kernels. Results for several canonical kernels illustrate the method's applicability. The core contribution is a generating-function-level identifiability transfer mechanism, yielding a general, theoretically grounded framework for analyzing mixture models.
📝 Abstract
For continuous mixtures of random variables, we provide a simple criterion -- generating-function accessibility -- to extend previously known kernel-based identifiability (or unidentifiability) results to new kernel distributions. This criterion, based on functional relationships between the relevant kernels' moment-generating functions or Laplace transforms, may be applied to continuous mixtures of both discrete and continuous random variables. To illustrate the proposed approach, we present results for several specific kernels.
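To make the kind of functional relationship the abstract alludes to concrete, here is a minimal numerical sketch of a classical example (not the paper's construction): for a Poisson kernel with mixing distribution G on the rate λ, the probability generating function of the mixture equals the Laplace transform of G evaluated at 1 − s, i.e. PGF(s) = E[e^{λ(s−1)}] = L_G(1 − s). Since a distribution on [0, ∞) is determined by its Laplace transform, the mixing distribution is identifiable. The Gamma mixing distribution and its parameters below are illustrative choices only.

```python
import math

# Assumed example: Gamma(shape=a, rate=b) mixing distribution over the
# Poisson rate lambda. Its Laplace transform has the closed form (b/(b+t))^a.
a, b = 2.0, 3.0

def gamma_pdf(x):
    # Density of Gamma(shape=a, rate=b) at x > 0.
    return b**a * x**(a - 1) * math.exp(-b * x) / math.gamma(a)

def laplace_gamma(t):
    # Closed-form Laplace transform of the Gamma mixing distribution.
    return (b / (b + t)) ** a

def mixture_pgf(s, n=4000, upper=40.0):
    # PGF of the Poisson mixture, computed directly as the integral
    # of exp(lambda*(s-1)) against the mixing density (midpoint rule).
    h = upper / n
    total = 0.0
    for i in range(1, n + 1):
        lam = (i - 0.5) * h
        total += math.exp(lam * (s - 1.0)) * gamma_pdf(lam) * h
    return total

# The two quantities agree: the mixture's PGF is the mixing
# distribution's Laplace transform evaluated at 1 - s.
for s in (0.0, 0.3, 0.7, 0.95):
    print(f"s={s}: PGF={mixture_pgf(s):.6f}  L_G(1-s)={laplace_gamma(1.0 - s):.6f}")
```

This is exactly the shape of argument the criterion generalizes: an identifiability question about the mixture is transferred, through a functional relationship, to an injectivity property of a generating function or Laplace transform.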