🤖 AI Summary
Existing discrete latent factor models (DLFMs) rely on custom-built solvers, resulting in high implementation costs and poor generalizability. To address this, we propose DLFM-CVX, a generic multi-convex optimization framework for DLFMs built atop CVXPY that provides a declarative modeling interface. Our method unifies modeling across diverse DLFM variants (e.g., regression and classification), enables seamless integration of regularization terms, and supports joint optimization of parameters and latent factors. Crucially, DLFM-CVX enables plug-and-play structural exploration of DLFMs without modifying the underlying solvers, while accommodating sparsity-inducing penalties and structured constraints. An open-source Python implementation, illustrated on several example tasks, reduces prototyping a DLFM to just a few lines of code, improving modeling efficiency, flexibility, and reusability.
📝 Abstract
Discrete latent factor models (DLFMs) are widely used in domains such as machine learning, economics, neuroscience, and psychology. Currently, fitting a DLFM to a dataset relies on a solver customized to each individual model, which requires substantial implementation effort and is limited to that specific instance of DLFM. In this paper, we propose a generic framework based on CVXPY, which allows users to specify and solve the fitting problem for a wide range of DLFMs, including both regression and classification models, in a very short script. Our framework is flexible and inherently supports the integration of regularization terms and constraints on the DLFM parameters and latent factors, so that users can easily prototype the DLFM structure according to their dataset and application scenario. We introduce our open-source Python implementation and illustrate the framework with several examples.