Scalable Learning of Multivariate Distributions via Coresets

📅 2026-03-20
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work proposes the first coreset construction for multivariate conditional transformation models (MCTMs), addressing the computational inefficiency of existing nonparametric and semiparametric regression and density estimation methods on large-scale data. By integrating importance sampling with a convex hull–based geometric approximation, the approach handles the numerically problematic log-normalization terms and guarantees, with high probability, that the coreset log-likelihood stays within a $(1\pm\varepsilon)$ multiplicative factor of its full-data value. The resulting coreset substantially improves training efficiency and scalability while preserving statistical accuracy and the model's capacity to capture complex nonlinear relationships. This lays the groundwork for scalable and accurate distributional modeling in high-dimensional settings.
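To make the stated guarantee concrete, here is a hedged restatement in our own notation (the paper's exact loss, weighting, and probability statement may differ): with $C$ the sampled coreset, $w_j$ its importance weights, and $\ell(\theta; x_i)$ a non-negative per-observation cost such as a suitably shifted negative log-likelihood of the MCTM, the coreset property asserts that, with high probability,

$$(1-\varepsilon)\sum_{i=1}^{n}\ell(\theta; x_i) \;\le\; \sum_{j\in C} w_j\,\ell(\theta; x_j) \;\le\; (1+\varepsilon)\sum_{i=1}^{n}\ell(\theta; x_i) \quad \text{simultaneously for all parameters } \theta.$$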

📝 Abstract
Efficient and scalable non-parametric or semi-parametric regression analysis and density estimation are of crucial importance to the fields of statistics and machine learning. However, available methods are limited in their ability to handle large-scale data. We address this issue by developing a novel coreset construction for multivariate conditional transformation models (MCTMs) to enhance their scalability and training efficiency. To the best of our knowledge, these are the first coresets for semi-parametric distributional models. Our approach yields substantial data reduction via importance sampling. It ensures with high probability that the log-likelihood remains within multiplicative error bounds of $(1\pm\varepsilon)$ and thereby maintains statistical model accuracy. Compared to conventional fully parametric models, for which coresets have been developed before, our semi-parametric approach exhibits enhanced adaptability, particularly in scenarios where complex distributions and non-linear relationships are present but not fully understood. To address numerical problems associated with normalizing logarithmic terms, we employ a geometric approximation based on the convex hull of the input data. This ensures feasible, stable, and accurate inference in scenarios involving large amounts of data. Numerical experiments demonstrate substantially improved computational efficiency when handling large and complex datasets, thus laying the foundation for a broad range of applications within the statistics and machine learning communities.
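As a rough illustration of the importance-sampling step only (this is not the paper's construction; the sensitivity proxy, function names, and sample size below are placeholders), a minimal sketch of sensitivity-based coreset sampling could look like this:

```python
import numpy as np

def importance_sampling_coreset(X, sensitivities, m, seed=None):
    """Sample m rows of X with probability proportional to their sensitivity
    upper bounds and attach inverse-probability weights, so that weighted sums
    over the coreset are unbiased estimates of the corresponding full-data sums."""
    rng = np.random.default_rng(seed)
    s = np.asarray(sensitivities, dtype=float)
    p = s / s.sum()                        # sampling distribution
    idx = rng.choice(len(X), size=m, replace=True, p=p)
    weights = 1.0 / (m * p[idx])           # inverse-probability weights
    return X[idx], weights

# Hypothetical usage with a placeholder sensitivity proxy (distance to the mean);
# actual MCTM sensitivities would come from the paper's bounds.
X = np.random.default_rng(0).normal(size=(100_000, 3))
proxy = 1.0 + np.linalg.norm(X - X.mean(axis=0), axis=1)
coreset, w = importance_sampling_coreset(X, proxy, m=1_000, seed=0)
# A weighted (negative) log-likelihood over (coreset, w) then stands in for the full-data objective.
```

The reweighting by $1/(m\,p_i)$ keeps the coreset objective an unbiased estimator of the full objective; how tight the resulting $(1\pm\varepsilon)$ bound is then depends on the quality of the sensitivity upper bounds.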
Problem

Research questions and friction points this paper is trying to address.

scalability
multivariate distributions
large-scale data
semi-parametric models
density estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

coresets
multivariate conditional transformation models
semi-parametric distributional models
importance sampling
scalable density estimation
🔎 Similar Papers
No similar papers found.