🤖 AI Summary
This paper addresses the efficient discrete approximation of Gaussian mixture models (GMMs) under the Wasserstein distance, motivated by the dual requirements of quantization accuracy and computational scalability in control and cyber-physical system verification. The authors propose an enhanced quantization framework that integrates sigma-point sampling with adaptive clustering, enabling robust handling of high-dimensional, large-scale, and degenerate GMMs. A rigorous upper bound on the Wasserstein approximation error is derived, and a modular interface supports customizable approximation schemes. Experiments demonstrate that the method produces accurate approximations while significantly reducing computational overhead relative to existing approaches. The core contribution is the systematic integration of sigma-point mechanisms with Wasserstein quantization theory, uniting theoretical guarantees with practical deployability.
📝 Abstract
We present discretize_distributions, a Python package that efficiently constructs discrete approximations of Gaussian mixture distributions and provides guarantees on the approximation error in Wasserstein distance. The package implements state-of-the-art quantization methods for Gaussian mixture models and extends them to improve scalability. It further integrates complementary quantization strategies such as sigma-point methods and provides a modular interface that supports custom schemes and integration into control and verification pipelines for cyber-physical systems. We benchmark the package on various examples, including high-dimensional, large, and degenerate Gaussian mixtures, and demonstrate that discretize_distributions produces accurate approximations at low computational cost.
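To make the quantization task concrete, the sketch below (which does not use the package's API and makes its own modeling assumptions) quantizes a one-dimensional two-component Gaussian mixture onto a small discrete support via equal-probability quantile cells, then estimates the resulting 1-Wasserstein error against a large Monte Carlo sample using SciPy. The mixture parameters, atom count `n`, and grid bounds are illustrative choices, not values from the paper.

```python
# Illustrative sketch only: quantile quantization of a 1D Gaussian mixture
# and an empirical estimate of the 1-Wasserstein approximation error.
import numpy as np
from scipy.stats import norm, wasserstein_distance

rng = np.random.default_rng(0)

# Mixture: 0.4 * N(-2, 0.5^2) + 0.6 * N(1, 1.0^2) (arbitrary example).
weights = np.array([0.4, 0.6])
means = np.array([-2.0, 1.0])
stds = np.array([0.5, 1.0])

def mixture_quantile(p, grid=np.linspace(-8.0, 8.0, 20001)):
    """Invert the mixture CDF numerically on a fine grid."""
    cdf = (weights * norm.cdf(grid[:, None], means, stds)).sum(axis=1)
    return np.interp(p, cdf, grid)

# Place n atoms at the midpoint quantiles of n equal-probability cells,
# each atom carrying weight 1/n (a standard 1D quantization scheme).
n = 16
atoms = mixture_quantile((np.arange(n) + 0.5) / n)
atom_weights = np.full(n, 1.0 / n)

# Monte Carlo reference sample from the mixture.
comp = rng.choice(2, size=100_000, p=weights)
sample = rng.normal(means[comp], stds[comp])

# Empirical 1-Wasserstein distance between the discrete approximation
# and the sampled mixture; it shrinks as n grows.
err = wasserstein_distance(atoms, sample, u_weights=atom_weights)
print(f"W1 error with {n} atoms: {err:.4f}")
```

Increasing `n` tightens the approximation, which mirrors the accuracy/size trade-off the package manages, together with certified (rather than sampled) error bounds.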