🤖 AI Summary
This paper studies the computation of basis generating polynomials and weighted basis maximization for regular matroids. The authors propose a unified framework: a fine-grained refinement of Seymour's decomposition, constructed so as to identify and preserve graphic substructures, is combined with localized star–mesh transformations to give the first explicit construction of a uniform (+, ×, /) arithmetic circuit of size O(n³) computing the basis generating polynomial. Tropicalizing this circuit reduces weighted basis maximization to tropical evaluation of the polynomial and yields ReLU neural networks of the same size. This provides the first efficient explicit virtual extended formulation for regular matroids, improving on the prior O(n⁶)-size extended formulation by a cubic factor, and supplies both theoretical foundations and a concrete algorithmic instantiation for the practical deployment of virtual extensions in linear programming.
📝 Abstract
We prove that there exist uniform $(+,\times,/)$-circuits of size $O(n^3)$ to compute the basis generating polynomial of regular matroids on $n$ elements. By tropicalization, this implies that there exist uniform $(\max,+,-)$-circuits and ReLU neural networks of the same size for weighted basis maximization of regular matroids. As a consequence in linear programming theory, we obtain a first example where taking the difference of two extended formulations can be more efficient than the best known individual extended formulation of size $O(n^6)$ by Aprile and Fiorini. Such differences have recently been introduced as virtual extended formulations. The proof of our main result relies on a fine-tuned version of Seymour's decomposition of regular matroids which allows us to identify and maintain graphic substructures to which we can apply a local version of the star-mesh transformation.
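The tropicalization step can be illustrated on a toy instance. Below is a minimal sketch (an example of our choosing, not from the paper) using the graphic matroid of the triangle $K_3$, whose bases are the spanning trees, i.e. any two of the three edges: replacing $+$ with $\max$ and $\times$ with $+$ in the basis generating polynomial turns classical evaluation into weighted basis maximization, and $\max$ itself can be rewritten with ReLU, which is how $(\max,+,-)$-circuits translate into ReLU networks of the same size.

```python
# Toy example: graphic matroid of the triangle K3 with edges 0, 1, 2.
# Its bases (spanning trees) are all 2-element subsets of the edges.
bases = [(0, 1), (0, 2), (1, 2)]

def basis_generating_poly(x):
    # Classical (+, x)-evaluation: sum over bases of the product of variables.
    return sum(x[i] * x[j] for (i, j) in bases)

def tropical_eval(w):
    # Tropicalization: + becomes max, x becomes +.  Evaluated at edge
    # weights w, this returns the weight of a maximum-weight basis.
    return max(w[i] + w[j] for (i, j) in bases)

def relu(t):
    return max(t, 0.0)

# Evaluating at all-ones counts the bases; tropical evaluation at weights
# [5, 1, 3] picks the heaviest spanning tree, edges 0 and 2 (weight 8).
print(basis_generating_poly([1, 1, 1]))   # number of bases
print(tropical_eval([5.0, 1.0, 3.0]))     # max-weight basis

# max(a, b) = b + relu(a - b): the gadget behind the ReLU translation.
assert max(2.0, 7.0) == 7.0 + relu(2.0 - 7.0)
```

The sketch enumerates bases explicitly, which is exponential in general; the point of the paper's circuit construction is precisely to avoid this enumeration for regular matroids.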