🤖 AI Summary
This work addresses randomized dimensionality reduction for real algebraic varieties and images of polynomial maps, such as low-rank tensors and tensor networks, and develops a unified sketching-theory framework for these structured algebraic sets. The central contribution is the *median sketch*, the first approach to achieve a norm-preserving embedding with only $\widetilde{\mathcal{O}}(\dim V)$ measurements, sharply improving upon the classical $\widetilde{\mathcal{O}}((\dim V)^2)$ bound. The median sketch is compatible with diverse sketching operators, including sub-Gaussian, fast Johnson–Lindenstrauss (JL), and tensor-structured sketches. Leveraging norming set theory, we provide a unified characterization of their embedding guarantees. These results ensure efficient compression and accelerated computation for high-dimensional tensors and polynomial-structured data, yielding a scalable foundation for large-scale tensor learning that enables provably accurate low-dimensional representations while preserving the sets' essential algebraic structure.
📝 Abstract
This paper develops the sketching (i.e., randomized dimension reduction) theory for real algebraic varieties and images of polynomial maps, including, for example, the set of low-rank tensors and tensor networks. Through the lens of norming sets, we provide a framework for controlling the sketching dimension for *any* sketch operator used to embed such sets, including sub-Gaussian, fast Johnson–Lindenstrauss, and tensor-structured sketch operators. Leveraging norming set theory, we propose a new sketching method called the median sketch. It embeds such a set $V$ using only $\widetilde{\mathcal{O}}(\dim V)$ tensor-structured or sparse linear measurements.
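To make the measurement count concrete, below is a minimal numerical illustration, assuming the median sketch aggregates norm estimates from several independent small sketches by taking their median (the classical median trick). The function name `median_sketch_norm` and the parameters `m` and `num_sketches` are illustrative placeholders, not the paper's notation or exact construction.

```python
# Illustrative only: estimate ||x|| via the median of norm estimates from
# several independent Gaussian sketches, each of small embedding dimension m.
# This mirrors the classical median trick; the paper's construction may differ.
import numpy as np

rng = np.random.default_rng(0)

def median_sketch_norm(x, m=16, num_sketches=11):
    """Median of norm estimates from independent m-dimensional Gaussian sketches.

    Here m plays the role of O(dim V); num_sketches is the (logarithmic)
    number of repetitions that drives the failure probability down.
    """
    n = x.shape[0]
    estimates = []
    for _ in range(num_sketches):
        S = rng.standard_normal((m, n)) / np.sqrt(m)  # sub-Gaussian sketch operator
        estimates.append(np.linalg.norm(S @ x))
    return float(np.median(estimates))

# Example: a point on the rank-1 matrix variety, flattened into R^{2500}.
u, v = rng.standard_normal(50), rng.standard_normal(50)
x = np.outer(u, v).ravel()
print(np.linalg.norm(x), median_sketch_norm(x))  # the two values should be close
```

Taking the median rather than the mean makes the estimate robust to an occasional badly distorted sketch, which is what lets the embedding dimension of each individual sketch stay small, consistent with the $\widetilde{\mathcal{O}}(\dim V)$ scaling discussed above.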