AI Summary
This work addresses the problem of modeling invariant functions over symmetric matrices (under conjugation by permutations) and over point clouds (under rotations, reflections, and permutations of the points). Methodologically, inspired by Galois theory, the authors construct a lightweight universal-approximation framework that yields separating invariant features of dimension only $O(n^2)$ for symmetric matrices, reduced further to $O(n)$ for point clouds, breaking the bottleneck of traditional high-dimensional invariant representations. The theoretical foundation combines invariant algebra and generating sets of fields of rational invariant functions with the DeepSets architecture and an orbit-separation analysis under group actions. The features separate orbits almost everywhere (up to a measure-zero set), which suffices for universal approximation of invariant functions on almost all weighted graphs. Experiments on molecular property regression and point cloud distance prediction empirically validate the approach, which improves both the expressive power and the computational efficiency of invariant representations.
Abstract
In this work, we present a mathematical formulation for machine learning of (1) functions on symmetric matrices that are invariant with respect to the action of permutations by conjugation, and (2) functions on point clouds that are invariant with respect to rotations, reflections, and permutations of the points. To achieve this, we provide a general construction of generically separating invariant features using ideas inspired by Galois theory. We construct $O(n^2)$ invariant features derived from generators for the field of rational functions on $n \times n$ symmetric matrices that are invariant under joint permutations of rows and columns. We show that these invariant features can separate all distinct orbits of symmetric matrices except for a measure zero set; such features can be used to universally approximate invariant functions on almost all weighted graphs. For point clouds in a fixed dimension, we prove that the number of invariant features can be reduced, generically without losing expressivity, to $O(n)$, where $n$ is the number of points. We combine these invariant features with DeepSets to learn functions on symmetric matrices and point clouds with varying sizes. We empirically demonstrate the feasibility of our approach on molecule property regression and point cloud distance prediction.
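To make the invariance requirement concrete, the following minimal sketch (not the paper's actual feature construction) checks a simple example of a permutation-conjugation invariant of a symmetric matrix: the sorted eigenvalue spectrum, which is unchanged under $A \mapsto P A P^\top$ because conjugation by a permutation matrix is a similarity transform. All variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random symmetric matrix, viewed as the adjacency matrix of a weighted graph.
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

# Random n x n permutation matrix P acting jointly on rows and columns.
perm = rng.permutation(n)
P = np.eye(n)[perm]

def spectrum(M):
    """Sorted eigenvalues: one simple invariant under P @ M @ P.T."""
    return np.sort(np.linalg.eigvalsh(M))

# The spectrum is identical on the whole permutation-conjugation orbit of A.
assert np.allclose(spectrum(A), spectrum(P @ A @ P.T))
```

Note that the spectrum alone does not separate all orbits (cospectral non-isomorphic graphs exist), which is why the paper constructs a richer family of $O(n^2)$ features that separate orbits outside a measure-zero set.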