Adaptive Canonicalization with Application to Invariant Anisotropic Geometric Networks

📅 2025-09-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
In equivariant machine learning, conventional canonicalization enforces symmetry by mapping each input to a fixed standard form, but this often introduces discontinuities that impair training stability, generalization, and universal approximation. This work proposes an adaptive canonicalization framework in which, for the first time, the canonical form is a joint function of the network parameters and the input: the standard form is chosen to maximize the network's predictive confidence, which provably guarantees continuity, symmetry preservation, and universal approximation. The framework is applied both to spectral graph neural networks, resolving eigenbasis ambiguity, and to anisotropic point cloud networks, handling rotational symmetries. Evaluated on molecular and protein classification as well as point cloud classification, it substantially outperforms data augmentation, fixed-canonicalization baselines, and state-of-the-art equivariant architectures.

📝 Abstract
Canonicalization is a widely used strategy in equivariant machine learning, enforcing symmetry in neural networks by mapping each input to a standard form. Yet, it often introduces discontinuities that can affect stability during training, limit generalization, and complicate universal approximation theorems. In this paper, we address this by introducing *adaptive canonicalization*, a general framework in which the canonicalization depends both on the input and the network. Specifically, we present the adaptive canonicalization based on prior maximization, where the standard form of the input is chosen to maximize the predictive confidence of the network. We prove that this construction yields continuous and symmetry-respecting models that admit universal approximation properties. We propose two applications of our setting: (i) resolving eigenbasis ambiguities in spectral graph neural networks, and (ii) handling rotational symmetries in point clouds. We empirically validate our methods on molecular and protein classification, as well as point cloud classification tasks. Our adaptive canonicalization outperforms the three other common solutions to equivariant machine learning: data augmentation, standard canonicalization, and equivariant architectures.
Problem

Research questions and friction points this paper is trying to address.

Addresses discontinuities introduced by fixed canonicalization, which harm training stability and generalization
Resolves eigenbasis ambiguities in spectral graph neural networks
Handles rotational symmetries in point clouds for improved classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive canonicalization framework in which the canonical form depends on both the input and the network
Prior maximization: the canonical form is selected to maximize the network's predictive confidence
Two applications: resolving eigenbasis ambiguities in spectral GNNs and rotational symmetries in point clouds
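The prior-maximization idea in the bullets above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the classifier is a hypothetical toy linear model, and the candidate canonical forms are the two sign choices of a sign-ambiguous eigenvector (the eigenbasis-ambiguity application); the names `adaptive_canonicalize` and `predict` are made up for this sketch.

```python
# Sketch of adaptive canonicalization by prior maximization: among candidate
# canonical forms of the input, keep the one on which the network is most
# confident. Toy stand-ins throughout; not the paper's actual architecture.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def adaptive_canonicalize(x, candidates, predict):
    """Return (canonical input, class probabilities) maximizing confidence."""
    best_probs, best_x = None, None
    for transform in candidates:
        probs = predict(transform(x))
        if best_probs is None or probs.max() > best_probs.max():
            best_probs, best_x = probs, transform(x)
    return best_x, best_probs

# Hypothetical toy classifier: logits are a fixed linear map of the features.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))           # 3 classes, 4-dim features
predict = lambda v: softmax(W @ v)

v = rng.standard_normal(4)                # e.g. an eigenvector: sign-ambiguous
candidates = [lambda x: x, lambda x: -x]  # the two equivalent sign choices

x_star, p_star = adaptive_canonicalize(v, candidates, predict)
```

By construction the selected form is at least as confident as either fixed sign choice, and the selection depends jointly on the input and the network weights, which is the property the framework exploits to avoid the discontinuities of a fixed canonical form.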