On Admissible Rank-based Input Normalization Operators

📅 2025-12-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the fundamental instability of rank-based input normalization under monotonic transformations, batch variations, and small perturbations. To formalize the desiderata, we introduce three axioms characterizing minimal invariance and stability requirements for valid rank-normalization operators. We prove that any operator satisfying these axioms must decompose into a rank representation followed by a Lipschitz-monotonic scalarization map. This characterization reveals that mainstream differentiable sorting operators are inherently unstable due to their dependence on value gaps and pairwise interactions. Leveraging this characterization, we construct the first minimal differentiable rank-normalization operator that strictly satisfies all axioms. Empirical evaluation on multi-task learning and robust classification tasks demonstrates its superior stability over existing methods and the practical necessity of the axioms.


📝 Abstract
Rank-based input normalization is a workhorse of modern machine learning, prized for its robustness to scale, monotone transformations, and batch-to-batch variation. In many real systems, the ordering of feature values matters far more than their raw magnitudes; yet the structural conditions that a rank-based normalization operator must satisfy to remain stable under these invariances have never been formally pinned down. We show that widely used differentiable sorting and ranking operators fundamentally fail these criteria. Because they rely on value gaps and batch-level pairwise interactions, they are intrinsically unstable under strictly monotone transformations, shifts in mini-batch composition, and even tiny input perturbations. Crucially, these failures stem from the operators' structural design, not from incidental implementation choices. To address this, we propose three axioms that formalize the minimal invariance and stability properties required of rank-based input normalization. We prove that any operator satisfying these axioms must factor into (i) a feature-wise rank representation and (ii) a scalarization map that is both monotone and Lipschitz-continuous. We then construct a minimal operator that meets these criteria and empirically show that the resulting constraints are non-trivial in realistic setups. Together, our results sharply delineate the design space of valid rank-based normalization operators and formally separate them from existing continuous-relaxation-based sorting methods.
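The factorization in the abstract, a rank representation composed with a monotone Lipschitz scalarization, can be sketched in a few lines. This is an illustrative construction, not the paper's actual operator: it assumes average ranks for ties and the affine scalarization r ↦ (r + 0.5)/n, which is monotone and Lipschitz in the rank. Note the output depends only on the ordering, so it is invariant under any strictly monotone transform of the inputs.

```python
def rank_representation(xs):
    """Feature-wise 0-based ranks, with ties assigned their average rank."""
    n = len(xs)
    ranks = [0.0] * n
    order = sorted(range(n), key=lambda i: xs[i])
    i = 0
    while i < n:
        # find the run of equal values starting at sorted position i
        j = i
        while j + 1 < n and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def scalarize(r, n):
    """Monotone, Lipschitz map from rank to (0, 1): r -> (r + 0.5) / n."""
    return (r + 0.5) / n

def rank_normalize(xs):
    """Rank representation followed by scalarization."""
    n = len(xs)
    return [scalarize(r, n) for r in rank_representation(xs)]
```

For example, `rank_normalize([1, 10, 100])` and `rank_normalize([0, 1, 2])` coincide, since the second batch is the elementwise log10 of the first, a strictly monotone transform.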
Problem

Research questions and friction points this paper is trying to address.

Identifies structural instability in rank-based normalization under transformations
Proposes axioms for invariance and stability in rank-based normalization
Constructs a minimal operator meeting formal criteria for robustness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Axioms formalizing invariance and stability for rank-based normalization
Operator factorization into rank representation and monotone Lipschitz scalarization
Minimal operator construction separating from continuous-relaxation sorting methods
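The claimed separation from continuous-relaxation sorting can be illustrated concretely. Below, `soft_rank` is a generic pairwise sigmoid relaxation used as a stand-in for differentiable sorting operators (not any specific published method): because it depends on value gaps, its output changes under a strictly monotone transform such as `exp`, while the hard rank representation does not.

```python
import math

def soft_rank(xs, tau=1.0):
    """Pairwise sigmoid soft rank: a gap-dependent continuous relaxation.
    Illustrative stand-in for differentiable sorting, not a specific method."""
    return [sum(1.0 / (1.0 + math.exp(-(xi - xj) / tau))
                for j, xj in enumerate(xs) if j != i)
            for i, xi in enumerate(xs)]

def hard_rank(xs):
    """Exact ranks (assuming distinct values); depends only on the ordering."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order):
        ranks[i] = r
    return ranks

a = [0.0, 0.1, 5.0]
b = [math.exp(v) for v in a]  # strictly monotone transform of a
# hard_rank(a) == hard_rank(b), but soft_rank(a) != soft_rank(b):
# the relaxation sees the changed value gaps, not just the ordering.
```

This gap-dependence is exactly the structural instability the Problem section identifies; any operator satisfying the paper's axioms must discard gap information before scalarizing.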