ArrowFlow: Hierarchical Machine Learning in the Space of Permutations

📅 2026-04-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a gradient-free deep learning architecture operating entirely in ordinal space, eschewing floating-point parameters and gradient-based optimization. The method propagates permutation-based representations layer-wise through ranking filters, measures input similarity via Spearman footrule distance, and updates parameters by accumulating permutation matrices. Notably, it reformulates Arrow’s impossibility theorem as an inductive bias, enabling deployment on integer-based and neuromorphic hardware. Empirical evaluation demonstrates that the model achieves a 2.7% error rate on the Iris dataset—outperforming all baselines—and exhibits competitive performance across multiple UCI benchmarks. Moreover, low-order configurations substantially enhance robustness to noise, privacy preservation, and tolerance to missing features.
📝 Abstract
We introduce ArrowFlow, a machine learning architecture that operates entirely in the space of permutations. Its computational units are ranking filters, learned orderings that compare inputs via Spearman's footrule distance and update through permutation-matrix accumulation, a non-gradient rule rooted in displacement evidence. Layers compose hierarchically: each layer's output ranking becomes the next layer's input, enabling deep ordinal representation learning without any floating-point parameters in the core computation. We connect the architecture to Arrow's impossibility theorem, showing that violations of social-choice fairness axioms (context dependence, specialization, symmetry breaking) serve as inductive biases for nonlinearity, sparsity, and stability. Experiments span UCI tabular benchmarks, MNIST, gene expression cancer classification (TCGA), and preference data, all against GridSearchCV-tuned baselines. ArrowFlow beats all baselines on Iris (2.7% vs. 3.3%) and is competitive on most UCI datasets. A single parameter, polynomial degree, acts as a master switch: degree 1 yields noise robustness (8-28% less degradation), privacy preservation (+0.5pp cost), and missing-feature resilience; higher degrees trade these for improved clean accuracy. ArrowFlow is not designed to surpass gradient-based methods. It is an existence proof that competitive classification is possible in a fundamentally different computational paradigm, one that elevates ordinal structure to a first-class citizen, with natural alignment to integer-only and neuromorphic hardware.
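To make the abstract's core mechanism concrete, here is a minimal sketch of a ranking filter built on Spearman's footrule distance with a permutation-matrix accumulation update. This is an illustration of the general technique, not the paper's actual implementation; the function names (`footrule_distance`, `accumulate`, `nearest_filter`) are placeholders of my own, and the real architecture composes such filters layer-wise.

```python
import numpy as np

def footrule_distance(x, y):
    """Spearman footrule distance between the rankings induced by two vectors:
    sum of absolute rank displacements, an integer-only similarity measure."""
    rx = np.argsort(np.argsort(x))  # rank position of each feature in x
    ry = np.argsort(np.argsort(y))
    return int(np.abs(rx - ry).sum())

def accumulate(P, x):
    """Non-gradient update sketch: add the permutation matrix of x's
    ranking into the integer accumulator P (one count per feature-rank pair)."""
    r = np.argsort(np.argsort(x))
    P[np.arange(len(x)), r] += 1
    return P

def nearest_filter(x, filters):
    """Classify an input by the learned ordering closest in footrule distance."""
    return min(filters, key=lambda f: footrule_distance(x, f))
```

Because every quantity here is an integer rank or count, the computation involves no floating-point parameters, which is what makes the paradigm a natural fit for integer-only and neuromorphic hardware.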
Problem

Research questions and friction points this paper is trying to address.

permutations
ordinal representation learning
non-gradient learning
ranking filters
Arrow's impossibility theorem
Innovation

Methods, ideas, or system contributions that make the work stand out.

permutation-based learning
ranking filters
non-gradient optimization
ordinal representation learning
Arrow's impossibility theorem