🤖 AI Summary
Neural networks lack explicit geometric and topological awareness, which fundamentally limits their ability to model self-similar structures and multi-scale complexity. Method: We propose CantorNet, the first ReLU neural network family grounded in the ternary construction of the Cantor set, yielding fully analyzable decision boundaries. It systematically encodes self-similarity and multi-scale geometric complexity across the full spectrum of Kolmogorov complexity via a hierarchical recursive architecture, closed-form boundary derivations, and a unified complexity framework that integrates topological (Betti numbers) and geometric (curvature distribution) measures. Contributions/Results: (1) the first formalization of a classical fractal as an analyzable neural architecture; (2) an empirical evaluation of the sensitivity and consistency of standard complexity metrics under self-similarity; (3) identification of a “geometric blind spot” that undermines the efficacy of geometry-ignorant data augmentation and adversarial robustness, establishing a theoretical benchmark and empirical foundation for geometry-aware modeling.
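The linear-versus-exponential span of Kolmogorov complexity mentioned above can be made concrete with a small sketch (our illustration, not code from the paper): the ternary middle-third recursion is a few lines long, so its description length grows linearly with depth, while the explicit object it unrolls into doubles at every level.

```python
from fractions import Fraction

def cantor_intervals(depth):
    """Return the 2**depth closed intervals (as endpoint pairs) that
    survive `depth` steps of middle-third removal. The program itself
    stays short as depth grows, but the interval list it produces
    grows exponentially."""
    intervals = [(Fraction(0), Fraction(1))]
    for _ in range(depth):
        refined = []
        for a, b in intervals:
            third = (b - a) / 3
            refined.append((a, a + third))  # keep the left third
            refined.append((b - third, b))  # keep the right third
        intervals = refined
    return intervals

print(len(cantor_intervals(5)))  # 32 == 2**5 surviving intervals
```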
📝 Abstract
Many natural phenomena are characterized by self-similarity, for example the symmetry of human faces or a repetitive motif in a song. Studying such symmetries can yield deeper insights into the underlying mechanisms of complex systems. Recognizing the importance of understanding these patterns, we propose a geometrically inspired framework to study such phenomena in artificial neural networks. To this end, we introduce \emph{CantorNet}, inspired by the triadic construction of the Cantor set, which was introduced by Georg Cantor in the $19^\text{th}$ century. In mathematics, the Cantor set is a self-similar set of points lying on a single line that has the counterintuitive property of being an uncountably infinite null set. Analogously, we introduce CantorNet as a sandbox for studying self-similarity by means of novel topological and geometric complexity measures. CantorNet constitutes a family of ReLU neural networks that spans the whole spectrum of possible Kolmogorov complexities, including the two opposite extremes (linear and exponential, as measured by description length). CantorNet's decision boundaries can be arbitrarily ragged, yet they are analytically known. Besides providing a testing ground for complexity measures, our work may illustrate potential pitfalls of geometry-ignorant data augmentation techniques and adversarial attacks.
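To make the ReLU realization concrete, here is a minimal sketch of the general idea under our own assumptions (the paper's exact architecture may differ): the tent map is computable by a single-hidden-unit ReLU layer, and composing such layers reproduces successive levels of the Cantor construction, so the resulting decision rule is arbitrarily ragged yet analytically known.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def tent(x):
    # One ReLU hidden unit suffices: T(x) = 3x - 6*ReLU(x - 1/2),
    # i.e. 3x for x <= 1/2 and 3(1 - x) for x > 1/2.
    return 3.0 * x - 6.0 * relu(x - 0.5)

def survives(x, depth):
    """A point of [0, 1] stays in [0, 1] under `depth` tent-map
    compositions exactly when it survives `depth` levels of the
    middle-third removal; removed points escape and never return."""
    for _ in range(depth):
        x = tent(x)
    return (x >= 0.0) & (x <= 1.0)

x = np.linspace(0.0, 1.0, 11)
print(survives(x, depth=3))  # e.g. 0.5 lies in the first removed third
```

Because every iterate is piecewise linear with known breakpoints, the decision rule at any depth can be written down in closed form, which is the property the abstract highlights; the same composition also shows the description-length dichotomy, since the weight-shared recursion is described in O(k) symbols while the unrolled function has on the order of 2^k affine pieces.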