🤖 AI Summary
This work investigates the topological structure of activation patterns in ReLU neural networks and its intrinsic relationship with model behavior. For binary classification, we propose characterizing the geometry of decision boundaries via Fiedler partitioning on the dual graph of the piecewise-linear partition induced by the network. For regression, we introduce algebraic topological tools—specifically, homology group computation—to quantify the complexity of the polyhedral cell decomposition; we observe a strong dynamic correlation between training loss and the number of cells. Experiments reveal that the cell count monotonically decreases in a predictable manner during training, and the Fiedler partition closely approximates the true decision boundary. This study establishes, for the first time, a systematic bridge among the piecewise-linear structure of ReLU networks, graph-theoretic partitioning, and algebraic topological invariants—providing a novel geometric framework for understanding generalization in deep networks.
📝 Abstract
This paper explores the topological signatures of ReLU neural network activation patterns. We consider feedforward neural networks with ReLU activation functions and analyze the polytope decomposition of the feature space induced by the network. Mainly, we investigate how the Fiedler partition of the dual graph and show that it appears to correlate with the decision boundary -- in the case of binary classification. Additionally, we compute the homology of the cellular decomposition -- in a regression task -- to draw similar patterns in behavior between the training loss and polyhedral cell-count, as the model is trained.