🤖 AI Summary
This work addresses the challenges of training instability and high parameter complexity in hyper-connected neural networks by proposing a novel approach based on Kronecker product structure and manifold constraints. Specifically, the high-dimensional doubly stochastic residual matrix is decomposed into a Kronecker product of multiple low-dimensional doubly stochastic matrices, with Birkhoff polytope manifold constraints imposed along each tensor mode to enforce strict double stochasticity. This formulation yields the first efficient parameterization that simultaneously guarantees exact double stochasticity and reduces parameter complexity from O(n³C) or O(nC·n!) down to O(n²C). The method not only matches or surpasses the performance of existing mHC variants but also substantially improves model scalability and training stability.
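As a rough back-of-envelope illustration of the complexity gap quoted above, the three classes can be compared numerically. The function names and the choice of n (residual stream width) and C (feature dimension) below are illustrative assumptions, not values from the paper, and constant factors are ignored.

```python
from math import factorial

# Hedged sketch: parameter counts implied by the three complexity
# classes quoted above, up to constant factors.
# n = residual stream width, C = feature dimension (illustrative values).

def params_mhc(n, C):
    return n**3 * C               # mHC: O(n^3 C)

def params_mhc_lite(n, C):
    return n * C * factorial(n)   # mHC-lite: O(nC * n!)

def params_kromhc(n, C):
    return n**2 * C               # KromHC: O(n^2 C)

n, C = 8, 1024
for name, f in [("mHC", params_mhc),
                ("mHC-lite", params_mhc_lite),
                ("KromHC", params_kromhc)]:
    print(f"{name:9s} ~ {f(n, C):,} parameters")
```

Even at this small width, the factorial term dominates mHC-lite, while the Kronecker factorization stays quadratic in n.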
📝 Abstract
The success of Hyper-Connections (HC) in neural networks (NNs) has also exposed their training instability and restricted scalability. Manifold-Constrained Hyper-Connections (mHC) mitigate these challenges by projecting the residual connection space onto the Birkhoff polytope; however, mHC faces two issues: 1) its iterative Sinkhorn-Knopp (SK) algorithm does not always yield exactly doubly stochastic residual matrices; 2) it incurs a prohibitive $\mathcal{O}(n^3C)$ parameter complexity, where $n$ is the width of the residual stream and $C$ is the feature dimension. The recently proposed mHC-lite reparametrizes the residual matrix via the Birkhoff-von-Neumann theorem to guarantee double stochasticity, but suffers a factorial explosion in parameter complexity, $\mathcal{O} \left( nC \cdot n! \right)$. To address both challenges, we propose \textbf{KromHC}, which parametrizes the residual matrix in \underline{mHC} as a \underline{Kro}necker product of smaller doubly stochastic matrices. By enforcing manifold constraints on the factor residual matrices along each mode of the tensorized residual stream, KromHC guarantees exact double stochasticity of the residual matrices while reducing parameter complexity to $\mathcal{O}(n^2C)$. Comprehensive experiments demonstrate that KromHC matches or even outperforms state-of-the-art (SOTA) mHC variants while requiring significantly fewer trainable parameters. The code is available at \texttt{https://github.com/wz1119/KromHC}.
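The structural fact this parameterization relies on — that a Kronecker product of doubly stochastic matrices is itself doubly stochastic — can be checked numerically. The sketch below builds random (approximately) doubly stochastic factors via Sinkhorn-Knopp normalization; the factor sizes and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def random_doubly_stochastic(m, iters=200, seed=0):
    """Produce an (approximately) doubly stochastic m x m matrix by
    Sinkhorn-Knopp: alternately normalize rows and columns of a
    random strictly positive matrix."""
    rng = np.random.default_rng(seed)
    A = rng.random((m, m)) + 0.1  # strict positivity ensures convergence
    for _ in range(iters):
        A /= A.sum(axis=1, keepdims=True)  # normalize rows
        A /= A.sum(axis=0, keepdims=True)  # normalize columns
    return A

P = random_doubly_stochastic(2, seed=1)
Q = random_doubly_stochastic(3, seed=2)
R = np.kron(P, Q)  # 6x6 residual-style matrix from a 2x2 and a 3x3 factor

# Row/column sums of a Kronecker product are products of the factors'
# row/column sums, so R is doubly stochastic whenever P and Q are.
print(np.allclose(R.sum(axis=0), 1.0), np.allclose(R.sum(axis=1), 1.0))
```

Note that SK is used here only to manufacture the small factors for the demonstration; as the abstract points out, a finite number of SK iterations yields only approximately doubly stochastic matrices, which is exactly the inexactness KromHC's factorized parameterization is designed to avoid at the level of the large matrix.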