🤖 AI Summary
This work addresses the high computational complexity of Clebsch–Gordan tensor products in E(3)-equivariant neural networks, which scales as O(L⁶) and has hindered scalability. Existing acceleration techniques often compromise expressive power by omitting critical interaction terms. To overcome this limitation, we propose the first asymptotically fast algorithm that fully preserves expressivity. By leveraging vector spherical harmonics, a generalized Gaunt formula, and convolution of signals in irreducible representations via fast Fourier transforms, our method exactly captures interaction terms neglected by prior approaches. This yields a reduced complexity of O(L⁴ log²L)—a significant improvement over the naive implementation—and closely approaches the theoretical lower bound of O(L⁴).
📝 Abstract
$E(3)$-equivariant neural networks have proven to be effective in a wide range of 3D modeling tasks. A fundamental operation of such networks is the tensor product, which allows interaction between different feature types. Because this operation scales poorly, there has been considerable work towards accelerating it. However, \citet{xieprice} have recently pointed out that most speedups come from a reduction in expressivity rather than true algorithmic improvements in computing Clebsch-Gordan tensor products. A modification of the Gaunt tensor product \citep{gaunt} can give a true asymptotic speedup but is incomplete and misses many interactions. In this work, we provide the first complete algorithm that truly provides asymptotic speedups for Clebsch-Gordan tensor products. For the full CGTP, our algorithm reduces the runtime complexity from the naive $O(L^6)$ to $O(L^4\log^2 L)$, close to the lower bound of $O(L^4)$. We first show how generalizing fast-Fourier-transform-based convolution naturally leads to the previously proposed Gaunt tensor product \citep{gaunt}. To remedy antisymmetry issues, we generalize from scalar signals to irrep-valued signals, giving rise to tensor spherical harmonics. We prove a generalized Gaunt formula for the tensor harmonics. Finally, we show that we only need up to vector-valued signals to recover the interactions missed by the Gaunt tensor product.
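To make the complexity gap concrete, here is a minimal sketch of the operation counts involved. The assumptions are mine, not the paper's: a dense Clebsch-Gordan contraction touches $(2\ell_1+1)(2\ell_2+1)(2\ell_3+1)$ coefficient entries for every admissible triple with $|\ell_1-\ell_2| \le \ell_3 \le \min(\ell_1+\ell_2, L)$, sparsity of the Clebsch-Gordan coefficients is ignored, and the fast algorithm is modeled with a constant of 1 (the paper does not state its constant).

```python
import math

def naive_cgtp_ops(L):
    """Count multiply-adds of a dense Clebsch-Gordan tensor product.

    Every admissible triple (l1, l2, l3), with l3 running over
    |l1 - l2| .. min(l1 + l2, L), contributes a contraction over
    (2l1+1)(2l2+1)(2l3+1) entries, giving O(L^6) total work.
    """
    ops = 0
    for l1 in range(L + 1):
        for l2 in range(L + 1):
            for l3 in range(abs(l1 - l2), min(l1 + l2, L) + 1):
                ops += (2 * l1 + 1) * (2 * l2 + 1) * (2 * l3 + 1)
    return ops

def fast_cgtp_ops(L):
    """Illustrative cost model of an O(L^4 log^2 L) algorithm
    (unit constant; not the paper's actual operation count)."""
    return L**4 * math.log2(max(L, 2)) ** 2

# The ratio of the two models grows roughly like L^2 / log^2 L.
for L in (4, 8, 16, 32):
    print(L, naive_cgtp_ops(L), round(fast_cgtp_ops(L)))
```

This is only a counting exercise: it reproduces the asymptotic scaling cited in the abstract, not any particular implementation.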