🤖 AI Summary
This work addresses the structural mismatch between dense architectures and the high-dimensional, extremely sparse inputs prevalent in recommender systems, a mismatch that limits performance gains as models scale. To overcome it, the authors propose SSR (Explicit Sparsity for Scalable Recommendation), a framework that, for the first time, builds structural sparsity explicitly into recommendation models. SSR employs a multi-view "filter-then-fuse" mechanism: it first applies sparse filtering at the feature-dimension level and then densely fuses the retained signals, matching the architecture to the sparsity of the input. The framework realizes this sparsity through two strategies: static random filtering over fixed dimension subsets, and Iterative Competitive Sparse (ICS), a biologically inspired, differentiable dynamic-sparsity mechanism. Experiments show that SSR consistently outperforms state-of-the-art models on three public benchmarks and a billion-scale industrial dataset from Alibaba's AliExpress, delivering sustained improvements with increasing model size and breaking through the saturation bottleneck of dense architectures.
📝 Abstract
Recent progress in scaling large models has motivated recommender systems to increase model depth and capacity to better leverage massive behavioral data. However, recommendation inputs are high-dimensional and extremely sparse, and simply scaling dense backbones (e.g., deep MLPs) often yields diminishing returns or even performance degradation. Our analysis of industrial CTR models reveals a phenomenon of implicit connection sparsity: most learned connection weights tend towards zero, while only a small fraction remain prominent. This indicates a structural mismatch between dense connectivity and sparse recommendation data; by compelling the model to process vast low-utility connections instead of valid signals, the dense architecture itself becomes the primary bottleneck to effective pattern modeling. We propose **SSR** (Explicit **S**parsity for **S**calable **R**ecommendation), a framework that incorporates sparsity explicitly into the architecture. SSR employs a multi-view "filter-then-fuse" mechanism, decomposing inputs into parallel views for dimension-level sparse filtering followed by dense fusion. Specifically, we realize the sparsity via two strategies: a Static Random Filter that achieves efficient structural sparsity via fixed dimension subsets, and Iterative Competitive Sparse (ICS), a differentiable dynamic mechanism that employs bio-inspired competition to adaptively retain high-response dimensions. Experiments on three public datasets and a billion-scale industrial dataset from AliExpress (a global e-commerce platform) show that SSR outperforms state-of-the-art baselines under similar budgets. Crucially, SSR exhibits superior scalability, delivering continuous performance gains where dense models saturate.
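To make the "filter-then-fuse" idea concrete, here is a minimal NumPy sketch of the Static Random Filter variant described above. It is an illustrative reading of the abstract, not the authors' implementation: the function names (`make_static_random_filters`, `filter_then_fuse`), the number of views, the keep ratio, and the single linear fusion layer are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_static_random_filters(input_dim, num_views, keep_ratio, rng):
    """One fixed random subset of feature dimensions per view (Static Random Filter).

    The subsets are sampled once and then frozen, giving structural sparsity
    with no learned gating. (Hypothetical helper; names are illustrative.)
    """
    k = max(1, int(input_dim * keep_ratio))
    return [rng.choice(input_dim, size=k, replace=False) for _ in range(num_views)]

def filter_then_fuse(x, filters, fuse_weights):
    """Dimension-level sparse filtering per view, then dense fusion."""
    views = [x[:, idx] for idx in filters]   # each view keeps only its fixed dimension subset
    h = np.concatenate(views, axis=1)        # gather the retained signals across views
    return h @ fuse_weights                  # dense fusion (a single linear map here)

# Toy configuration: 32 input dims, 4 views, each view keeps 25% of dims.
input_dim, num_views, keep_ratio, out_dim = 32, 4, 0.25, 8
filters = make_static_random_filters(input_dim, num_views, keep_ratio, rng)
W = rng.standard_normal((num_views * int(input_dim * keep_ratio), out_dim)) * 0.1

x = rng.standard_normal((5, input_dim))  # a batch of 5 dense-embedded inputs
y = filter_then_fuse(x, filters, W)
print(y.shape)  # (5, 8)
```

The ICS variant would replace the fixed index subsets with a differentiable, competition-based selection of high-response dimensions; the fuse step stays dense in both cases.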