DimGrow: Memory-Efficient Field-level Embedding Dimension Search

📅 2025-05-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
In recommendation systems, heterogeneous feature fields exhibit diverse requirements for embedding dimensions, necessitating automated dimension allocation; however, existing hypernetwork-based neural architecture search (NAS) or pruning methods incur prohibitive memory overhead, limiting scalability to large-scale feature spaces. This paper proposes a hypernetwork-free, progressive embedding dimension search framework: starting from a uniform low-dimensional initialization, it dynamically adjusts the embedding dimension of each field based on learnable importance scores, employing a threshold-driven mechanism for lightweight, adaptive expansion and contraction. By avoiding exhaustive enumeration of dimension combinations, the method drastically reduces training memory consumption—up to 83%—while matching the accuracy of hypernetwork-based baselines on three mainstream recommendation benchmarks. The approach achieves a principled trade-off between model accuracy and computational efficiency, enabling scalable, adaptive embedding dimension optimization.

📝 Abstract
Some feature fields require larger embedding dimensions while others need smaller ones, which calls for automated dimension allocation. Existing approaches, such as pruning or Neural Architecture Search (NAS), require training a memory-intensive SuperNet that enumerates all possible dimension combinations, which is infeasible for large feature spaces. We propose DimGrow, a lightweight approach that eliminates the SuperNet requirement. Starting from a single dimension per feature field, DimGrow progressively expands or shrinks each field's dimension via importance scoring. A dimension grows only when its importance consistently exceeds a threshold, ensuring memory efficiency. Experiments on three recommendation datasets verify the effectiveness of DimGrow while reducing training memory compared to SuperNet-based methods.
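The grow/shrink mechanism described above can be sketched as a small controller. This is a hypothetical illustration under stated assumptions, not the authors' implementation: the class name, the `patience` parameter, and the one-step grow/shrink policy are all assumptions; the paper's actual importance scores are learned jointly with the model.

```python
class DimGrowController:
    """Sketch of a DimGrow-style dimension controller (assumed interface).

    Each feature field starts at one embedding dimension and grows only
    when its importance score stays above a threshold for `patience`
    consecutive checks, mirroring the "consistently exceeds" rule.
    """

    def __init__(self, fields, threshold=0.5, patience=3, max_dim=64):
        self.dims = {f: 1 for f in fields}    # start from one dimension per field
        self.streak = {f: 0 for f in fields}  # consecutive above-threshold checks
        self.threshold = threshold
        self.patience = patience
        self.max_dim = max_dim

    def update(self, importance):
        """importance: dict mapping field name -> current importance score."""
        for f, score in importance.items():
            if score > self.threshold:
                self.streak[f] += 1
                # Grow only after the score has been *consistently* high.
                if self.streak[f] >= self.patience and self.dims[f] < self.max_dim:
                    self.dims[f] += 1
                    self.streak[f] = 0
            else:
                self.streak[f] = 0
                # Shrink a field whose extra capacity looks unimportant.
                if self.dims[f] > 1:
                    self.dims[f] -= 1
```

Because no SuperNet enumerating all dimension combinations is ever materialized, memory stays proportional to the currently allocated dimensions.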
Problem

Research questions and friction points this paper is trying to address.

Automate embedding dimension allocation for feature fields
Reduce memory usage in embedding dimension search
Eliminate SuperNet requirement in dimension optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Eliminates SuperNet for memory efficiency
Progressively adjusts embedding dimensions dynamically
Uses importance scoring for dimension allocation