Maximum Impact with Fewer Features: Efficient Feature Selection for Cold-Start Recommenders through Collaborative Importance Weighting

📅 2025-08-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
In cold-start recommendation, auxiliary features often contain redundancy and noise, degrading model performance and increasing computational overhead. To address this, we propose a user-behavior-prioritized efficient feature selection method. First, we model user-item interactions and auxiliary feature correlations via hybrid matrix factorization, enhanced by collaborative behavioral relevance to improve feature representation. Second, we integrate the Maximum Volume (MaxVol) criterion to assess and rank feature importance, marking the first integration of volume maximization with hybrid decomposition for cold-start feature selection. Extensive experiments across multiple datasets and recommendation models demonstrate that our method maintains high accuracy while retaining only 5%–10% of features, significantly outperforming existing feature selection baselines. Moreover, it reduces memory consumption and training time, achieving both effectiveness and efficiency.

📝 Abstract
Cold-start challenges in recommender systems necessitate leveraging auxiliary features beyond user-item interactions. However, irrelevant or noisy features can degrade predictive performance, while an excessive number of features increases computational demands, leading to higher memory consumption and prolonged training times. To address this, we propose a feature selection strategy that prioritizes user behavioral information. Our method enhances the feature representation by incorporating correlations from collaborative behavior data using a hybrid matrix factorization technique, and then ranks features using a mechanism based on the maximum volume (MaxVol) algorithm. This approach identifies the most influential features, striking a balance between recommendation accuracy and computational efficiency. We conduct an extensive evaluation across various datasets and hybrid recommendation models, demonstrating that our method excels in cold-start scenarios by selecting minimal yet highly effective feature subsets. Even under strict feature reduction, our approach surpasses existing feature selection techniques while maintaining superior efficiency.
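The maximum-volume ranking step described above can be illustrated with a minimal greedy MaxVol sketch: given a tall feature-embedding matrix, it selects the subset of rows (features) whose square submatrix has locally maximal absolute determinant. This is a generic textbook MaxVol, not the paper's implementation; the function name, swap tolerance, and initial index set are assumptions.

```python
import numpy as np

def maxvol_select(A, k, n_iter=100, tol=1.05):
    """Greedy MaxVol: pick k rows of A (shape n x k) whose k x k
    submatrix has locally maximal absolute determinant.
    Illustrative sketch; the paper's actual ranking mechanism may differ."""
    n, r = A.shape
    assert k == r, "classic MaxVol selects r rows of an n x r matrix"
    idx = np.arange(k)  # naive starting set; QR column pivoting is a better init
    for _ in range(n_iter):
        B = A @ np.linalg.inv(A[idx])  # coefficients of all rows in the basis
        i, j = np.unravel_index(np.argmax(np.abs(B)), B.shape)
        if abs(B[i, j]) < tol:         # no swap grows the volume enough: stop
            break
        idx[j] = i                     # swap basis row j for candidate row i
    return np.sort(idx)
```

Each swap multiplies the submatrix determinant by `|B[i, j]| > 1`, so the selected-feature "volume" grows monotonically until no row outside the basis dominates.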
Problem

Research questions and friction points this paper is trying to address.

Address cold-start challenges in recommender systems
Reduce irrelevant or noisy features for better performance
Balance recommendation accuracy and computational efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid matrix factorization for feature enhancement
Maximum volume algorithm for feature ranking
Minimal effective feature subsets for cold-start
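The hybrid-factorization idea in the bullets above can be sketched as a toy content-aware MF in which item factors are driven by item features, so the learned feature-embedding matrix can later be scored by a MaxVol-style ranker. All names, the loss, and the optimizer here are illustrative assumptions, not the paper's model.

```python
import numpy as np

def hybrid_mf(R, F, d=8, lr=0.01, reg=0.1, epochs=500, seed=0):
    """Toy hybrid MF: approximate interactions R (users x items) by
    U @ (F @ W).T, tying item factors to item features F
    (items x num_features). Rows of W act as feature embeddings.
    Plain masked-squared-error gradient descent; illustrative only."""
    rng = np.random.default_rng(seed)
    n_users, _ = R.shape
    U = rng.standard_normal((n_users, d)) * 0.1
    W = rng.standard_normal((F.shape[1], d)) * 0.1
    mask = R != 0                          # treat zeros as unobserved
    for _ in range(epochs):
        V = F @ W                          # feature-driven item factors
        E = mask * (U @ V.T - R)           # error on observed cells only
        U -= lr * (E @ V + reg * U)        # gradient step on user factors
        W -= lr * (F.T @ (E.T @ U) + reg * W)  # gradient step on embeddings
    return U, W
```

Because `W` has one row per auxiliary feature, ranking and pruning its rows is what yields the small feature subsets the paper targets.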