🤖 AI Summary
Existing personalized search methods overlook users’ pre-search consultation behaviors, thus failing to capture their underlying intent. This paper addresses the e-commerce scenario and proposes the first dynamic modeling framework for the “consultation → intent evolution → search” process. We introduce a Mixture-of-Attention Experts (MoAE) mechanism coupled with a dual-alignment framework—integrating contrastive learning and bidirectional attention—to jointly model latent consultation-driven intent, query semantics, user preferences, and item features, while fusing heterogeneous textual sources (consultations, reviews, and product descriptions). Our approach effectively aligns cross-modal semantics, mitigates historical behavioral noise, and bridges category–text semantic gaps. Extensive experiments on both real-world and synthetic datasets demonstrate significant improvements: +12.6% in Recall@10 and +9.8% in NDCG@10 over state-of-the-art methods.
📝 Abstract
Personalized product search aims to retrieve and rank items that match users' preferences and search intent. Despite their effectiveness, existing approaches typically assume that a user's query fully captures their real motivation. However, our analysis of a real-world e-commerce platform reveals that users often engage in relevant consultations before searching, refining their intent through these consultations according to their underlying motivations and needs. The motivation implied in consultations is thus a key factor for enhancing personalized search. This unexplored area brings new challenges, including aligning context-rich motivations with concise queries, bridging the category–text semantic gap, and filtering noise from users' behavior history. To address these challenges, we propose a Motivation-Aware Personalized Search (MAPS) method. MAPS embeds queries and consultations into a unified semantic space via LLMs, uses a Mixture of Attention Experts (MoAE) to prioritize critical semantics, and introduces a dual alignment: (1) contrastive learning aligns consultations, reviews, and product features; (2) bidirectional attention integrates motivation-aware embeddings with user preferences. Extensive experiments on real and synthetic data show that MAPS outperforms existing methods on both retrieval and ranking tasks.
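The abstract does not give implementation details for the Mixture of Attention Experts, so the following is only a minimal NumPy sketch of the general idea as described: several attention "experts" (each with its own hypothetical projection matrices) attend over a user's consultation/behavior history, and a query-conditioned gate weights their outputs into a single motivation-aware representation. All shapes, weights, and names here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Standard scaled dot-product attention
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d, n_experts, seq_len = 16, 3, 5

# Hypothetical per-expert projection matrices (illustrative only)
Wq = rng.standard_normal((n_experts, d, d)) / np.sqrt(d)
Wk = rng.standard_normal((n_experts, d, d)) / np.sqrt(d)
Wv = rng.standard_normal((n_experts, d, d)) / np.sqrt(d)
Wg = rng.standard_normal((d, n_experts)) / np.sqrt(d)  # gating weights

query_emb = rng.standard_normal(d)           # e.g. an LLM embedding of the search query
history = rng.standard_normal((seq_len, d))  # embeddings of prior consultations/behaviors

# Each expert attends over the history with its own projections
expert_outputs = np.stack([
    attention((query_emb @ Wq[i])[None, :], history @ Wk[i], history @ Wv[i])[0]
    for i in range(n_experts)
])  # shape (n_experts, d)

# Gate: softmax over experts, conditioned on the query embedding
gate = softmax(query_emb @ Wg)   # shape (n_experts,), sums to 1
fused = gate @ expert_outputs    # shape (d,): motivation-aware representation
```

In this sketch the gate lets the model emphasize whichever expert's attention pattern best matches the current query, which is one plausible reading of "prioritizing critical semantics" across heterogeneous inputs.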