Minimally Supervised Hierarchical Domain Intent Learning for CRS

📅 2025-05-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing conversational recommender systems (CRS) struggle with intent modeling under dynamic domain structures, as they rely heavily on labor-intensive manual annotation and lack mechanisms for continuous evolution. Method: This paper proposes a weakly supervised hierarchical intent modeling framework. It introduces a novel attention-driven neural hierarchical clustering algorithm that integrates deep embedded clustering (DEC) and neural attention modeling (NAM), enabling automatic discovery and organization of evolvable, multi-granularity intent hierarchies from minimal user utterances, eliminating the need for frequent retraining. A hierarchical sampling strategy is further incorporated to improve the efficiency of structure discovery. Contribution/Results: Evaluated on a 44K-utterance real-world restaurant-domain dataset, the method significantly reduces the number of utterances required to cover dynamic intents, while improving CRS scalability and adaptability in evolving structural scenarios.

📝 Abstract
Modeling domain intent within an evolving domain structure presents a significant challenge for domain-specific conversational recommendation systems (CRS). The conventional approach involves training an intent model on utterance-intent pairs. However, as new intents and patterns emerge, the model must be continuously updated while preserving existing relationships and maintaining efficient retrieval. This process leads to substantial growth in the number of utterance-intent pairs, making manual labeling increasingly costly and impractical. In this paper, we propose an efficient solution for constructing a dynamic hierarchical structure that minimizes the number of user utterances required to achieve adequate domain knowledge coverage. To this end, we introduce a neural network-based, attention-driven hierarchical clustering algorithm designed to optimize intent grouping using minimal data. The proposed method builds upon and integrates concepts from two existing flat clustering algorithms, DEC and NAM, both of which utilize neural attention mechanisms. We apply our approach to a curated subset of 44,000 questions from the business food domain. Experimental results demonstrate that constructing the hierarchy using a stratified sampling strategy significantly reduces the number of questions needed to represent the evolving intent structure. Our findings indicate that this approach enables efficient coverage of dynamic domain knowledge without frequent retraining, thereby enhancing scalability and adaptability in domain-specific CRSs.
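The abstract credits much of the data reduction to a stratified sampling strategy over the intent structure. The paper's exact procedure is not shown here; as an illustrative sketch (the function name and per-stratum quota are assumptions, not the authors' code), sampling a fixed quota of utterances from each intent stratum keeps rare intents represented while capping annotation cost:

```python
import random
from collections import defaultdict

def stratified_sample(utterances, labels, per_stratum, seed=0):
    """Draw up to `per_stratum` utterances from each intent stratum,
    so every intent stays covered with few labeled examples overall."""
    rng = random.Random(seed)
    by_intent = defaultdict(list)
    for utterance, intent in zip(utterances, labels):
        by_intent[intent].append(utterance)
    sample = []
    for intent in sorted(by_intent):
        group = by_intent[intent]
        # Rare intents contribute all their utterances; common ones are capped.
        sample.extend(rng.sample(group, min(per_stratum, len(group))))
    return sample
```

With three utterances of intent 0 and two of intent 1, a quota of one per stratum yields a two-utterance sample covering both intents, rather than a uniform sample that could miss the rarer one.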
Problem

Research questions and friction points this paper is trying to address.

Modeling evolving domain intent hierarchies for conversational recommendation systems
Reducing manual labeling costs in dynamic intent structure learning
Achieving efficient domain coverage with minimal training data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Attention-driven hierarchical clustering algorithm
Minimizes user utterances for domain coverage
Integrates DEC and NAM neural mechanisms
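The page does not reproduce the algorithm itself. As a minimal sketch of the two ingredients it names (all function names, the two-level structure, and the centroid values below are illustrative assumptions, not the authors' implementation), an utterance embedding could be routed through a coarse intent level by NAM-style attention and refined by a DEC-style Student-t soft assignment:

```python
import math

def soft_assign(z, centroids):
    """DEC-style soft assignment: Student-t similarity of embedding z
    to each centroid, normalized to a probability distribution."""
    sims = [1.0 / (1.0 + sum((a - b) ** 2 for a, b in zip(z, c)))
            for c in centroids]
    total = sum(sims)
    return [s / total for s in sims]

def attention_weights(z, centroids):
    """NAM-style attention: softmax over dot-product scores of z
    against the coarse intent centroids."""
    scores = [sum(a * b for a, b in zip(z, c)) for c in centroids]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def assign_hierarchy(z, coarse_centroids, fine_centroids):
    """Two-level assignment: attention selects a coarse intent,
    then DEC soft assignment picks among that intent's children."""
    att = attention_weights(z, coarse_centroids)
    coarse = max(range(len(att)), key=lambda j: att[j])
    q = soft_assign(z, fine_centroids[coarse])
    fine = max(range(len(q)), key=lambda j: q[j])
    return coarse, fine
```

In a learned system the centroids would be trained jointly with the encoder; this sketch only shows how the two scoring mechanisms compose into a multi-granularity assignment.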
Authors

Safikureshi Mondal
San Diego Supercomputer Center, University of California San Diego, USA

Subhasis Dasgupta
San Diego Supercomputer Center, University of California San Diego, USA
Research areas: Query Processing, Access Control, Information Security, Knowledge Graph

Amarnath Gupta
San Diego Supercomputer Center, University of California San Diego, USA