CFiCS: Graph-Based Classification of Common Factors and Microcounseling Skills

📅 2025-03-28
🤖 AI Summary
This study addresses the fine-grained identification of common therapeutic factors and micro-level counseling skills in psychotherapy dialogue, a task made difficult by the context sensitivity and hierarchical nesting of the concepts involved. The authors propose CFiCS, an inductive graph neural network framework that integrates a heterogeneous graph of therapy concepts with ClinicalBERT. The graph explicitly encodes the hierarchical relationships among concepts, and ClinicalBERT's semantic representations serve as node features for a GNN that learns inductive node embeddings, enabling generalization to unseen samples with no explicit graph connections. A hierarchical multi-task classification architecture jointly optimizes macro-level factor recognition and micro-level skill recognition. Experiments show the method significantly outperforms baselines, including random forests, BERT-based multi-task models, and standalone graph models, on both macro and micro F1 scores, with the largest gains in fine-grained skill classification. The approach offers an interpretable and generalizable computational foundation for modeling psychotherapeutic processes.
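The pipeline described above (text features on concept nodes, a shared inductive GNN layer, and two classification heads) can be sketched in miniature. This is a hedged illustration only, not the paper's implementation: the graph, dimensions, and weights below are toy placeholders (the paper uses ClinicalBERT's 768-dimensional embeddings; here `d = 8` stands in), and the GraphSAGE-style mean aggregation is one plausible inductive layer, not necessarily the authors' exact choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heterogeneous concept graph: nodes 0-1 are common factors,
# nodes 2-4 are skills linked to their parent factor (hypothetical layout).
edges = {0: [2, 3], 1: [4], 2: [0], 3: [0], 4: [1]}
d = 8                                 # stand-in for ClinicalBERT's 768 dims
X = rng.normal(size=(5, d))           # per-node text embeddings (placeholder)

def sage_layer(X, edges, W_self, W_neigh):
    """One GraphSAGE-style layer: combine each node's own features with the
    mean of its neighbours' features. Weights are shared across nodes, which
    is what makes the embedding inductive (no per-node parameters)."""
    out = np.zeros((X.shape[0], W_self.shape[1]))
    for i, nbrs in edges.items():
        agg = X[nbrs].mean(axis=0) if nbrs else np.zeros(X.shape[1])
        out[i] = X[i] @ W_self + agg @ W_neigh
    return np.maximum(out, 0.0)       # ReLU

h = 16
W_self, W_neigh = rng.normal(size=(d, h)), rng.normal(size=(d, h))
H = sage_layer(X, edges, W_self, W_neigh)

# Hierarchical multi-task heads: one over macro-level common factors,
# one over micro-level skills, both reading the shared embeddings H.
n_factors, n_skills = 2, 3
W_factor = rng.normal(size=(h, n_factors))
W_skill = rng.normal(size=(h, n_skills))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

factor_probs = softmax(H @ W_factor)
skill_probs = softmax(H @ W_skill)
print(factor_probs.shape, skill_probs.shape)  # (5, 2) (5, 3)
```

In training, both heads would be optimized jointly (e.g. a summed cross-entropy loss), which is what "hierarchical multi-task classification" refers to here.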

📝 Abstract
Common factors and microcounseling skills are critical to the effectiveness of psychotherapy. Understanding and measuring these elements provides valuable insights into therapeutic processes and outcomes. However, automatic identification of these change principles from textual data remains challenging due to the nuanced and context-dependent nature of therapeutic dialogue. This paper introduces CFiCS, a hierarchical classification framework integrating graph machine learning with pretrained contextual embeddings. We represent common factors, intervention concepts, and microcounseling skills as a heterogeneous graph, where textual information from ClinicalBERT enriches each node. This structure captures both the hierarchical relationships (e.g., skill-level nodes linking to broad factors) and the semantic properties of therapeutic concepts. By leveraging graph neural networks, CFiCS learns inductive node embeddings that generalize to unseen text samples lacking explicit connections. Our results demonstrate that integrating ClinicalBERT node features and graph structure significantly improves classification performance, especially in fine-grained skill prediction. CFiCS achieves substantial gains in both micro and macro F1 scores across all tasks compared to baselines, including random forests, BERT-based multi-task models, and graph-based methods.
Problem

Research questions and friction points this paper is trying to address.

Automatic identification of therapeutic change principles from text.
Challenges due to nuanced, context-dependent therapy dialogue.
Hierarchical classification of common factors and counseling skills.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph machine learning with ClinicalBERT embeddings.
Heterogeneous graph for therapeutic concepts.
Graph neural networks for inductive learning.
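The inductive-learning point deserves a concrete illustration: because the GNN's weights are shared rather than tied to specific nodes, a brand-new text sample with ClinicalBERT features but zero graph connections can still be embedded and classified. The sketch below is a toy approximation under the same placeholder assumptions as before (random weights, `d = 8` instead of 768, a hypothetical two-class factor head), not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(1)
d, h = 8, 16                          # toy dims; ClinicalBERT uses 768

# Weights learned on the training graph (random here for illustration).
W_self = rng.normal(size=(d, h))
W_neigh = rng.normal(size=(d, h))
W_factor = rng.normal(size=(h, 2))    # 2 hypothetical common-factor classes

def embed(x_text, neighbour_feats):
    """Inductive embedding: the same shared weights apply to any node.
    For an unseen sample with no edges, the neighbour aggregate is zero
    and only the self-feature (text) path contributes."""
    agg = neighbour_feats.mean(axis=0) if len(neighbour_feats) else np.zeros(d)
    return np.maximum(x_text @ W_self + agg @ W_neigh, 0.0)

# Unseen dialogue snippet: ClinicalBERT features available, no graph edges.
x_new = rng.normal(size=d)
z = embed(x_new, np.empty((0, d)))    # zero-connection case
pred = int(np.argmax(z @ W_factor))   # macro-level factor prediction
```

This is the property that lets CFiCS generalize to test-time text samples that never appeared in the training graph.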