Differential Mental Disorder Detection with Psychology-Inspired Multimodal Stimuli

📅 2026-04-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the diagnostic challenge posed by symptom overlap among psychiatric disorders such as depression, anxiety, and schizophrenia. To this end, it proposes a psychology-inspired multimodal stimulation paradigm designed to elicit differential responses across emotional, cognitive, and behavioral dimensions, and introduces the first large-scale multimodal mental health dataset (MMH) with clinically validated diagnostic labels. Guided by psychological theory, the stimulation protocol informs a paradigm-aware multimodal framework that leverages inter-disorder difference priors as semantic cues to enhance task-specific modeling of affective and interactive context. Experimental results show that the proposed approach significantly outperforms existing baselines in differential diagnosis, validating the efficacy of psychology-driven stimulus design for improving the accuracy of psychiatric disorder identification.
📝 Abstract
Differential diagnosis of mental disorders remains a fundamental challenge in real-world clinical practice, where multiple conditions often exhibit overlapping symptoms. However, most existing public datasets are developed under single-disorder settings and rely on limited data elicitation paradigms, restricting their ability to capture disorder-specific patterns. In this work, we investigate differential mental disorder detection through psychology-inspired multimodal stimuli, designed to elicit diverse emotional, cognitive, and behavioral responses grounded in findings from experimental psychology. Based on this paradigm, we collect a large-scale multimodal mental health dataset (MMH) covering depression, anxiety, and schizophrenia, with all diagnostic labels clinically verified by licensed psychiatrists. To effectively model the heterogeneous signals induced by diverse elicitation tasks, we further propose a paradigm-aware multimodal framework that leverages prior knowledge of inter-disorder differences as prompt-guided semantic descriptions, capturing task-specific affective and interaction contexts for multimodal representation learning on the new differential mental disorder detection task. Extensive experiments show that our framework consistently outperforms existing baselines, underscoring the value of psychology-inspired stimulus design for differential mental disorder detection.
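The abstract's core idea, conditioning multimodal features on a prompt that encodes the elicitation task and inter-disorder difference priors, can be illustrated with a minimal sketch. This is not the authors' implementation: the embedding dimension, the task names, the `task_prompts` table, and the sigmoid gating in `fuse` are all hypothetical stand-ins (in the paper, prompt embeddings would come from a trained text encoder and the fusion from a learned network).

```python
import numpy as np

rng = np.random.default_rng(0)

D = 16  # shared embedding dimension (hypothetical)
DISORDERS = ["depression", "anxiety", "schizophrenia"]

# Hypothetical prompt embeddings standing in for prompt-guided semantic
# descriptions of inter-disorder differences, one per elicitation task.
task_prompts = {
    "emotional": rng.normal(size=D),
    "cognitive": rng.normal(size=D),
    "behavioral": rng.normal(size=D),
}

def fuse(features_by_modality, task):
    """Paradigm-aware fusion: gate each modality by the task prompt."""
    prompt = task_prompts[task]
    gated = []
    for feat in features_by_modality.values():
        # Sigmoid of the scaled feature-prompt similarity acts as a gate,
        # emphasizing modalities aligned with this task's semantic prior.
        gate = 1.0 / (1.0 + np.exp(-(feat @ prompt) / np.sqrt(D)))
        gated.append(gate * feat)
    return np.mean(gated, axis=0)

def classify(fused, weights):
    """Map the fused representation to a disorder label."""
    logits = weights @ fused
    return DISORDERS[int(np.argmax(logits))]

# Toy usage: random features for three modalities, untrained classifier head.
feats = {m: rng.normal(size=D) for m in ("audio", "video", "text")}
W = rng.normal(size=(len(DISORDERS), D))
pred = classify(fuse(feats, "emotional"), W)
```

With random weights the predicted label is arbitrary; the sketch only shows how a task-conditioned prompt can modulate per-modality features before classification.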
Problem

Research questions and friction points this paper is trying to address.

differential diagnosis
mental disorders
overlapping symptoms
multimodal stimuli
psychology-inspired
Innovation

Methods, ideas, or system contributions that make the work stand out.

differential diagnosis
psychology-inspired stimuli
multimodal mental health dataset
paradigm-aware framework
prompt-guided representation learning
Zhiyuan Zhou
PhD student, UC Berkeley
Robotics, Reinforcement Learning
Jingjing Wu
Hefei University of Technology
computer vision, tracking, person re-id
Zhibo Lei
Hefei University of Technology
Zhongcheng Yu
Hefei University of Technology
Yuqi Chu
Hefei University of Technology
Xiaowei Zhang
Lanzhou University
Qiqi Zhao
Lanzhou University
Qi Wang
The Hong Kong University of Science and Technology
Scene Generation
Shijie Hao
Hefei University of Technology
Yanrong Guo
Hefei University of Technology
Richang Hong
Hefei University of Technology
Multimedia, Pattern Recognition