AI Summary
This study addresses the challenge of users struggling to obtain effective peer support from the overwhelming volume of information in online health communities, which hinders chronic disease management. Employing a two-phase mixed-methods approach (N=165), the research evaluates users' perceived value, functional preferences, and conditions for adopting algorithm-driven personalized support groups through surveys and interviews. Findings reveal that 62.8% of participants assigned the highest rating to the proposed system, and 91.5% expressed willingness to join; moreover, peer-matching quality showed a strong positive correlation with perceived value (ρ = 0.764, p < 0.001). The study uncovers a conditional acceptance mechanism toward AI-facilitated peer support, highlighting trust, privacy protection, algorithmic transparency, human oversight, and user control over data as critical prerequisites, thereby offering empirical grounding for human-centered AI design in digital health interventions.
Abstract
Peer support is critical to managing chronic health conditions. Online health communities (OHCs) enable patients and caregivers to connect with similar others, yet their large scale makes it challenging to find the most relevant peers and content. This study assessed perceived value, preferred features, and acceptance conditions for algorithmically personalized support group formation within OHCs. A two-phase, mixed-methods survey (N=165) examined OHC participation patterns, personalization priorities, and acceptance of a simulated personalized support group. Perceived value of the simulated support group was high (mean 4.55/5; 62.8% rated 5/5), and 91.5% of participants would join this group. The importance participants placed on peer matching strongly correlated with perceived value (ρ = 0.764, p < 0.001). Qualitative findings revealed conditional acceptance: participants demand security, transparency, human oversight, and user control over data. Personalized support groups may be desired, but they will not be adopted unless trust, privacy, and algorithmic governance concerns are addressed.