🤖 AI Summary
This study addresses the cross-ideological identification and early prediction of online radicalization, focusing on the "incel" community as a prototypical case. Methodologically, it integrates natural language processing, behavioral sequence modeling, and multidimensional psychosocial feature extraction, combining supervised learning with longitudinal user-history analysis to quantify individual and collective radicalization tendencies from community discourse. It introduces "The Extremist Eleven," a generalizable eleven-factor psychosocial model of extremism that moves beyond single-ideology and static-trait paradigms. Empirical evaluation shows that the model predicts user migration to an extremist forum up to 10 months in advance (AUC > 0.6), with discrimination steadily rising to AUC ≈ 0.9 three to four months prior to entry. It also reliably distinguishes extremist forum participants from general online users. The work provides a transferable, unified modeling framework for cross-ideological radicalization risk detection and early warning.
📝 Abstract
The proliferation of ideological movements into extremist factions via social media has become a global concern. While radicalization has been studied extensively within the context of specific ideologies, our ability to characterize extremism in more generalizable terms remains underdeveloped. In this paper, we propose a novel method for extracting and analyzing extremist discourse across a range of online community forums. By focusing on verbal behavioral signatures of extremist traits, we develop a framework for quantifying extremism at both the user and community level. Our research identifies 11 distinct factors, which we term "The Extremist Eleven," as a generalized psychosocial model of extremism. Applying our method to various online communities, we demonstrate an ability to characterize ideologically diverse communities across the 11 extremist traits. We then demonstrate the power of this method by analyzing user histories from members of the incel community. We find that our framework predicts which users will join the incel community up to 10 months before their actual entry (AUC > 0.6), with performance steadily increasing to AUC ≈ 0.9 three to four months before the event. Further, we find that upon entry into an extremist forum, users tend to maintain their level of extremism within the community while still remaining distinguishable from general online discourse. Our findings contribute to the study of extremism by introducing a more holistic, cross-ideological approach that transcends traditional, trait-specific models.
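The abstract's headline numbers are ROC AUC values, i.e. the probability that a randomly chosen future forum joiner is scored above a randomly chosen non-joiner at a given lead time. As a minimal, self-contained sketch of how such a figure is computed, the snippet below implements AUC via the rank-sum (Mann-Whitney U) formulation; the labels and scores are purely illustrative, not from the paper, and the paper's actual pipeline and features are not reproduced here.

```python
# Illustrative sketch: ROC AUC for lead-time prediction, computed from
# scratch via the rank-sum (Mann-Whitney U) equivalence. The data below
# is invented for demonstration only.

def roc_auc(labels, scores):
    """ROC AUC: P(score of a positive > score of a negative), ties = 0.5."""
    # Assign 1-based ranks to all scores, averaging ranks over ties.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # average rank over the tie run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    # Mann-Whitney U statistic from the rank sum of the positive class.
    pos_ranks = [r for r, y in zip(ranks, labels) if y == 1]
    n_pos = len(pos_ranks)
    n_neg = len(labels) - n_pos
    u = sum(pos_ranks) - n_pos * (n_pos + 1) / 2.0
    return u / (n_pos * n_neg)

# Hypothetical example: 1 = user later joined the forum, 0 = did not;
# scores are the framework's extremism scores at some lead time.
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1]
print(round(roc_auc(labels, scores), 3))  # → 0.917
```

In an evaluation like the paper's, this computation would be repeated at each lead time (10 months out, 9 months out, ...), producing the reported curve from AUC > 0.6 up to AUC ≈ 0.9.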