π€ AI Summary
Existing semantic drift detection methods rely on fine-tuned BERT models, are restricted to adjacent time periods, and incur high computational costsβmaking them unsuitable for middle-school-level interpretability and scalability.
Method: We propose a lightweight, continuous, and unsupervised approach for modeling word semantic drift. Instead of contextualized embeddings, it constructs a temporal word embedding similarity matrix spanning multiple historical periods and designs a continuous semantic drift analysis framework based on this matrix. Spectral clustering is integrated to automatically identify interpretable drift patterns.
Contribution/Results: The method operates solely on static word vectors, requires no labeled data or model fine-tuning, and enables fine-grained characterization of multi-period semantic evolution on standard corpora. It significantly improves both detection depth and efficiency while reducing computational cost by an order of magnitude.
π Abstract
The meanings and relationships of words shift over time. This phenomenon is referred to as semantic shift. Research focused on understanding how semantic shifts occur over multiple time periods is essential for gaining a detailed understanding of semantic shifts. However, detecting change points only between adjacent time periods is insufficient for analyzing detailed semantic shifts, and using BERT-based methods to examine word sense proportions incurs a high computational cost. To address those issues, we propose a simple yet intuitive framework for how semantic shifts occur over multiple time periods by leveraging a similarity matrix between the embeddings of the same word through time. We compute a diachronic word similarity matrix using fast and lightweight word embeddings across arbitrary time periods, making it deeper to analyze continuous semantic shifts. Additionally, by clustering the similarity matrices for different words, we can categorize words that exhibit similar behavior of semantic shift in an unsupervised manner.