🤖 AI Summary
Traditional pre-trained language models (e.g., BERT) lack explicit temporal modeling capabilities, limiting their effectiveness on complex temporal reasoning tasks. To address this, we propose BiTimeBERT 2.0, a temporally aware language model pre-trained on a large-scale chronological news corpus. Our method introduces three novel time-aware pre-training objectives: Extended Time-Aware Masked Language Modeling (ETAMLM), Document Dating (DD), and Time-Sensitive Entity Replacement (TSER), complemented by a refined corpus preprocessing strategy that reduces training time by nearly 53%. Evaluated on diverse temporal reasoning benchmarks, including datasets spanning long temporal ranges, BiTimeBERT 2.0 consistently outperforms BERT and other baselines. These results empirically validate that explicit temporal modeling yields substantial gains in both language understanding and temporal reasoning.
📝 Abstract
In the evolving field of Natural Language Processing (NLP), understanding the temporal context of text is increasingly critical for applications requiring advanced temporal reasoning. Traditional pre-trained language models like BERT, which rely on synchronic document collections such as BookCorpus and Wikipedia, often fall short in effectively capturing and leveraging temporal information. To address this limitation, we introduce BiTimeBERT 2.0, a novel time-aware language model pre-trained on a temporal news article collection. BiTimeBERT 2.0 incorporates temporal information through three innovative pre-training objectives: Extended Time-Aware Masked Language Modeling (ETAMLM), Document Dating (DD), and Time-Sensitive Entity Replacement (TSER). Each objective is specifically designed to target a distinct dimension of temporal information: ETAMLM enhances the model's understanding of temporal contexts and relations, DD integrates document timestamps as explicit chronological markers, and TSER focuses on the temporal dynamics of "Person" entities. Moreover, our refined corpus preprocessing strategy reduces training time by nearly 53%, making BiTimeBERT 2.0 significantly more efficient while maintaining high performance. Experimental results show that BiTimeBERT 2.0 achieves substantial improvements across a broad range of time-related tasks and excels on datasets spanning extensive temporal ranges. These findings underscore BiTimeBERT 2.0's potential as a powerful tool for advancing temporal reasoning in NLP.
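To make the time-aware masking idea concrete, an ETAMLM-style objective can be illustrated as masking temporal expressions in text so the model must recover them from surrounding context. The sketch below is a minimal, hypothetical illustration: the regex, mask token, and function name are our own simplifications, not the paper's actual pipeline, which would rely on a proper temporal tagger over the news corpus.

```python
import re

# Toy pattern for temporal expressions: four-digit years and month
# names. A real implementation would use a temporal tagger (e.g.,
# HeidelTime-style annotation) rather than a regex.
TIME_PATTERN = re.compile(
    r"\b(?:19|20)\d{2}\b"
    r"|\b(?:January|February|March|April|May|June|July"
    r"|August|September|October|November|December)\b"
)

def mask_temporal_expressions(text: str, mask_token: str = "[MASK]") -> str:
    """Replace each matched temporal expression with a mask token,
    yielding an MLM training input focused on temporal information."""
    return TIME_PATTERN.sub(mask_token, text)

print(mask_temporal_expressions("The summit was held in July 2015."))
# -> The summit was held in [MASK] [MASK].
```

During pre-training, the model would then be asked to predict the original tokens ("July", "2015") at the masked positions, pushing it to encode temporal regularities that standard MLM over synchronic corpora rarely rewards.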