Transformer-Based Temporal Information Extraction and Application: A Review

📅 2025-04-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the absence of a systematic survey of Transformer-based approaches to temporal information extraction (temporal IE). It presents the first comprehensive review of more than 50 key works published between 2018 and 2024. Methodologically, it establishes a unified taxonomy that analyzes models along three dimensions: modeling paradigms (e.g., joint event-temporal modeling, temporal token enhancement), task adaptation strategies (e.g., prompt-based fine-tuning, multi-task learning), and evaluation frameworks, spanning domains such as healthcare, news, and intelligence analysis. The study identifies critical bottlenecks, notably poor cross-domain transferability and weak long-horizon temporal reasoning, and proposes three scalable future directions: lightweight temporal encoding, structured prompt design, and causal temporal modeling. Collectively, the work provides both theoretical foundations and practical guidelines for strengthening Transformers' capacity to understand and robustly reason over temporal structure in text.

📝 Abstract
Temporal information extraction (IE) aims to extract structured temporal information from unstructured text, thereby uncovering the implicit timelines within. This technique is applied across domains such as healthcare, newswire, and intelligence analysis, aiding models in these areas to perform temporal reasoning and enabling human users to grasp the temporal structure of text. Transformer-based pre-trained language models have produced revolutionary advancements in natural language processing, demonstrating exceptional performance across a multitude of tasks. Despite the achievements garnered by Transformer-based approaches in temporal IE, there is a lack of comprehensive reviews on these endeavors. In this paper, we aim to bridge this gap by systematically summarizing and analyzing the body of work on temporal IE using Transformers while highlighting potential future research directions.
Problem

Research questions and friction points the paper addresses.

Extracting structured temporal data from unstructured text
Applying Transformer models to improve temporal information extraction
Reviewing advancements and future directions in temporal IE
Innovation

Methods, ideas, and system contributions that make the work stand out.

Transformer-based models for temporal information extraction
Pre-trained language models enhancing temporal reasoning
Systematic review of temporal IE with Transformers