🤖 AI Summary
Existing $k$-core queries on temporal graphs scale poorly because of combinatorial explosion and computational redundancy, which arise from enumerating all sub-intervals and repeating core computations. To address this, we propose CoreT, a novel algorithm that introduces timestamped entry records for vertices and edges entering $k$-cores, coupled with dynamic timestamp propagation and inclusion-based interval analysis. This enables efficient pruning of overlapping time intervals, allowing $k$-core identification over any query interval via a single graph traversal. Experiments on multiple large-scale real-world datasets demonstrate that CoreT achieves up to four orders of magnitude speedup over the state-of-the-art OTCD algorithm, significantly enhancing scalability and practicality. CoreT establishes a new, efficient paradigm for dense subgraph analysis in dynamic networks.
📝 Abstract
Querying cohesive subgraphs in temporal graphs is essential for understanding the dynamic structure of real-world networks, such as evolving communities in social platforms, shifting hyperlink structures on the Web, and transient communication patterns in call networks. Recently, research has focused on the temporal $k$-core query, which aims to identify all $k$-cores across all possible time sub-intervals within a given query interval. The state-of-the-art algorithm OTCD mitigates redundant computation over overlapping sub-intervals by exploiting inclusion relationships among $k$-cores in different time intervals. Nevertheless, OTCD remains limited in scalability due to the combinatorial growth of interval enumeration and the repeated processing it entails. In this paper, we revisit the temporal $k$-core query problem and introduce a novel algorithm, CoreT, which dynamically records the earliest timestamp at which each vertex or edge enters a $k$-core. This strategy enables substantial pruning of redundant computation. As a result, CoreT requires only a single pass over the query interval, with time complexity linear in both the number of temporal edges within the query interval and the duration of the interval, making it highly scalable for long-term temporal analysis. Experimental results on large real-world datasets show that CoreT achieves up to four orders of magnitude speedup over the state-of-the-art OTCD, demonstrating its effectiveness and scalability for temporal $k$-core analysis.
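To make the central idea concrete, the sketch below illustrates, under assumptions, the notion of an "earliest $k$-core entry timestamp": as temporal edges accumulate over time, the $k$-core of the accumulated graph only grows, so each vertex has a well-defined first timestamp at which it joins the $k$-core. This is a deliberately naive illustration (it recomputes the $k$-core at every timestamp via standard peeling), not the CoreT algorithm itself, whose propagation and pruning machinery is described in the paper; all function names here are hypothetical.

```python
from collections import defaultdict

def kcore_vertices(adj, k):
    """Vertex set of the k-core of an undirected graph, by iterative peeling."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    removed = set()
    queue = [v for v, d in deg.items() if d < k]
    while queue:
        v = queue.pop()
        if v in removed:
            continue
        removed.add(v)
        for u in adj[v]:
            if u not in removed:
                deg[u] -= 1
                if deg[u] < k:
                    queue.append(u)
    return set(adj) - removed

def earliest_kcore_entry(temporal_edges, k):
    """For each vertex, the earliest timestamp at which it belongs to the
    k-core of the edges accumulated so far. Because edges only accumulate,
    the k-core grows monotonically, so the first entry time is well defined.
    Naive baseline: recomputes the k-core at every distinct timestamp."""
    adj = defaultdict(set)
    entry = {}
    for t in sorted({ts for _, _, ts in temporal_edges}):
        for u, v, ts in temporal_edges:
            if ts == t:
                adj[u].add(v)
                adj[v].add(u)
        for v in kcore_vertices(adj, k):
            entry.setdefault(v, t)  # record only the first time v appears
    return entry
```

For example, with edges `[("a","b",1), ("b","c",2), ("a","c",3)]` and `k=2`, the triangle closes at time 3, so all three vertices enter the 2-core with timestamp 3. CoreT's contribution is to obtain such entry records in a single traversal with timestamp propagation, rather than by per-timestamp recomputation as above.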