Bridging the Divide: End-to-End Sequence-Graph Learning

📅 2025-10-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Real-world data often exhibit both sequential characteristics (nodes associated with event sequences) and relational structure (edges representing interactions), yet existing methods typically model these aspects in isolation. To address this, we propose BRIDGE—a novel end-to-end framework for joint learning of sequential and graph-structured data. BRIDGE unifies sequence encoding, graph neural networks (GNNs), and a newly designed token-level cross-attention mechanism (TOKENXATTN) to enable fine-grained, event-level neighbor message passing and task-aligned representation learning. Evaluated on Brightkite for social link prediction and Amazon for fraud detection, BRIDGE consistently outperforms static GNNs, temporal graph models, and sequence-only baselines. Our results demonstrate the effectiveness and generalizability of jointly modeling sequential and relational dependencies, establishing BRIDGE as the first unified architecture for end-to-end sequence–graph co-learning.

📝 Abstract
Many real-world datasets are both sequential and relational: each node carries an event sequence while edges encode interactions. Existing methods in sequence modeling and graph modeling often neglect one modality or the other. We argue that sequences and graphs are not separate problems but complementary facets of the same dataset, and should be learned jointly. We introduce BRIDGE, a unified end-to-end architecture that couples a sequence encoder with a GNN under a single objective, allowing gradients to flow across both modules and learning task-aligned representations. To enable fine-grained token-level message passing among neighbors, we add TOKENXATTN, a token-level cross-attention layer that passes messages between events in neighboring sequences. Across two settings, friendship prediction (Brightkite) and fraud detection (Amazon), BRIDGE consistently outperforms static GNNs, temporal graph methods, and sequence-only baselines on ranking and classification metrics.
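The abstract describes TOKENXATTN as a token-level cross-attention layer that passes messages between events in neighboring sequences. The paper's actual implementation is not given here; the following is a minimal numpy sketch of the general idea, assuming a single attention head and pre-embedded event tokens (the function name and shapes are illustrative, not from the paper):

```python
import numpy as np

def token_cross_attention(q_seq, kv_seq, d_k):
    """Hypothetical TokenXAttn step: every event token in q_seq attends
    over the event tokens of one neighboring node's sequence kv_seq."""
    scores = q_seq @ kv_seq.T / np.sqrt(d_k)            # (Lq, Lk) token-pair affinities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)       # softmax over neighbor events
    return weights @ kv_seq                             # one message per query token

rng = np.random.default_rng(0)
d = 16
node_events = rng.standard_normal((5, d))      # 5 events on the target node
neighbor_events = rng.standard_normal((8, d))  # 8 events on one neighbor
msg = token_cross_attention(node_events, neighbor_events, d)
print(msg.shape)  # (5, 16): fine-grained, event-level messages
```

Each target-node event receives its own aggregated message rather than a single pooled neighbor summary, which is what makes the message passing "token-level."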
Problem

Research questions and friction points this paper is trying to address.

Jointly modeling sequential and relational data modalities
Enabling token-level message passing between neighboring sequences
Improving performance on friendship prediction and fraud detection tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

End-to-end sequence-graph joint learning architecture
Token-level cross-attention for neighbor message passing
Unified objective coupling sequence encoder with GNN
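The coupling described above can be sketched as one forward pass in which per-node sequence embeddings feed directly into GNN message passing, so a single task loss would backpropagate through both stages. This is a toy numpy sketch under assumed components (mean-pool sequence encoder, one averaging GNN layer, dot-product link score); none of these specifics come from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 16
# Toy graph: 3 nodes, each carrying an event sequence of varying length.
sequences = [rng.standard_normal((n, d)) for n in (4, 6, 3)]
adj = {0: [1], 1: [0, 2], 2: [1]}  # undirected friendship edges

def encode(seq, W):
    # Placeholder sequence encoder: projected, nonlinear mean-pool.
    return np.tanh(seq @ W).mean(axis=0)

W = rng.standard_normal((d, d)) / np.sqrt(d)
h = np.stack([encode(s, W) for s in sequences])  # (3, d) sequence embeddings

# One GNN layer: mix each node's embedding with its neighbors' mean.
h_gnn = np.stack([0.5 * h[i] + 0.5 * h[adj[i]].mean(axis=0) for i in range(3)])

# Link score for friendship prediction between nodes 0 and 1.
score = float(h_gnn[0] @ h_gnn[1])
print(h_gnn.shape)  # (3, 16)
```

In the end-to-end setting the paper argues for, `W` and the GNN weights would be trained jointly against the ranking or classification objective, rather than pre-training the sequence encoder separately.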