🤖 AI Summary
Detecting payment fraud in real-time banking transaction streams poses significant challenges due to the irregular temporal dynamics and complex sequential dependencies inherent in such data.
Method: This paper proposes a sequence-based approach that jointly models temporal order and irregular inter-event intervals. Its core innovation lies in a dedicated time encoder and a learnable positional encoder, which collaboratively capture absolute timestamps and relative event intervals to enhance sensitivity to dynamic temporal patterns. Built upon the GPT architecture, the model is trained end-to-end on large-scale, industrial-grade transaction sequences.
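The time encoder itself is not spelled out here; a minimal sketch of one plausible design, assuming sinusoidal features computed over inter-event gaps (the paper's concrete encoder may differ), could look like:

```python
import numpy as np

def sinusoidal_time_embedding(values, dim=16, max_scale=10_000.0):
    """Embed scalar time values (absolute timestamps or inter-event
    gaps) into a `dim`-dimensional vector of sin/cos features at
    geometrically spaced frequencies -- the classic Transformer
    positional encoding applied to time rather than position."""
    values = np.asarray(values, dtype=np.float64)[:, None]        # (T, 1)
    freqs = np.exp(-np.log(max_scale) * np.arange(dim // 2) / (dim // 2))
    angles = values * freqs                                       # (T, dim/2)
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

# Hypothetical event timestamps in seconds; note the irregular gaps.
timestamps = np.array([0.0, 3.0, 4.0, 60.0, 61.5])
gaps = np.diff(timestamps, prepend=timestamps[0])  # first gap = 0
emb = sinusoidal_time_embedding(gaps, dim=16)      # one vector per event
```

Because each event's gap is embedded independently, the model sees a burst of transactions seconds apart very differently from the same sequence spread over hours, which is exactly the sensitivity to dynamic temporal patterns claimed above.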
Contribution/Results: Experiments on real-world banking data demonstrate substantial improvements over strong baselines, including logistic regression, XGBoost, and LightGBM, achieving state-of-the-art AUROC and PRAUC scores. These results empirically validate the effectiveness of explicit temporal modeling for identifying sophisticated, time-sensitive fraud patterns.
📝 Abstract
Detecting payment fraud in real-world banking streams requires models that can exploit both the order of events and the irregular time gaps between them. We introduce FraudTransformer, a sequence model that augments a vanilla GPT-style architecture with (i) a dedicated time encoder that embeds either absolute timestamps or inter-event gaps, and (ii) a learned positional encoder that preserves relative order. Experiments on a large industrial dataset (tens of millions of transactions and auxiliary events) show that FraudTransformer surpasses strong classical baselines (Logistic Regression, XGBoost, and LightGBM) as well as transformer ablations that omit either the time or the positional component. On the held-out test set it delivers the highest AUROC and PRAUC.
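As a rough illustration of how the two encoders might feed the GPT backbone, here is a toy sketch in which token, time, and learned positional embeddings are summed per step to form the transformer input. All names, dimensions, and the summation scheme are assumptions for illustration, not the paper's implementation, and the tables are randomly initialised where the real model trains them end-to-end:

```python
import numpy as np

rng = np.random.default_rng(0)

T, vocab, d_model = 5, 100, 32                   # toy sequence / model sizes
token_ids = np.array([7, 42, 7, 13, 99])         # hypothetical event tokens
gaps = np.array([0.0, 3.0, 1.0, 56.0, 1.5])      # irregular inter-event gaps

# Stand-ins for trained parameters: a token table and a learned
# positional table (one row per sequence position).
tok_table = rng.normal(size=(vocab, d_model))
pos_table = rng.normal(size=(T, d_model))

def time_embed(v, dim, max_scale=10_000.0):
    """Sinusoidal embedding of scalar time values, one vector each."""
    v = np.asarray(v, dtype=np.float64)[:, None]
    f = np.exp(-np.log(max_scale) * np.arange(dim // 2) / (dim // 2))
    return np.concatenate([np.sin(v * f), np.cos(v * f)], axis=-1)

# Transformer input: token + time + learned position, summed per step,
# then fed to the GPT blocks (omitted here).
x = tok_table[token_ids] + time_embed(gaps, d_model) + pos_table[np.arange(T)]
```

Summing the three signals keeps the input width at `d_model`, so the time and positional components can be ablated independently simply by zeroing their contribution, mirroring the ablations reported above.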