🤖 AI Summary
Real-Time Auction (RTA) traffic interception must reject low-quality traffic efficiently to protect downstream data quality, yet faces two key challenges: insufficient accuracy in traffic quality assessment and unreliable model confidence estimates. Moreover, conventional uncertainty modeling incurs high latency because it requires repeated inference. This paper proposes a unified framework that integrates multi-objective learning with lightweight uncertainty modeling and, for the first time, brings knowledge distillation into Bayesian uncertainty estimation, enabling accurate prediction and well-calibrated confidence while drastically reducing computational overhead. The approach jointly optimizes CTR, CVR, and traffic quality scoring; models predictive uncertainty via variational inference; and compresses the uncertainty inference pathway through teacher–student distillation. On the JD advertising dataset, the distilled model reduces inference latency by 10× while simultaneously improving AUC and expected calibration error (ECE), achieving co-optimization of efficiency and reliability.
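The latency gap between repeated-inference uncertainty estimation and a distilled single-pass student can be sketched as follows. The toy stochastic model, the sample count, and the student's target values are illustrative assumptions, not the paper's actual architecture:

```python
# Sketch: Bayesian-style uncertainty needs K stochastic forward passes;
# a distilled student emits mean and uncertainty in one deterministic pass.
import random
import statistics

random.seed(0)

def teacher_forward(x):
    """One Monte Carlo sample of the predictive score. Stands in for a
    variational/Bayesian network whose weights are resampled per pass
    (hypothetical toy model)."""
    return 0.6 * x + random.gauss(0.0, 0.05)

def teacher_predict(x, k=30):
    """Uncertainty via repeated inference: K forward passes -> mean, std."""
    samples = [teacher_forward(x) for _ in range(k)]
    return statistics.mean(samples), statistics.stdev(samples)

def student_predict(x):
    """Distilled student: a single pass returns both the predictive mean
    and an uncertainty estimate (here hard-coded to the values the
    student would be trained to match)."""
    return 0.6 * x, 0.05

mu_t, sigma_t = teacher_predict(0.5)   # 30 forward passes
mu_s, sigma_s = student_predict(0.5)   # 1 forward pass
print(f"teacher (30 passes): mean={mu_t:.3f} std={sigma_t:.3f}")
print(f"student (1 pass):    mean={mu_s:.3f} std={sigma_s:.3f}")
```

The K-fold reduction in forward passes is where the reported latency gain comes from: the student amortizes the Monte Carlo sampling into its weights at training time.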
📝 Abstract
Real-Time Auction (RTA) interception aims to filter out invalid or irrelevant traffic to enhance the integrity and reliability of downstream data. However, two key challenges remain: (i) the need to estimate traffic quality accurately while also providing sufficiently high confidence in the model's predictions, typically addressed through uncertainty modeling, and (ii) the efficiency bottleneck that such uncertainty modeling introduces in real-time applications due to repeated inference. To address these challenges, we propose DAUM, a joint modeling framework that integrates multi-objective learning with uncertainty modeling, yielding both traffic quality predictions and reliable confidence estimates. Building on DAUM, we further apply knowledge distillation to reduce the computational overhead of uncertainty modeling, largely preserving predictive accuracy while retaining the benefits of uncertainty estimation. Experiments on the JD advertising dataset demonstrate that DAUM consistently improves predictive performance, with the distilled model delivering a tenfold increase in inference speed.
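A minimal sketch of how a joint objective in this style might combine the three task losses with a distribution-matching distillation term. The binary cross-entropy task losses, the Gaussian KL form, and the loss weights are assumptions for illustration, not the paper's exact formulation:

```python
# Sketch of a multi-objective + distillation loss: CTR/CVR/quality task
# terms plus a KL term pulling the student's one-pass (mean, std) toward
# the teacher's Monte Carlo predictive distribution.
import math

def bce(p, y):
    """Binary cross-entropy for one example (p clipped for stability)."""
    p = min(max(p, 1e-7), 1 - 1e-7)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def gaussian_kl(mu_t, s_t, mu_s, s_s):
    """KL( N(mu_t, s_t^2) || N(mu_s, s_s^2) ): zero when the student's
    predictive distribution exactly matches the teacher's."""
    return math.log(s_s / s_t) + (s_t**2 + (mu_t - mu_s)**2) / (2 * s_s**2) - 0.5

def joint_loss(preds, labels, teacher, student, weights=(1.0, 1.0, 1.0, 1.0)):
    """Toy joint objective: weighted sum of the three task losses and the
    distillation term. Weights are illustrative, not tuned values."""
    w_ctr, w_cvr, w_q, w_kd = weights
    loss = (w_ctr * bce(preds["ctr"], labels["ctr"])
            + w_cvr * bce(preds["cvr"], labels["cvr"])
            + w_q * bce(preds["quality"], labels["quality"]))
    loss += w_kd * gaussian_kl(teacher[0], teacher[1], student[0], student[1])
    return loss

loss = joint_loss(
    preds={"ctr": 0.1, "cvr": 0.05, "quality": 0.8},
    labels={"ctr": 0, "cvr": 0, "quality": 1},
    teacher=(0.30, 0.05),   # teacher's MC mean and std
    student=(0.31, 0.06),   # student's single-pass mean and std
)
print(f"joint loss: {loss:.4f}")
```

Matching the full predictive distribution (mean and spread), rather than only the point estimate, is what lets the distilled student retain calibrated confidence rather than just accuracy.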