Unconstrained Monotonic Calibration of Predictions in Deep Ranking Systems

📅 2025-04-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing calibration methods for deep ranking models rely on predefined transformation functions (e.g., piecewise-linear), limiting flexibility and expressive capacity in calibrating absolute prediction values. Method: We propose an end-to-end calibration framework based on Unconstrained Monotonic Neural Networks (UMNN)—the first application of UMNN to ranking calibration—combined with a novel Smooth Calibration Loss (SCLoss). This approach enforces strict monotonicity while modeling arbitrarily complex monotonic mappings, eliminating restrictive functional-form assumptions. Contribution/Results: Our method simultaneously preserves ranking order and achieves high representational power. Offline experiments demonstrate significant improvements in calibration accuracy. Deployed in Kuaishou’s large-scale video ranking system, it yields statistically significant gains in key metrics, including click-through rate (CTR) and average watch time per user.

📝 Abstract
Ranking models primarily focus on modeling the relative order of predictions while often neglecting the significance of the accuracy of their absolute values. However, accurate absolute values are essential for certain downstream tasks, necessitating the calibration of the original predictions. To address this, existing calibration approaches typically employ predefined transformation functions with order-preserving properties to adjust the original predictions. Unfortunately, these functions often adhere to fixed forms, such as piece-wise linear functions, which exhibit limited expressiveness and flexibility, thereby constraining their effectiveness in complex calibration scenarios. To mitigate this issue, we propose implementing a calibrator using an Unconstrained Monotonic Neural Network (UMNN), which can learn arbitrary monotonic functions with great modeling power. This approach significantly relaxes the constraints on the calibrator, improving its flexibility and expressiveness while avoiding excessively distorting the original predictions by requiring monotonicity. Furthermore, to optimize this highly flexible network for calibration, we introduce a novel additional loss function termed Smooth Calibration Loss (SCLoss), which aims to fulfill a necessary condition for achieving the ideal calibration state. Extensive offline experiments confirm the effectiveness of our method in achieving superior calibration performance. Moreover, deployment in Kuaishou's large-scale online video ranking system demonstrates that the method's calibration improvements translate into enhanced business metrics. The source code is available at https://github.com/baiyimeng/UMC.
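The abstract's key mechanism, the Unconstrained Monotonic Neural Network, represents a strictly increasing calibrator as the integral of a strictly positive network output, f(x) = b + ∫₀ˣ g(t) dt with g(t) > 0. The following NumPy sketch illustrates that idea only; it is not the paper's implementation: the weights here are random rather than learned, and a cumulative Riemann sum on a shared grid stands in for the numerical quadrature used in UMNN.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unconstrained "network": one tanh hidden layer with random weights.
# Its raw output may be any real number; no monotonicity is imposed here.
W1, b1 = rng.normal(size=(1, 16)), rng.normal(size=16)
W2, b2 = rng.normal(size=(16, 1)), rng.normal(size=1)

def raw_net(t):
    h = np.tanh(t @ W1 + b1)
    return h @ W2 + b2

def integrand(t):
    # Softplus keeps the integrand strictly positive: a positive derivative
    # is the only constraint needed for a strictly increasing calibrator.
    return np.log1p(np.exp(raw_net(t)))

def umnn_calibrate(x, n_steps=256, bias=0.0):
    # Approximate f(x) = bias + int_0^x g(t) dt with a cumulative left
    # Riemann sum on a fixed grid over [0, 1], then interpolate. Every
    # increment is a positive g value times a positive step, so the
    # resulting calibrator is strictly increasing by construction.
    grid = np.linspace(0.0, 1.0, n_steps + 1)[:, None]
    g = integrand(grid)[:, 0]
    F = bias + np.concatenate(([0.0], np.cumsum(g[:-1]) / n_steps))
    return np.interp(x, grid[:, 0], F)

scores = np.sort(rng.uniform(0.0, 1.0, size=8))  # raw model predictions
calibrated = umnn_calibrate(scores)
assert np.all(np.diff(calibrated) > 0)           # ranking order preserved
```

Because monotonicity comes from the construction rather than from the network's form, the underlying network can be arbitrarily expressive, which is the property the paper exploits to avoid fixed functional forms such as piecewise-linear maps.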
Problem

Research questions and friction points this paper is trying to address.

Calibrating absolute values in deep ranking systems
Overcoming inflexibility of predefined calibration functions
Enhancing calibration with unconstrained monotonic neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unconstrained Monotonic Neural Network for calibration
Smooth Calibration Loss optimizes flexible networks
Enhances ranking system accuracy and business metrics
Yimeng Bai
University of Science and Technology of China
Recommendation · Generative Recommendation · Large Language Model
Shunyu Zhang
Kuaishou, Beihang University
Yang Zhang
National University of Singapore
Hu Liu
Kuaishou Technology
Wentian Bao
Alibaba Group
Recommender System · Information Retrieval
Enyun Yu
Northeastern University
Fuli Feng
University of Science and Technology of China
Wenwu Ou
Unaffiliated