Algorithms with Calibrated Machine Learning Predictions

📅 2025-02-05
🤖 AI Summary
This work addresses the lack of instance-level uncertainty modeling in machine learning predictions for online algorithm design. Methodologically, it is the first to systematically integrate probabilistic calibration—such as Platt scaling and isotonic regression—as a foundation for uncertainty quantification into classical online problems, including ski-rental and online job scheduling. It introduces a calibration-driven competitive ratio analysis framework that yields prediction-confidence-dependent theoretical guarantees. Theoretically, it establishes a quantitative relationship between calibration quality and competitive ratio performance, proving superiority over conventional uncertainty estimation—particularly under high-variance prediction regimes. Empirically, the proposed algorithms significantly outperform baselines on real-world job scheduling datasets and achieve optimal prediction-dependent performance in the ski-rental problem. Crucially, the theoretical guarantees align closely with empirical results, demonstrating both rigor and practical efficacy.
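The summary names isotonic regression as one of the calibration methods the paper builds on. As a point of reference, isotonic calibration fits a non-decreasing map from raw model scores to empirical probabilities via the Pool Adjacent Violators algorithm. The sketch below is a minimal, generic PAV implementation for illustration only; it is not the paper's code, and the function names are invented for this example.

```python
def isotonic_calibrate(scores, labels):
    """Pool Adjacent Violators: fit a non-decreasing step function
    mapping raw scores to calibrated probabilities.

    scores: raw model outputs; labels: binary outcomes (0/1).
    Returns a list of blocks [label_sum, count, score_lo, score_hi],
    each block's calibrated probability being label_sum / count.
    """
    pairs = sorted(zip(scores, labels))
    blocks = []
    for s, y in pairs:
        blocks.append([float(y), 1, s, s])
        # Merge backwards while monotonicity is violated
        # (previous block's mean exceeds the current block's mean).
        while (len(blocks) > 1 and
               blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]):
            v, c, lo, hi = blocks.pop()
            blocks[-1][0] += v
            blocks[-1][1] += c
            blocks[-1][3] = hi
    return blocks

def calibrated_prob(blocks, score):
    """Evaluate the fitted step function at a new score."""
    prob = blocks[0][0] / blocks[0][1]
    for v, c, lo, _hi in blocks:
        if score >= lo:
            prob = v / c
    return prob
```

On perfectly separated data the map is a clean 0/1 step; where the labels disagree with the score ordering, PAV pools the offending points into one block whose probability is their empirical average.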

📝 Abstract
The field of algorithms with predictions incorporates machine learning advice in the design of online algorithms to improve real-world performance. While this theoretical framework often assumes uniform reliability across all predictions, modern machine learning models can now provide instance-level uncertainty estimates. In this paper, we propose calibration as a principled and practical tool to bridge this gap, demonstrating the benefits of calibrated advice through two case studies: the ski rental and online job scheduling problems. For ski rental, we design an algorithm that achieves optimal prediction-dependent performance and prove that, in high-variance settings, calibrated advice offers more effective guidance than alternative methods for uncertainty quantification. For job scheduling, we demonstrate that using a calibrated predictor leads to significant performance improvements over existing methods. Evaluations on real-world data validate our theoretical findings, highlighting the practical impact of calibration for algorithms with predictions.
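To make the ski-rental case study concrete: renting costs 1 per day, buying costs b, and the season length is unknown; the classic advice-free rule (rent until day b, then buy) is 2-competitive. With a calibrated probability that the season is long, the buy-day threshold can be shifted by prediction confidence. The rule below is an illustrative interpolation under assumed notation, not the paper's optimal algorithm.

```python
def buy_day(buy_cost: int, p_long: float) -> int:
    """Pick the day on which to buy skis (renting costs 1/day).

    p_long is an assumed calibrated probability that the season
    lasts at least `buy_cost` days. Illustrative rule: a confident
    "long season" prediction pulls the buy day toward day 1; an
    uncertain one falls back to the classic break-even day b; a
    confident "short season" prediction delays buying past b.
    """
    if p_long >= 0.5:
        return max(1, round(buy_cost * 2 * (1.0 - p_long)))
    return round(buy_cost * (1.0 + (0.5 - p_long)))

def total_cost(buy_cost: int, threshold: int, season_days: int) -> int:
    """Cost paid if we rent until `threshold`, then buy on that day."""
    if season_days < threshold:
        return season_days                  # rented every day, never bought
    return (threshold - 1) + buy_cost       # rented, then bought
```

At p_long = 0.5 the rule recovers the classic threshold b, and at p_long = 1 it buys immediately, which is the sense in which confidence-dependent guarantees interpolate between worst-case and prediction-trusting behavior.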
Problem

Research questions and friction points this paper is trying to address.

How to incorporate machine learning advice into the design of online algorithms
How to bridge the gap between uniform-reliability assumptions and instance-level uncertainty, using calibration
How to improve performance on the ski rental and online job scheduling problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Calibrated machine learning predictions
Instance-level uncertainty estimates
Optimal prediction-dependent performance