Isotonic Mechanism for Exponential Family Estimation in Machine Learning Peer Review

📅 2023-04-21
📈 Citations: 3
Influential: 0
🤖 AI Summary
This paper addresses the underuse of author-provided rankings of their own multiple submissions in machine learning conference reviewing. It extends the Isotonic Mechanism to exponential-family distributions, producing adjusted scores that closely track the original review scores while satisfying the author-specified ranking, and implementing the mechanism requires no knowledge of the specific distribution form. Theoretical contributions include: (1) an extension of the Isotonic Mechanism to a broad class of exponential family distributions; (2) a proof that an author is incentivized to report her ranking truthfully when her utility is a convex additive function of the adjusted scores; and (3) for a subclass of exponential family distributions, a proof that truthful reporting holds only when the elicited information consists of pairwise comparisons between her submissions, indicating the optimality of rankings for truthful elicitation. The adjusted scores are shown to be nearly minimax optimal when the ground-truth scores have bounded total variation, and experiments on ICML 2023 ranking data show substantial gains in approximating a proxy ground-truth quality of the papers.
📝 Abstract
In 2023, the International Conference on Machine Learning (ICML) required authors with multiple submissions to rank their submissions based on perceived quality. In this paper, we aim to employ these author-specified rankings to enhance peer review in machine learning and artificial intelligence conferences by extending the Isotonic Mechanism to exponential family distributions. This mechanism generates adjusted scores that closely align with the original scores while adhering to author-specified rankings. Despite its applicability to a broad spectrum of exponential family distributions, implementing this mechanism does not require knowledge of the specific distribution form. We demonstrate that an author is incentivized to provide accurate rankings when her utility takes the form of a convex additive function of the adjusted review scores. For a certain subclass of exponential family distributions, we prove that the author reports truthfully only if the question involves only pairwise comparisons between her submissions, thus indicating the optimality of ranking in truthful information elicitation. Moreover, we show that the adjusted scores dramatically improve the estimation accuracy compared to the original scores and achieve near-minimax optimality when the ground-truth scores have bounded total variation. We conclude with a numerical analysis of the ICML 2023 ranking data, showing substantial estimation gains in approximating a proxy ground-truth quality of the papers using the Isotonic Mechanism.
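The adjustment step the abstract describes, scores that stay close to the originals while respecting the author's ranking, is the classical isotonic regression problem, solvable by the pool-adjacent-violators algorithm (PAVA). Below is a minimal sketch of that core computation, not the paper's full exponential-family mechanism; the function name and the least-squares objective are illustrative assumptions.

```python
def isotonic_adjust(scores):
    """Least-squares projection of `scores` (ordered best-first by the
    author's ranking) onto non-increasing sequences, via PAVA."""
    # Each block holds [sum, count]; adjacent blocks that violate the
    # non-increasing constraint are merged and replaced by their mean.
    blocks = []
    for s in scores:
        blocks.append([s, 1])
        # Merge while a later block's mean exceeds the previous block's mean
        while len(blocks) > 1 and (
            blocks[-2][0] / blocks[-2][1] < blocks[-1][0] / blocks[-1][1]
        ):
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    adjusted = []
    for total, count in blocks:
        adjusted.extend([total / count] * count)
    return adjusted
```

For example, raw scores `[5, 7, 4]` under the ranking "paper 1 is best" become `[6.0, 6.0, 4.0]`: the violating pair is pooled to its mean, which keeps the adjusted scores as close as possible (in squared error) to the originals while enforcing the order. scikit-learn's `IsotonicRegression` implements the same projection for the non-decreasing case.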
Problem

Research questions and friction points this paper is trying to address.

Extend Isotonic Mechanism to exponential family distributions
Enhance peer review with author-specified rankings
Improve estimation accuracy of paper quality
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends the Isotonic Mechanism to exponential family distributions
Generates adjusted scores consistent with author-specified rankings
Achieves near-minimax optimal estimation accuracy