Metric Distortion of Small-group Deliberation

📅 2025-02-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper investigates distortion in social choice—the multiplicative worst-case ratio between the social cost of the selected alternative and that of the optimal one—when voters deliberate in small groups of at most k agents embedded in a metric space, reaching consensus through local discussion before a social choice rule aggregates the group outcomes. Method: the authors work within the metric distortion framework, combining a small-deviations analysis of sums of i.i.d. random variables with non-convex global optimization to characterize extremal behavior. Contribution/Results: three agents suffice to beat the deterministic distortion lower bound of 3, and four agents beat the randomized lower bound of 2.11. The paper also establishes the first asymptotically tight group-size bound independent of the number of alternatives: achieving distortion 1+ε requires only O(1/ε) agents. The analysis identifies k=3 and k=4 as critical thresholds, yielding tight distortion bounds and scalable design principles for small-group deliberation mechanisms.
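For reference, the distortion of a social choice rule $f$ in this framework is standardly defined as the worst case, over preference profiles $\sigma$ and over metrics $d$ consistent with $\sigma$, of the social-cost ratio (the notation below is the standard one from the metric distortion literature, not taken verbatim from the paper):

$$
\mathrm{dist}(f) \;=\; \sup_{\sigma}\; \sup_{d \,\triangleright\, \sigma}\; \frac{\sum_{v} d\big(v, f(\sigma)\big)}{\min_{a} \sum_{v} d(v, a)},
$$

where $v$ ranges over voters, $a$ over alternatives, and $d \triangleright \sigma$ means that each voter's ranking in $\sigma$ orders alternatives by increasing distance under $d$.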

📝 Abstract
We consider models for social choice where voters rank a set of choices (or alternatives) by deliberating in small groups of size at most $k$, and these outcomes are aggregated by a social choice rule to find the winning alternative. We ground these models in the metric distortion framework, where the voters and alternatives are embedded in a latent metric space, with closer alternatives being more desirable for a voter. We posit that the outcome of a small-group interaction optimally uses the voters' collective knowledge of the metric, either deterministically or probabilistically. We characterize the distortion of our deliberation models for small $k$, showing that groups of size $k=3$ suffice to drive the distortion bound below the deterministic metric distortion lower bound of $3$, and groups of size $4$ suffice to break the randomized lower bound of $2.11$. We also show nearly tight asymptotic distortion bounds in the group size, showing that for any constant $\epsilon>0$, achieving a distortion of $1+\epsilon$ needs a group size that depends only on $1/\epsilon$, and not on the number of alternatives. We obtain these results by formulating a basic optimization problem in small deviations of the sum of i.i.d. random variables, which we solve to global optimality via non-convex optimization. The resulting bounds may be of independent interest in probability theory.
Problem

Research questions and friction points this paper is trying to address.

Decision Bias
Team Size
Optimization of Discussion Rules
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decision Bias Reduction
Small Group Discussion Model
Metric Distortion Framework