Optimal estimation for regression discontinuity design with binary outcomes

📅 2025-09-23
🤖 AI Summary
This paper addresses the estimation of treatment effects in regression discontinuity designs (RDD) with bounded, in particular binary, outcomes in finite samples. The authors propose an estimator that attains the exact finite-sample minimax mean squared error among linear shrinkage estimators with nonnegative weights, assuming the conditional mean function belongs to a Lipschitz class. Although the underlying minimax problem is an iterated (n+1)-dimensional non-convex optimization, where n is the sample size, the estimator is obtained by solving a convex optimization problem; the only tuning parameter is the Lipschitz constant, and no large-sample approximation is required. The paper also develops a uniformly valid inference procedure. In small-sample simulations, the estimator exhibits smaller mean squared errors and shorter confidence intervals than conventional large-sample approaches, and in an empirical multi-cutoff application with few observations per cutoff, it yields informative confidence intervals where the leading large-sample method does not.
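The estimator class described in the summary can be written schematically as follows (notation ours, not necessarily the paper's: c is the cutoff, Lip(L) the class of L-Lipschitz conditional mean functions taking values in the outcome's bounded range):

```latex
% Linear shrinkage estimator with nonnegative weights (schematic):
\hat{\tau}(w) \;=\; \sum_{i:\, x_i \ge c} w_i Y_i \;-\; \sum_{i:\, x_i < c} w_i Y_i,
\qquad w_i \ge 0 .

% Weights chosen to minimize the worst-case MSE over the Lipschitz class:
w^\star \;\in\; \operatorname*{arg\,min}_{w \ge 0}\;
\sup_{\mu \in \mathrm{Lip}(L)} \mathbb{E}\!\left[\bigl(\hat{\tau}(w) - \tau\bigr)^2\right].
```

Because the weights may shrink toward zero rather than sum to one, the worst-case bias depends on the outcome's bound, which is why boundedness (e.g., binary outcomes) is central to the finite-sample analysis.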

📝 Abstract
We develop a finite-sample optimal estimator for regression discontinuity designs when the outcomes are bounded, with binary outcomes as the leading case. Our estimator achieves the exact minimax mean squared error among linear shrinkage estimators with nonnegative weights when the regression function of the bounded outcome lies in a Lipschitz class. Although the original minimax problem is an iterated (n+1)-dimensional non-convex optimization problem, where n is the sample size, we show that our estimator is obtained by solving a convex optimization problem. A key advantage of our estimator is that the Lipschitz constant is the only tuning parameter. We also propose a uniformly valid inference procedure that does not rely on a large-sample approximation. In a simulation exercise with small samples, our estimator exhibits smaller mean squared errors and shorter confidence intervals than conventional large-sample techniques, which may be unreliable when the effective sample size is small. We apply our method to an empirical multi-cutoff design in which the sample size at each cutoff is small; there, our method yields informative confidence intervals, in contrast to the leading large-sample approach.
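To illustrate the kind of minimax weighting the abstract describes, the sketch below is a deliberate simplification, not the paper's method: it constrains the weights to sum to one on each side of the cutoff (the paper's shrinkage class is more general), uses the variance bound sigma² ≤ 1/4 valid for outcomes in [0, 1], and calls a generic solver in place of the paper's convex program. The worst-case bias bound L·Σ wᵢ|xᵢ| over an L-Lipschitz class is standard for nonnegative weights of this form.

```python
import numpy as np
from scipy.optimize import minimize

def minimax_linear_weights(x, side, L=1.0, sigma2=0.25):
    """Illustrative sketch: nonnegative linear weights for an RDD contrast
    at cutoff 0 minimizing a worst-case MSE bound under an L-Lipschitz
    conditional mean and outcome variance at most sigma2.

    x    : running variable, cutoff normalized to 0
    side : +1 for treated observations (x >= 0), -1 for control (x < 0)
    """
    n = len(x)

    def worst_case_mse(w):
        # Worst-case |bias| over the Lipschitz class grows with how far
        # each weighted point sits from the cutoff; the variance term is
        # sigma2 * ||w||^2 for independent bounded outcomes.
        bias = L * np.sum(w * np.abs(x))
        var = sigma2 * np.sum(w ** 2)
        return bias ** 2 + var

    cons = [
        # Simplifying assumption: weights sum to one on each side.
        {"type": "eq", "fun": lambda w: np.sum(w[side > 0]) - 1.0},
        {"type": "eq", "fun": lambda w: np.sum(w[side < 0]) - 1.0},
    ]
    bounds = [(0.0, None)] * n  # nonnegative weights
    # Feasible start: uniform weights within each side.
    w0 = np.where(side > 0, 1.0 / np.sum(side > 0), 1.0 / np.sum(side < 0))
    res = minimize(worst_case_mse, w0, bounds=bounds,
                   constraints=cons, method="SLSQP")
    return res.x

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 40))
side = np.where(x >= 0, 1, -1)
w = minimax_linear_weights(x, side, L=0.5)
# Estimated contrast: tau_hat = sum_{treated} w_i Y_i - sum_{control} w_i Y_i
```

The optimized weights concentrate mass near the cutoff, trading a smaller worst-case bias against a larger variance, which is the same bias–variance trade-off the paper's convex program resolves exactly in finite samples.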
Problem

Research questions and friction points this paper is trying to address.

Develops optimal estimator for regression discontinuity with binary outcomes
Attains the minimax mean squared error for bounded outcomes in small samples
Provides valid inference without relying on large-sample approximations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Finite-sample optimal estimator for bounded outcomes
Convex optimization replaces non-convex minimax problem
Single Lipschitz constant tuning parameter for inference