Kernel Treatment Effects with Adaptively Collected Data

📅 2025-10-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Under adaptive data collection, the i.i.d. assumption fails and conventional causal inference methods struggle to characterize counterfactual outcome distributions. This paper introduces the first extension of Kernel Treatment Effect (KTE) estimation to adaptive experimental settings, proposing a distributional causal inference framework grounded in Reproducing Kernel Hilbert Spaces (RKHS). Methodologically, it combines doubly robust scoring, variance stabilization, and a Hilbert-space martingale central limit theorem to construct an asymptotically normal kernel-distance test statistic that is sensitive to both mean shifts and higher-order moment discrepancies. The approach comes with theoretical guarantees, including valid Type-I error control. Empirically, it achieves superior statistical power and robustness over adaptive baselines that estimate only scalar treatment effects.
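
To make the kernel-distance statistic concrete, here is a minimal Python sketch (an illustration, not the authors' implementation) of the simplest inverse-propensity-weighted estimate of the squared kernel distance (MMD²) between the two counterfactual outcome embeddings. It assumes scalar outcomes, a Gaussian kernel, and that the adaptive assignment probabilities were logged during the experiment; all function names are hypothetical.

```python
import numpy as np

def gaussian_kernel(y, lengthscale=1.0):
    """Gram matrix K[i, j] = exp(-(y_i - y_j)^2 / (2 * lengthscale^2))."""
    d = y[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def kte_statistic_ipw(y, a, prop, lengthscale=1.0):
    """IPW estimate of the squared kernel distance (MMD^2) between the
    counterfactual outcome embeddings, using the logged adaptive
    assignment probabilities `prop` (P(A_i = 1 | history))."""
    n = len(y)
    # Signed weights: +1/e_i for treated, -1/(1 - e_i) for control, so the
    # weighted sum of kernel features estimates mu_1 - mu_0 in the RKHS.
    w = np.where(a == 1, 1.0 / prop, -1.0 / (1.0 - prop))
    K = gaussian_kernel(y, lengthscale)
    # Kernel trick: ||(1/n) sum_i w_i k(Y_i, .)||_H^2 = (w^T K w) / n^2.
    return float(w @ K @ w) / n**2
```

The kernel trick keeps everything finite-dimensional: the RKHS norm of the weighted mean embedding reduces to a quadratic form in the Gram matrix. The paper's estimator additionally incorporates doubly robust scores and variance stabilization, sketched after the abstract below.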

📝 Abstract
Adaptive experiments improve efficiency by adjusting treatment assignments based on past outcomes, but this adaptivity breaks the i.i.d. assumption that underpins classical asymptotics. At the same time, many questions of interest are distributional, extending beyond average effects. Kernel treatment effects (KTE) provide a flexible framework by representing counterfactual outcome distributions in an RKHS and comparing them via kernel distances. We present the first kernel-based framework for distributional inference under adaptive data collection. Our method combines doubly robust scores with variance stabilization to ensure asymptotic normality via a Hilbert-space martingale CLT, and introduces a sample-fitted stabilized test with valid type-I error control. Experiments show it is well calibrated and effective for both mean shifts and higher-moment differences, outperforming adaptive baselines limited to scalar effects.
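
As a rough sketch of how doubly robust scores can be formed in the RKHS, the following extends the IPW statistic above with an outcome-model term: conditional mean embeddings are fit per arm by kernel ridge regression, and each score is represented by its coefficients over the basis {k(Y_j, ·)}, so all inner products reduce to the outcome Gram matrix. The kernel ridge estimator, hyperparameters, and names are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def rbf(u, v, ls=1.0):
    """RBF Gram matrix between 1-D arrays u and v."""
    return np.exp(-0.5 * ((u[:, None] - v[None, :]) / ls) ** 2)

def dr_kte_statistic(x, a, y, prop, ls_x=1.0, ls_y=1.0, lam=1e-2):
    """Doubly robust estimate of MMD^2 between counterfactual embeddings.

    Conditional mean embeddings m_a(x, .) are fit per arm by kernel ridge
    regression; every RKHS element is stored as a coefficient vector over
    the basis {k(Y_j, .)}_j, so inner products reduce to the Gram matrix.
    """
    n = len(y)
    Ky = rbf(y, y, ls_y)
    C = np.zeros((n, n))  # row i = basis coefficients of the DR score psi_i
    for arm, sign in ((1, 1.0), (0, -1.0)):
        idx = np.where(a == arm)[0]
        m = len(idx)
        Kxx = rbf(x[idx], x[idx], ls_x)
        # alpha[i, j]: weight of basis point idx[j] in m_arm(X_i, .)
        alpha = rbf(x, x[idx], ls_x) @ np.linalg.inv(Kxx + lam * m * np.eye(m))
        M = np.zeros((n, n))
        M[:, idx] = alpha
        C += sign * M                        # outcome-model term
        e = prop if arm == 1 else 1.0 - prop
        for i in idx:                        # IPW residual correction
            C[i] -= (sign / e[i]) * M[i]
            C[i, i] += sign / e[i]
    cbar = C.mean(axis=0)                    # mean DR score in coefficient space
    return float(cbar @ Ky @ cbar)           # ||psi_bar||_H^2 via kernel trick
```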
Problem

Research questions and friction points this paper is trying to address.

Addresses distributional inference challenges in adaptive experiments
Develops kernel-based framework for counterfactual outcome comparisons
Ensures valid statistical inference despite non-i.i.d. adaptive data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses kernel distances in RKHS for distributional effect comparisons
Combines doubly robust scores with variance stabilization
Employs a Hilbert-space martingale CLT for asymptotic normality (see the simplified sketch below)
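
To illustrate the variance-stabilization idea in the simplest possible form, the sketch below studentizes each round's score by a standard-deviation estimate built only from strictly earlier rounds (a predictable quantity), so under a mean-zero null the normalized sum is a martingale to which a CLT applies. This is a scalar caricature of the paper's Hilbert-space construction; the scores, the plug-in variance estimate, and the two-sided rejection rule are all assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

def stabilized_test(scores, alpha=0.05, eps=1e-8):
    """Scalar variance-stabilized martingale test (illustrative).

    Each score is studentized by a standard deviation estimated from
    strictly earlier rounds, so under a mean-zero null the normalized
    sum is asymptotically standard normal by a martingale CLT.
    """
    s = np.asarray(scores, dtype=float)
    n = len(s)
    z = np.zeros(n)
    for t in range(n):
        sigma = s[:t].std() if t > 1 else 1.0  # predictable variance proxy
        z[t] = s[t] / max(sigma, eps)
    stat = z.sum() / np.sqrt(n)                # approx N(0, 1) under the null
    pval = 2 * norm.sf(abs(stat))
    return stat, pval, pval < alpha
```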