🤖 AI Summary
This paper addresses the problem of testing the equality of conditional distributions under covariate shift. To tackle this challenge, we propose a general framework that leverages generative modeling and sample splitting to reduce the conditional distribution test to an unconditional one. Building on this framework, we introduce two novel nonparametric tests: a generative permutation test and a generative classification accuracy test. We establish a minimax lower bound for this testing problem under nonparametric smoothness conditions; the permutation test and a modified variant attain this bound exactly or up to an iterated logarithmic factor, while the classification accuracy test is provably consistent. Our theoretical analysis integrates offset Rademacher complexity, neural network approximation theory, and convergence rates of conditional generative models. Extensive experiments on synthetic and real-world datasets demonstrate the effectiveness of our methods and their substantial improvement over existing baselines.
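The reduction from a conditional to an unconditional test can be sketched roughly as follows. The linear-Gaussian `fit_gaussian_generator` and the energy-distance statistic below are illustrative stand-ins of our own choosing, not the paper's neural generator or its actual test statistic:

```python
import numpy as np

def energy_stat(a, b):
    # Energy distance between two samples (rows are observations).
    def pdist(u, v):
        return np.sqrt(((u[:, None, :] - v[None, :, :]) ** 2).sum(-1))
    return 2 * pdist(a, b).mean() - pdist(a, a).mean() - pdist(b, b).mean()

def fit_gaussian_generator(x, y):
    # Illustrative stand-in for a neural conditional generator:
    # fits y | x ~ N(a*x + b, s^2) by least squares.
    a, b = np.polyfit(x, y, 1)
    s = np.std(y - (a * x + b))
    return lambda xn, rng: a * xn + b + s * rng.standard_normal(len(xn))

def generative_permutation_test(x1, y1, x2, y2, fit_generator,
                                n_perm=200, seed=0):
    # H0: the conditional law of Y given X is the same in both samples.
    # Learn Y | X from sample 1, then compare (X2, Y2) against
    # (X2, Ghat(X2)) with an ordinary unconditional permutation test.
    rng = np.random.default_rng(seed)
    g = fit_generator(x1, y1)                    # learned conditional sampler
    real = np.column_stack([x2, y2])
    fake = np.column_stack([x2, g(x2, rng)])     # distributed as `real` under H0
    t_obs = energy_stat(real, fake)
    pooled = np.vstack([real, fake])
    n = len(real)
    count = sum(
        energy_stat(pooled[p[:n]], pooled[p[n:]]) >= t_obs
        for p in (rng.permutation(len(pooled)) for _ in range(n_perm))
    )
    return (count + 1) / (n_perm + 1)            # permutation p-value
```

Sample splitting appears here in that the generator is fit on the first sample only, so the comparison on the second sample is not contaminated by the fitting step.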
📝 Abstract
In this paper, we propose a general framework for testing the equality of conditional distributions in a two-sample problem. This problem arises naturally in transfer learning under covariate shift. Our framework is built on neural network-based generative methods and sample splitting techniques, which transform the conditional distribution testing problem into an unconditional one. We introduce two specific tests: the generative permutation-based conditional distribution equality test and the generative classification accuracy-based conditional distribution equality test. Theoretically, we establish a minimax lower bound for testing the equality of two conditional distributions under certain smoothness conditions. We demonstrate that the generative permutation-based test and its modified version attain this lower bound exactly or up to an iterated logarithmic factor. Moreover, we prove the testing consistency of the generative classification accuracy-based test. We also establish the convergence rate of the learned conditional generator by deriving new results on the recently developed offset Rademacher complexity and on approximation properties of neural networks. Empirically, we conduct numerical studies on synthetic datasets and two real-world datasets, demonstrating the effectiveness of our approach.
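As a rough illustration of the classification-accuracy idea (applied to the unconditional problem obtained after the generative reduction): a classifier is trained to distinguish the two samples, and under the null its holdout accuracy should be close to 1/2. The hand-rolled logistic classifier and the binomial normal approximation below are our simplifying assumptions, not the paper's construction:

```python
import math
import numpy as np

def classifier_two_sample_test(real, fake, iters=500, lr=0.1, seed=0):
    # Classification-accuracy two-sample test (sketch): train a logistic
    # classifier to separate the two samples; under H0 the holdout
    # accuracy has mean 1/2 and sd 1/(2*sqrt(m)), so
    # z = (acc - 1/2) * 2 * sqrt(m) is approximately N(0, 1).
    rng = np.random.default_rng(seed)
    X = np.vstack([real, fake])
    y = np.r_[np.zeros(len(real)), np.ones(len(fake))]
    idx = rng.permutation(len(X))
    X, y = X[idx], y[idx]
    n_tr = len(X) // 2                           # sample splitting: train / holdout
    Xtr, ytr, Xte, yte = X[:n_tr], y[:n_tr], X[n_tr:], y[n_tr:]
    mu, sd = Xtr.mean(0), Xtr.std(0) + 1e-8      # standardize with train stats
    Xtr, Xte = (Xtr - mu) / sd, (Xte - mu) / sd
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(iters):                       # plain gradient descent
        prob = 1.0 / (1.0 + np.exp(-(Xtr @ w + b)))
        w -= lr * Xtr.T @ (prob - ytr) / n_tr
        b -= lr * (prob - ytr).mean()
    acc = ((Xte @ w + b > 0) == (yte == 1)).mean()
    z = (acc - 0.5) * 2.0 * math.sqrt(len(Xte))
    return 0.5 * math.erfc(z / math.sqrt(2.0))   # one-sided normal p-value
```

Any sufficiently expressive classifier could replace the logistic model here; the statistic only depends on the holdout accuracy.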