On dimension reduction in conditional dependence models

📅 2025-05-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Estimating the conditional central subspace (CCS) of a response vector $mathbf{Y}$ given a high-dimensional covariate vector $mathbf{X}$ is challenging due to complex conditional dependence structures. Method: We propose a decomposable theoretical framework that orthogonally decomposes the CCS into the marginal central subspace and a newly introduced Copula central subspace, respectively capturing marginal effects and pure dependence structure. To avoid restrictive assumptions—particularly the linear conditional mean assumption inherent in classical sufficient dimension reduction methods (e.g., SIR)—we formally define the Copula central subspace and develop a joint estimation procedure based on adaptive nonparametric kernel estimation. Contribution/Results: We establish consistency and optimal convergence rates for the proposed estimator under mild regularity conditions. Simulation studies demonstrate substantial improvements over existing methods, especially in high-dimensional sparse settings. This work provides both conceptual advancement—via the Copula central subspace—and practical utility through a theoretically grounded, assumption-lean estimation strategy.

Technology Category

Application Category

📝 Abstract
Inference of the conditional dependence structure is challenging when many covariates are present. In numerous applications, only a low-dimensional projection of the covariates influences the conditional distribution. The smallest subspace that captures this effect is called the central subspace in the literature. We show that inference of the central subspace of a vector random variable $mathbf Y$ conditioned on a vector of covariates $mathbf X$ can be separated into inference of the marginal central subspaces of the components of $mathbf Y$ conditioned on $mathbf X$ and on the copula central subspace, that we define in this paper. Further discussion addresses sufficient dimension reduction subspaces for conditional association measures. An adaptive nonparametric method is introduced for estimating the central dependence subspaces, achieving parametric convergence rates under mild conditions. Simulation studies illustrate the practical performance of the proposed approach.
Problem

Research questions and friction points this paper is trying to address.

Reducing dimensionality in conditional dependence models
Identifying central subspace for conditional distribution influence
Estimating central dependence subspaces with nonparametric methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Separates central subspace inference into marginal and copula components
Introduces adaptive nonparametric method for subspace estimation
Achieves parametric convergence rates under mild conditions
🔎 Similar Papers
No similar papers found.