🤖 AI Summary
This paper addresses regression modeling with high-dimensional continuous functional tensor covariates. We propose the first functional tensor regression model that jointly incorporates tensor low-rank structure and functional smoothness constraints. Methodologically, we impose a low-Tucker-rank decomposition coupled with spline-based smoothness regularization on the functional mode, and develop a theoretically grounded Riemannian Gauss–Newton algorithm on the corresponding manifold. Theoretical contributions include a dimension-dependent estimation error bound and a rigorous proof of the algorithm's quadratic convergence. Empirical evaluations demonstrate that our method significantly outperforms existing functional and tensor regression approaches, in both simulation studies and real-world neuroimaging analyses, achieving superior statistical accuracy, interpretability, and computational stability.
📝 Abstract
Tensor regression has attracted significant attention in statistical research. This study tackles the challenge of handling covariates with smoothly varying structure. We introduce a novel framework, termed functional tensor regression, which accounts for both the tensor and functional aspects of the covariate. To address the high dimensionality and functional continuity of the regression coefficient, we employ a low Tucker rank decomposition along with smoothness regularization for the functional mode. We develop a functional Riemannian Gauss–Newton algorithm with a provable quadratic convergence rate, and establish an estimation error bound that depends on the tensor covariate dimension. Simulations and a neuroimaging analysis illustrate the finite-sample performance of the proposed method.
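To make the low-Tucker-rank structure concrete, here is a minimal NumPy sketch of a scalar-on-tensor regression whose coefficient tensor is assembled from a Tucker core and mode-wise factor matrices. All dimensions, ranks, and variable names are illustrative assumptions, not the paper's code; the third mode stands in for the functional (e.g. spline-coefficient) mode.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative covariate dimensions and Tucker ranks (assumed, not from the paper).
p1, p2, p3 = 10, 10, 12   # third mode plays the functional role
r1, r2, r3 = 2, 2, 3

# Low-Tucker-rank coefficient: core tensor G and factor matrices U1, U2, U3.
G = rng.standard_normal((r1, r2, r3))
U1 = rng.standard_normal((p1, r1))
U2 = rng.standard_normal((p2, r2))
U3 = rng.standard_normal((p3, r3))  # functional mode: columns could hold spline coefficients

# Assemble B = G x_1 U1 x_2 U2 x_3 U3 via the three mode products.
B = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

# Scalar-on-tensor regression: response is the inner product <X, B> plus noise.
X = rng.standard_normal((p1, p2, p3))
y = np.tensordot(X, B, axes=3) + 0.1 * rng.standard_normal()

print(B.shape)  # coefficient tensor is full-sized but has low Tucker rank
```

The payoff of the decomposition is parsimony: `B` has `p1*p2*p3 = 1200` entries, but only `r1*r2*r3 + p1*r1 + p2*r2 + p3*r3 = 88` free parameters, which is what makes estimation tractable in high dimensions.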