Higher Order Reduced Rank Regression

📅 2025-03-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional reduced-rank regression (RRR) for multi-response regression is limited by its linear assumption, failing to capture complex nonlinear feature–response relationships. To address this, we propose a higher-order nonlinear RRR framework. Our method introduces multilinear transformations and tensor-valued coefficient modeling into RRR, enforcing higher-order rank constraints via Tucker decomposition and developing an efficient optimization algorithm on the Riemannian manifold of low-multilinear-rank tensors. We establish theoretical convergence guarantees for the proposed Riemannian optimization scheme. Empirical evaluation on multiple benchmark datasets demonstrates that our approach significantly outperforms both linear RRR and state-of-the-art nonlinear baselines, achieving consistent improvements in predictive accuracy and generalization performance.

📝 Abstract
Reduced Rank Regression (RRR) is a widely used method for multi-response regression. However, RRR assumes a linear relationship between features and responses. While linear models are useful and often provide a good approximation, many real-world problems involve more complex relationships that cannot be adequately captured by simple linear interactions. One way to model such relationships is via multilinear transformations. This paper introduces Higher Order Reduced Rank Regression (HORRR), an extension of RRR that leverages multilinear transformations and is thus capable of capturing nonlinear interactions in multi-response regression. HORRR employs tensor representations for the coefficients and a Tucker decomposition to impose multilinear rank constraints as regularization, akin to the rank constraints in RRR. Encoding these constraints as a manifold allows us to use Riemannian optimization to solve HORRR problems. We theoretically and empirically analyze the use of Riemannian optimization for solving HORRR problems.
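To make the linear baseline concrete, here is a minimal sketch (not the paper's code) of classical reduced-rank regression with numpy on synthetic data: the rank-`r` coefficient matrix is obtained by projecting the ordinary least-squares solution onto the top-`r` right singular directions of the fitted values. All variable names and dimensions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 200, 10, 6, 2          # samples, features, responses, target rank

# Synthetic data with a ground-truth rank-r coefficient matrix B = U V^T
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
X = rng.standard_normal((n, p))
Y = X @ B_true + 0.1 * rng.standard_normal((n, q))

# Step 1: unconstrained OLS solution
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Step 2: project onto the top-r right singular vectors of the fitted values,
# which yields the rank-constrained RRR estimator
_, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
P = Vt[:r].T @ Vt[:r]               # projector onto the leading response subspace
B_rrr = B_ols @ P                   # coefficient matrix with rank <= r

assert np.linalg.matrix_rank(B_rrr) <= r
```

HORRR replaces the matrix `B` with a coefficient tensor and the matrix rank constraint with a multilinear (Tucker) rank constraint.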
Problem

Research questions and friction points this paper is trying to address.

Extends Reduced Rank Regression to capture nonlinear interactions.
Introduces Higher Order Reduced Rank Regression using tensor representations.
Employs Riemannian optimization to handle multilinear rank constraints.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends RRR with multilinear transformations
Uses tensor representations and Tucker decomposition
Employs Riemannian optimization for solving HORRR
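The Tucker decomposition that HORRR uses to constrain its coefficient tensor can be illustrated with a truncated higher-order SVD (HOSVD). The sketch below is a generic HOSVD in numpy, not the paper's algorithm; the tensor shape and target multilinear ranks are assumptions for illustration.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Mode-n product: multiply matrix M along axis `mode` of tensor T."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: Tucker core and factor matrices from each unfolding's SVD."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):      # core G = T x1 U1^T x2 U2^T x3 U3^T
        core = mode_multiply(core, U.T, m)
    return core, factors

rng = np.random.default_rng(1)
T = rng.standard_normal((5, 6, 7))       # stand-in for a coefficient tensor
G, Us = hosvd(T, (2, 3, 4))              # multilinear ranks (2, 3, 4)

# Reconstruct the low-multilinear-rank approximation T_hat = G x1 U1 x2 U2 x3 U3
T_hat = G
for m, U in enumerate(Us):
    T_hat = mode_multiply(T_hat, U, m)
```

Tensors with fixed multilinear rank of this form make up the Riemannian manifold on which the paper's optimization scheme runs.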