Diagonal Over-parameterization in Reproducing Kernel Hilbert Spaces as an Adaptive Feature Model: Generalization and Adaptivity

📅 2025-01-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional kernel methods in reproducing kernel Hilbert spaces (RKHS) have limited adaptability to the structure of the target function because the kernel is fixed. Method: we propose a diagonal adaptive kernel model that jointly learns the diagonal eigenvalues of the kernel operator and the output coefficients during training. This is the first incorporation of diagonal over-parameterization into RKHS, revealing eigenvalue dynamics as an implicit feature-learning mechanism; we further establish theoretically that added depth enhances generalization and adaptivity via implicit regularization. Results: experiments demonstrate substantial improvements in generalization, especially under kernel-target mismatch, and we prove that this adaptivity arises from end-to-end optimization of the eigenvalues, offering a new perspective on how deep kernel methods and neural networks surpass classical kernel-based approaches in structural adaptivity.
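In symbols (our own notation, assumed rather than taken from the paper), the model the summary describes can be sketched as follows:

```latex
% A fixed kernel has a Mercer expansion
%   K(x, x') = \sum_k \mu_k \, \phi_k(x) \, \phi_k(x'),
% and fixed-kernel regression keeps the eigenvalues \mu_k frozen.
% The diagonal adaptive model replaces them with trainable scales
% \lambda_k and fits both \lambda and \beta end to end:
\[
  f_{\lambda,\beta}(x) = \sum_{k} \lambda_k \, \beta_k \, \phi_k(x),
  \qquad
  \min_{\lambda,\,\beta}\; \frac{1}{n}\sum_{i=1}^{n}
  \bigl( f_{\lambda,\beta}(x_i) - y_i \bigr)^2 .
\]
% The deeper variant factors each scale through L diagonal layers,
\[
  \lambda_k = \prod_{l=1}^{L} a_k^{(l)},
\]
% and gradient descent on this product parameterization is what induces
% the implicit regularization that the summary credits added depth with.
```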

📝 Abstract
This paper introduces a diagonal adaptive kernel model that learns kernel eigenvalues and output coefficients simultaneously during training. Unlike fixed-kernel methods tied to neural tangent kernel (NTK) theory, the diagonal adaptive kernel model adapts to the structure of the target function, significantly improving generalization over fixed-kernel methods, especially when the initial kernel is misaligned with the target. Moreover, we show that this adaptivity comes from learning the right eigenvalues during training, revealing a feature-learning behavior. By extending the model to deeper parameterizations, we further show how additional depth enhances adaptivity and generalization. This study combines insights from feature learning and implicit regularization, providing a new perspective on the adaptivity and generalization potential of neural networks beyond the kernel regime.
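The abstract's core mechanism, jointly training eigenvalue-like scales and output coefficients by gradient descent, can be illustrated on a toy kernel-target mismatch problem. This is a minimal sketch under our own assumptions (cosine features standing in for kernel eigenfunctions, a single mismatched target mode, and made-up names `lam`, `beta`, initialization, and step size), not the paper's implementation:

```python
import numpy as np

# Toy setup: fixed "eigenfunctions" are cosine features on [0, 1];
# the target puts all of its energy on one mode (k = 10), i.e. the
# kind of kernel-target mismatch the abstract discusses.
n, p = 64, 32
x = np.linspace(0.0, 1.0, n)
Phi = np.stack([np.cos(np.pi * k * x) for k in range(p)], axis=1)  # (n, p)
target_coef = np.zeros(p)
target_coef[10] = 1.0
y = Phi @ target_coef

# Diagonal adaptive model: f(x) = sum_k lam_k * beta_k * phi_k(x),
# with BOTH the scales lam and the coefficients beta trained end to end.
lam = np.full(p, 0.5)   # hypothetical initialization of the scales
beta = np.zeros(p)
lr = 0.05
for _ in range(5000):
    resid = Phi @ (lam * beta) - y
    g = Phi.T @ resid / n          # shared gradient factor
    beta_new = beta - lr * lam * g  # update beta holding lam fixed
    lam = lam - lr * beta * g       # update lam holding beta fixed
    beta = beta_new

# The learned effective weights lam * beta concentrate on the
# target's active mode.
print(int(np.argmax(np.abs(lam * beta))))  # → 10, the mismatched mode
```

Because the gradients of `lam` and `beta` are coupled through their product, the scale on the target's active mode grows during training while inactive modes stay near zero, which is the eigenvalue-learning (feature-learning) behavior the abstract describes.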
Problem

Research questions and friction points this paper is trying to address.

Adaptive Kernel Models
Diagonal Adaptation
Self-learning Capability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diagonal Adaptive Kernel Model
Self-adjusting Kernel Eigenvalues
Enhanced Generalization and Adaptability