🤖 AI Summary
Estimating discount curves for multiple fixed-income instruments faces challenges including data sparsity, measurement noise, and poor extrapolation performance. Method: We propose a cross-instrument discount curve transfer learning framework. Our approach introduces a novel vector-valued kernel ridge regression model incorporating economic priors—particularly spread smoothness—and designs a custom regularizer. We establish theoretical foundations for vector-valued reproducing kernel Hilbert space (RKHS) norm decomposition and develop a new Gaussian process–based uncertainty quantification mechanism. Using separable kernel construction and multi-task joint modeling, the framework enables robust knowledge transfer across asset classes, including government bonds and interest rate swaps. Contribution/Results: Experiments demonstrate significantly improved extrapolation accuracy, with average confidence interval width reduced by 37%. The method enhances both robustness and generalizability of discount curve estimation under sparse and noisy market data.
📝 Abstract
We propose a framework for transfer learning of discount curves across different fixed-income product classes. Motivated by challenges in estimating discount curves from sparse or noisy data, we extend kernel ridge regression (KR) to a vector-valued setting, formulating a convex optimization problem in a vector-valued reproducing kernel Hilbert space (RKHS). Each component of the solution corresponds to the discount curve implied by a specific product class. We introduce an additional regularization term motivated by economic principles, promoting smoothness of spread curves between product classes, and show that it leads to a valid separable kernel structure. A main theoretical contribution is a decomposition of the vector-valued RKHS norm induced by separable kernels. We further provide a Gaussian process interpretation of vector-valued KR, enabling quantification of estimation uncertainty. Illustrative examples demonstrate that transfer learning significantly improves extrapolation performance and tightens confidence intervals compared to single-curve estimation.