A Representer Theorem for Hawkes Processes via Penalized Least Squares Minimization

πŸ“… 2025-10-09
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This paper addresses the nonparametric estimation of triggering kernels in linear multivariate Hawkes processes, aiming to accurately recover time-varying inter-event interaction structure from asynchronous event sequences. The authors propose a novel RKHS-based nonparametric estimation framework. Its core contribution is an analytical representer theorem: by defining transformed kernels via a system of coupled integral equations, the dual coefficients are guaranteed to be identically one, bypassing iterative optimization and enabling a closed-form solution. The method integrates penalized least squares with kernel smoothing, yielding direct, theoretically grounded estimates of multidimensional triggering kernels. On synthetic benchmarks, the approach achieves predictive accuracy comparable to state-of-the-art methods while substantially improving computational efficiency and scalability, making it well suited to large-scale asynchronous event sequence modeling.
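The structural point of the summary can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's method: the actual transformed kernel solves a system of coupled integral equations that the summary does not spell out, so a simple Gaussian bump is substituted as a hypothetical stand-in, and the event times are synthetic placeholders. What the sketch does show faithfully is the shape of the claimed estimator: a sum of transformed kernels anchored at the observed event times, with every dual coefficient fixed to 1, so no optimization over coefficients is required.

```python
import math

# Hypothetical stand-in for the paper's transformed kernel.
# In the paper this kernel is obtained by solving coupled integral
# equations derived from the base RKHS kernel; a Gaussian bump is
# used here purely for illustration.
def transformed_kernel(t, s, bandwidth=0.5):
    return math.exp(-((t - s) ** 2) / (2 * bandwidth ** 2))

# Synthetic placeholder event times.
event_times = [0.4, 1.1, 1.3, 2.7, 3.0]

# Representer-theorem estimate: a linear combination of transformed
# kernels evaluated at the data points, with all dual coefficients
# analytically fixed to unity -- hence closed form.
def triggering_kernel_estimate(t):
    return sum(transformed_kernel(t, s) for s in event_times)

print(triggering_kernel_estimate(1.2))
```

Because the coefficients are known in advance, evaluating the estimator is a single pass over the events, which is the source of the efficiency claim relative to kernel estimators that must solve for dual coefficients.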

πŸ“ Abstract
The representer theorem is a cornerstone of kernel methods, which aim to estimate latent functions in reproducing kernel Hilbert spaces (RKHSs) in a nonparametric manner. Its significance lies in converting inherently infinite-dimensional optimization problems into finite-dimensional ones over dual coefficients, thereby enabling practical and computationally tractable algorithms. In this paper, we address the problem of estimating the latent triggering kernels (functions that encode the interaction structure between events) for linear multivariate Hawkes processes based on observed event sequences within an RKHS framework. We show that, under the principle of penalized least squares minimization, a novel form of representer theorem emerges: a family of transformed kernels can be defined via a system of simultaneous integral equations, and the optimal estimator of each triggering kernel is expressed as a linear combination of these transformed kernels evaluated at the data points. Remarkably, the dual coefficients are all analytically fixed to unity, obviating the need to solve a costly optimization problem to obtain the dual coefficients. This leads to a highly efficient estimator capable of handling large-scale data more effectively than conventional nonparametric approaches. Empirical evaluations on synthetic datasets reveal that the proposed method attains competitive predictive accuracy while substantially improving computational efficiency over existing state-of-the-art kernel method-based estimators.
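The contrast with the classical representer theorem can be written out explicitly. The notation below is illustrative (the abstract does not fix symbols): $K$ denotes the base RKHS kernel, $\tilde{K}_{ij}$ the transformed kernels defined by the integral-equation system, and $t_k^{(j)}$ the observed event times of component $j$.

```latex
% Classical representer theorem: the minimizer lies in the span of
% kernel sections, but the dual coefficients \alpha_k must still be
% found by solving a finite-dimensional optimization problem:
\hat{f}(t) = \sum_{k} \alpha_k \, K\bigl(t, t_k\bigr).

% Claimed result for the triggering kernel g_{ij} of a linear
% multivariate Hawkes process under penalized least squares: with
% transformed kernels \tilde{K}_{ij} defined via simultaneous
% integral equations, every dual coefficient equals one,
\hat{g}_{ij}(t) = \sum_{k=1}^{N_j} \tilde{K}_{ij}\bigl(t, t_k^{(j)}\bigr),
% so the estimator is available in closed form.
```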
Problem

Research questions and friction points this paper is trying to address.

Estimating latent triggering kernels for multivariate Hawkes processes
Converting infinite-dimensional optimization into finite-dimensional problems
Developing efficient nonparametric estimators via transformed kernel representations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformed kernels defined via integral equations
Optimal estimator as linear combination of transformed kernels
Dual coefficients fixed to unity for efficiency