🤖 AI Summary
This paper addresses the nonparametric estimation of nonlinear multivariate Hawkes processes, where the interaction functions lie in a reproducing kernel Hilbert space (RKHS) and the conditional intensity is rectified by the ReLU function, allowing the model to jointly capture excitation, inhibition, and combinations of both (e.g., neuronal refractoriness). To tackle the non-differentiability and non-convexity induced by the ReLU, we establish, for the first time, an approximate representer theorem for ReLU-rectified Hawkes processes. Building on it, we propose a provably convergent two-step RKHS-based estimation method: first approximating the ReLU nonlinearity, then approximating the integral operator. Our theoretical analysis provides upper bounds on the statistical error of these approximations and guarantees asymptotic consistency. Experiments on synthetic data and real neuronal spike trains demonstrate that our method achieves significantly higher accuracy than existing nonparametric Hawkes estimators, while maintaining statistical robustness and modeling flexibility.
📝 Abstract
This paper addresses the nonparametric estimation of nonlinear multivariate Hawkes processes, where the interaction functions are assumed to lie in a reproducing kernel Hilbert space (RKHS). Motivated by applications in neuroscience, the model allows complex interaction functions that express excitatory effects, inhibitory effects, or a combination of both (particularly relevant for modeling the refractory period of neurons), and in return assumes that the conditional intensities are rectified by the ReLU function. This last feature raises several methodological challenges, for which workarounds are proposed in this paper. In particular, it is shown that a representer theorem can be obtained for approximated versions of the log-likelihood and least-squares criteria. Based on it, we propose an estimation method that relies on two simple approximations (of the ReLU function and of the integral operator). We provide an approximation bound justifying the negligible statistical effect of these approximations. Numerical results on synthetic data confirm this fact, as well as the good asymptotic behavior of the proposed estimator. They also show that our method outperforms related nonparametric estimation techniques and is well suited to neuronal applications.
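To make the model concrete, the sketch below evaluates the ReLU-rectified conditional intensity λ_i(t) = max(0, μ_i + Σ_j Σ_{t_k < t} h_ij(t − t_k)) of a multivariate Hawkes process. This is only an illustration of the model class, not the paper's RKHS estimator: the exponential kernels, baselines, and event times are hypothetical choices, and the function name `relu_intensity` is ours.

```python
import numpy as np

def relu_intensity(t, mu, kernels, events):
    """Conditional intensity of a ReLU-rectified multivariate Hawkes process.

    lambda_i(t) = max(0, mu_i + sum_j sum_{t_k in events[j], t_k < t} h_ij(t - t_k))

    mu      : baseline rates, length d
    kernels : d x d nested list of interaction functions h_ij (negative values
              encode inhibition)
    events  : list of d arrays of past event times, one per dimension
    """
    d = len(mu)
    lam = np.array(mu, dtype=float)
    for i in range(d):
        for j in range(d):
            past = events[j][events[j] < t]
            lam[i] += sum(kernels[i][j](t - s) for s in past)
    return np.maximum(lam, 0.0)  # ReLU rectification keeps intensities nonnegative

# Illustrative 2-dimensional example with one excitatory and one inhibitory kernel
mu = [0.5, 0.3]
exc = lambda u: 0.8 * np.exp(-2.0 * u)    # excitatory kernel (positive)
inh = lambda u: -0.6 * np.exp(-1.5 * u)   # inhibitory kernel (negative)
kernels = [[exc, inh], [inh, exc]]
events = [np.array([0.1, 0.4]), np.array([0.2])]
lam = relu_intensity(0.5, mu, kernels, events)
```

With these toy values, the inhibitory contribution drives the second raw intensity below zero, and the ReLU clips it to 0, the "hybrid" behavior the model is designed to express.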