Signature-Kernel Based Evaluation Metrics for Robust Probabilistic and Tail-Event Forecasting

📅 2026-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing evaluation metrics for probabilistic forecasting fail to account for spatiotemporal dependencies and are insensitive to tail events, undermining the reliability of high-stakes decision-making. To address these limitations, this work proposes the Signature Kernel Maximum Mean Discrepancy (Sig-MMD) and its censored variant (CSig-MMD). Sig-MMD leverages signature kernels to capture complex dependency structures in multivariate time series, while the censored variant enhances sensitivity to tail events through a censoring mechanism, all without compromising the properness of the scoring rule. Together, these metrics form a robust framework for multi-step probabilistic forecast evaluation that handles missing data and substantially improves the assessment of critical tail events.

📝 Abstract
Probabilistic forecasting is increasingly critical across high-stakes domains, from finance and epidemiology to climate science. However, current evaluation frameworks lack a consensus metric and suffer from two critical flaws: they often assume independence across time steps or variables, and they demonstrably lack sensitivity to tail events, the very occurrences that are most pivotal in real-world decision-making. To address these limitations, we propose two kernel-based metrics: the signature maximum mean discrepancy (Sig-MMD) and our novel censored Sig-MMD (CSig-MMD). By leveraging the signature kernel, these metrics capture complex inter-variate and inter-temporal dependencies and remain robust to missing data. Furthermore, CSig-MMD introduces a censoring scheme that prioritizes a forecaster's capability to predict tail events while strictly maintaining properness, a vital property for a good scoring rule. These metrics enable a more reliable evaluation of direct multi-step forecasting, facilitating the development of more robust probabilistic algorithms.
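The core idea of Sig-MMD is to embed each sample path via its signature and compare the forecast distribution to observations with a maximum mean discrepancy in that feature space. The sketch below is a simplified illustration, not the paper's implementation: it uses a depth-2 truncated signature with a linear kernel (the paper uses the full signature kernel), a biased MMD² estimator, and hypothetical function names (`truncated_signature`, `sig_mmd`).

```python
import numpy as np

def truncated_signature(path):
    """Depth-2 truncated signature features of a piecewise-linear path (T, d)."""
    dX = np.diff(path, axis=0)           # increments, shape (T-1, d)
    s1 = dX.sum(axis=0)                  # level 1: total increment x_T - x_0
    # level 2: iterated integral, discretized as
    #   sum_{s<t} dX_s (x) dX_t + 0.5 * sum_t dX_t (x) dX_t
    prev = np.cumsum(dX, axis=0) - dX    # prev[t] = sum_{s<t} dX_s
    s2 = np.einsum("ti,tj->ij", prev, dX) + 0.5 * np.einsum("ti,tj->ij", dX, dX)
    return np.concatenate([s1, s2.ravel()])

def sig_mmd(paths_p, paths_q):
    """Biased MMD^2 estimate with a linear kernel on truncated signatures."""
    fp = np.stack([truncated_signature(p) for p in paths_p])
    fq = np.stack([truncated_signature(q) for q in paths_q])
    diff = fp.mean(axis=0) - fq.mean(axis=0)
    return float(diff @ diff)
```

Note that signatures depend only on path increments, so `sig_mmd` is invariant to constant shifts of the paths; the censoring in CSig-MMD would be applied as a transform of the paths before the signature step, though the paper's exact censoring scheme is not reproduced here.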
Problem

Research questions and friction points this paper is trying to address.

probabilistic forecasting
evaluation metrics
tail events
time dependence
proper scoring
Innovation

Methods, ideas, or system contributions that make the work stand out.

signature kernel
maximum mean discrepancy
tail-event forecasting
proper scoring rules
censored evaluation