Radon-Nikodým Derivative: Re-imagining Anomaly Detection from a Measure Theoretic Perspective

📅 2025-02-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
The paper proposes RN-Loss, a measure-theoretic method that multiplies the loss function by the Radon-Nikodým derivative to improve anomaly detection performance, and validates its advantage on datasets from multiple domains.

📝 Abstract
Which principle underpins the design of an effective anomaly detection loss function? The answer lies in the Radon-Nikodým theorem, a fundamental result in measure theory. The key insight is: multiplying the vanilla loss function by the Radon-Nikodým derivative improves performance across the board. We refer to this as RN-Loss. This is established using PAC learnability of anomaly detection. We further show that the Radon-Nikodým derivative offers important insights into unsupervised clustering-based anomaly detection as well. We evaluate our algorithm on 96 datasets, including univariate and multivariate data from diverse domains, including healthcare, cybersecurity, and finance. We show that RN-derivative algorithms outperform state-of-the-art methods on 68% of multivariate datasets (based on F1 scores) and achieve peak F1 scores on 72% of time-series (univariate) datasets.
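The abstract's central recipe is to weight the vanilla per-sample loss by the Radon-Nikodým derivative dP/dQ between two distributions. This listing does not say how the derivative is estimated in practice; the sketch below is only one plausible instantiation, assuming dP/dQ is approximated by a ratio of Gaussian kernel density estimates. All function names, the KDE choice, and the bandwidth parameter are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kde(points, queries, bandwidth=1.0):
    """Isotropic Gaussian kernel density estimate of `points`, evaluated at `queries`."""
    diff = queries[:, None, :] - points[None, :, :]        # (n_query, n_points, dim)
    sq_dist = np.sum(diff * diff, axis=-1)                 # squared Euclidean distances
    kernel = np.exp(-sq_dist / (2.0 * bandwidth ** 2))
    norm = (2.0 * np.pi * bandwidth ** 2) ** (points.shape[1] / 2)
    return kernel.mean(axis=1) / norm                      # average kernel mass per query

def rn_loss(base_loss, x, p_samples, q_samples, bandwidth=1.0, eps=1e-12):
    """Weight a vanilla per-sample loss by an estimated density ratio dP/dQ.

    This approximates the Radon-Nikodym derivative with a KDE ratio,
    which is an assumption of this sketch, not the paper's estimator.
    """
    dp = gaussian_kde(p_samples, x, bandwidth)             # estimated density under P
    dq = gaussian_kde(q_samples, x, bandwidth)             # estimated density under Q
    return base_loss * (dp / (dq + eps))                   # RN-weighted loss per sample
```

Points lying in high-density regions of P relative to Q receive a large weight, so the training signal concentrates where the two measures disagree, which is the intuition the abstract's "multiply the vanilla loss by the derivative" claim suggests.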
Problem

Research questions and friction points this paper is trying to address.

Designing a principled loss function for anomaly detection
Applying the Radon-Nikodým derivative to improve detection performance
Matching or beating state-of-the-art methods on multivariate and univariate datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

RN-Loss: weighting the vanilla loss by the Radon-Nikodým derivative
RN-derivative insights extend to unsupervised clustering-based detection
PAC learnability analysis grounds the RN-Loss guarantee