TENDE: Transfer Entropy Neural Diffusion Estimation

📅 2025-10-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing transfer entropy estimation methods suffer from the curse of dimensionality, strong distributional assumptions, and high sample complexity. This paper introduces the first application of score-based diffusion models to transfer entropy estimation. By parameterizing the score function with neural networks, the approach implicitly models high-dimensional conditional distributions, enabling end-to-end, distribution-free estimation of transfer entropy. Crucially, it circumvents explicit density modeling and reduces reliance on large sample sizes, significantly improving scalability and robustness for directed information flow estimation in high-dimensional time series. Extensive experiments on synthetic and real-world datasets demonstrate that the method consistently outperforms state-of-the-art neural estimators and classical approaches in accuracy, stability, and small-sample performance. The proposed framework establishes a novel paradigm for information flow analysis in complex dynamical systems.

📝 Abstract
Transfer entropy measures directed information flow in time series and has become a fundamental quantity in applications spanning neuroscience, finance, and complex systems analysis. However, existing estimation methods suffer from the curse of dimensionality, require restrictive distributional assumptions, or need exponentially large datasets for reliable convergence. We address these limitations by proposing TENDE (Transfer Entropy Neural Diffusion Estimation), a novel approach that leverages score-based diffusion models to estimate transfer entropy through conditional mutual information. By learning score functions of the relevant conditional distributions, TENDE provides flexible, scalable estimation while making minimal assumptions about the underlying data-generating process. We demonstrate superior accuracy and robustness compared to existing neural estimators and other state-of-the-art approaches across synthetic benchmarks and real data.
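For reference, the quantity being estimated admits the standard conditional-mutual-information form (with history lengths $k$ and $l$; the notation below is the conventional definition, not a formula taken from this paper):

$$
\mathrm{TE}_{X \to Y} \;=\; I\!\left(Y_t \,;\, X_{t-1:t-k} \,\middle|\, Y_{t-1:t-l}\right)
\;=\; H\!\left(Y_t \mid Y_{t-1:t-l}\right) \;-\; H\!\left(Y_t \mid Y_{t-1:t-l},\, X_{t-1:t-k}\right),
$$

i.e. the reduction in uncertainty about $Y_t$ obtained by conditioning on the past of $X$ in addition to the past of $Y$.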
Problem

Research questions and friction points this paper is trying to address.

Estimating transfer entropy with minimal distributional assumptions
Overcoming dimensionality curse in neural information flow measurement
Providing scalable transfer entropy estimation using diffusion models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses score-based diffusion models for estimation
Learns score functions of conditional distributions
Provides flexible scalable transfer entropy estimation
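For intuition about what TENDE estimates, here is a classical histogram plug-in estimator of transfer entropy for discrete series with history length 1. This is a minimal illustrative sketch of one of the baseline-style approaches the paper aims to improve upon, not the diffusion-based method itself:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of TE_{X->Y} with history length 1:
    TE = sum_{y_t, y_p, x_p} p(y_t, y_p, x_p) * log[ p(y_t | y_p, x_p) / p(y_t | y_p) ],
    where y_p = y_{t-1} and x_p = x_{t-1}.
    """
    # Collect (y_t, y_{t-1}, x_{t-1}) triples and their empirical counts.
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_yyx = Counter(triples)                                # (y_t, y_p, x_p)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)       # (y_p, x_p)
    c_yy = Counter((yt, yp) for yt, yp, _ in triples)       # (y_t, y_p)
    c_y = Counter(yp for _, yp, _ in triples)               # (y_p,)

    te = 0.0
    for (yt, yp, xp), c in c_yyx.items():
        p_joint = c / n
        cond_full = c / c_yx[(yp, xp)]          # p(y_t | y_p, x_p)
        cond_marg = c_yy[(yt, yp)] / c_y[yp]    # p(y_t | y_p)
        te += p_joint * np.log(cond_full / cond_marg)
    return te / np.log(base)
```

On a toy system where `y` copies `x` with a one-step lag, this estimator recovers roughly 1 bit of flow from X to Y and near-zero flow in the reverse direction. Such plug-in estimators break down as dimensionality and history length grow, which is precisely the regime TENDE's score-based approach targets.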