Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data

📅 2025-02-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the challenge of jointly modeling multi-scale static topological structures and dynamic evolutionary patterns in time-varying data. The authors propose Quasi Zigzag Persistent Homology (QZPH), a novel framework that introduces a theoretically stable and computationally feasible multiparameter zigzag-type topological invariant. QZPH unifies the characterization of static topology across scales and of temporal evolution patterns via a zigzag filtration construction, an index lattice reduction, and an efficient boundary matrix algorithm. Unlike conventional persistent homology, QZPH explicitly integrates the temporal dimension with multi-resolution geometric information. In sleep staging, a canonical time-series classification task, QZPH improves both classification accuracy and generalization across diverse machine learning models. Empirical results demonstrate its capacity to capture the intrinsic evolutionary dynamics of real-world time-series data.
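The paper's computational core is described only at a high level (the QZPH-specific index lattice reduction is not detailed here). As background, the summary's "boundary matrix algorithm" refers to the family of column-reduction algorithms used throughout persistent homology. A minimal sketch of the classical reduction over Z/2, on an illustrative toy filtration that is not taken from the paper:

```python
# Classical left-to-right column reduction of a boundary matrix over Z/2,
# the standard persistence algorithm that zigzag/multiparameter variants
# such as QZPH generalize. Columns are stored as sets of row indices;
# simplices are assumed listed in filtration order.

def low(col):
    """Largest row index of a nonzero entry, or None for an empty column."""
    return max(col) if col else None

def reduce_boundary(columns):
    """Reduce columns left to right over Z/2; return the reduced columns."""
    cols = [set(c) for c in columns]
    lows = {}  # maps a low index to the column that currently owns it
    for j in range(len(cols)):
        # While another column to the left has the same lowest entry,
        # add it in (symmetric difference = addition mod 2).
        while cols[j] and low(cols[j]) in lows:
            cols[j] ^= cols[lows[low(cols[j])]]
        if cols[j]:
            lows[low(cols[j])] = j
    return cols

# Toy filtration: vertices a(0), b(1), c(2); edges ab(3), bc(4), ac(5);
# triangle abc(6). Each boundary column lists the indices of the faces.
boundary = [set(), set(), set(),      # vertices have empty boundary
            {0, 1}, {1, 2}, {0, 2},   # edges
            {3, 4, 5}]                # triangle

reduced = reduce_boundary(boundary)
# Persistence pairs (birth, death) are read off as (low(col_j), j).
pairs = [(low(c), j) for j, c in enumerate(reduced) if c]
print(pairs)  # [(1, 3), (2, 4), (5, 6)]
```

Here edges 3 and 4 kill the components born at vertices 1 and 2, edge 5's column reduces to zero (it creates the cycle), and the triangle 6 kills that cycle. Zigzag and multiparameter settings replace the single linear filtration order with alternating inclusion directions and additional scale parameters, which is where QZPH's index lattice reduction comes in.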

📝 Abstract
In this paper, we propose Quasi Zigzag Persistent Homology (QZPH) as a framework for analyzing time-varying data by integrating multiparameter persistence and zigzag persistence. To this end, we introduce a stable topological invariant that captures both static and dynamic features at different scales, and we present an algorithm to compute it efficiently. We show that it enhances machine learning models when applied to tasks such as sleep-stage detection, demonstrating its effectiveness in capturing evolving patterns in time-varying datasets.
Problem

Research questions and friction points this paper is trying to address.

Proposes Quasi Zigzag Persistent Homology for time-varying data analysis
Introduces a stable topological invariant for static and dynamic features
Demonstrates effectiveness in enhancing machine learning models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates multiparameter and zigzag persistence
Introduces stable topological invariant
Efficient algorithm for invariant computation