Self-supervised contrastive learning performs non-linear system identification

📅 2024-10-18
🏛️ International Conference on Learning Representations
📈 Citations: 1
Influential: 0
🤖 AI Summary
This paper addresses the challenging problem of identifying latent dynamical systems from non-linear observations. The authors propose Dynamics Contrastive Learning (DCL), a framework that theoretically establishes that self-supervised contrastive learning enables identifiable recovery of latent dynamics. DCL operates in a fully unsupervised manner—requiring neither labels nor prior assumptions about dynamical structure—and disentangles linear, switching linear, and non-linear (including chaotic) latent dynamics by constructing dynamics-consistent positive and negative sample pairs directly from the observed time series. Key contributions include: (1) a rigorous theoretical connection between self-supervised learning and disentanglement of causal generative factors; (2) identifiability guarantees for self-supervised system identification; and (3) high-fidelity reconstruction across diverse dynamical regimes on both synthetic and benchmark dynamical datasets.
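The core mechanism the summary describes—treating the next timestep as the positive sample and random timesteps as negatives, then training with an InfoNCE-style objective—can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the encoder (a fixed random projection), the toy rotation dynamics, and all dimensions and noise levels are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # Hypothetical stand-in for a learned encoder: random projection + tanh.
    return np.tanh(x @ W)

def infonce_loss(z_t, z_pos, z_neg):
    """InfoNCE-style contrastive loss: pull each anchor toward its
    temporal positive, push it away from the random negatives."""
    pos = np.sum(z_t * z_pos, axis=-1)           # (T,) positive similarities
    neg = np.einsum('td,tnd->tn', z_t, z_neg)    # (T, N) negative similarities
    logits = np.concatenate([pos[:, None], neg], axis=1)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits[:, 0] - np.log(np.exp(logits).sum(axis=1))
    return -log_prob.mean()

# Toy latent dynamics: a slow linear rotation, observed through a nonlinearity.
theta = 0.1
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
z = np.zeros((200, 2))
z[0] = [1.0, 0.0]
for t in range(1, 200):
    z[t] = z[t - 1] @ A.T + 0.01 * rng.normal(size=2)
x = np.tanh(z @ rng.normal(size=(2, 5)))         # non-linear observation model

W = rng.normal(size=(5, 8))
z_hat = encode(x, W)
anchors, positives = z_hat[:-1], z_hat[1:]       # next timestep = positive pair
neg_idx = rng.integers(0, len(z_hat), size=(len(anchors), 10))
negatives = z_hat[neg_idx]                       # random timesteps = negatives
loss = infonce_loss(anchors, positives, negatives)
print(np.isfinite(loss))
```

In the actual method the encoder is trained to minimize this loss; the paper's theoretical contribution is showing that, under their assumptions, the minimizer recovers the latent dynamics up to an identifiable transformation.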

📝 Abstract
Self-supervised learning (SSL) approaches have brought tremendous success across many tasks and domains. It has been argued that these successes can be attributed to a link between SSL and identifiable representation learning: Temporal structure and auxiliary variables ensure that latent representations are related to the true underlying generative factors of the data. Here, we deepen this connection and show that SSL can perform system identification in latent space. We propose dynamics contrastive learning, a framework to uncover linear, switching linear and non-linear dynamics under a non-linear observation model, give theoretical guarantees and validate them empirically.
Problem

Research questions and friction points this paper is trying to address.

Can self-supervised learning identify non-linear system dynamics?
How can linear and non-linear dynamics be uncovered in latent space under a non-linear observation model?
What theoretical guarantees hold for a dynamics contrastive learning framework?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamics Contrastive Learning (DCL): self-supervised contrastive learning applied to system identification
Recovery of linear, switching linear, and non-linear latent dynamics from non-linear observations
Identifiability guarantees backed by empirical validation
Rodrigo González Laiz
Institute of Computational Biology, Computational Health Center, Helmholtz Munich, Germany
Tobias Schmidt
Institute of Computational Biology, Computational Health Center, Helmholtz Munich, Germany
Steffen Schneider
Helmholtz Munich
Dynamical Systems · Self-Supervised Learning · Systems Neuroscience · AI for Science