Interaction Topological Transformer for Multiscale Learning in Porous Materials

📅 2025-09-22
🤖 AI Summary
Porous material structure–property relationships exhibit multiscale characteristics, arising from the interplay between local chemical environments and global pore topology, while labeled data are sparse and imbalanced, leading to poor generalization and limited transferability across material families. To address this, we propose a data-efficient unified prediction framework built on: (i) a novel interaction topological representation that integrates graph neural networks with multiscale feature encoding; (ii) Transformer-based attention mechanisms that capture long-range structural dependencies; and (iii) a two-stage paradigm of self-supervised pretraining followed by supervised fine-tuning. Pretrained on large-scale unlabeled porous material datasets, the framework achieves state-of-the-art performance in predicting key properties, including adsorption capacity, diffusion coefficient, and thermodynamic stability, outperforming existing methods, especially in low-data regimes and across diverse material families.
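To make the attention-based fusion concrete, here is a minimal NumPy sketch of self-attention over one embedding token per scale (structural, elemental, atomic, pairwise-elemental). This is an illustration only, not the paper's implementation: the function `fuse_scales`, the random projection matrices standing in for learned `W_q`, `W_k`, `W_v`, and all dimensions are invented for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_scales(tokens, d_k=8, seed=0):
    """Single-head self-attention across scale tokens.

    tokens: (n_scales, d) array, one embedding per scale.
    Random matrices stand in for the learned projections W_q, W_k, W_v.
    """
    rng = np.random.default_rng(seed)
    d = tokens.shape[1]
    Wq, Wk, Wv = (rng.normal(size=(d, d_k)) for _ in range(3))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    # Scaled dot-product attention: every scale attends to every other,
    # which is how long-range, cross-scale dependencies are captured.
    weights = softmax(Q @ K.T / np.sqrt(d_k))   # (n_scales, n_scales)
    return weights @ V, weights

# Four hypothetical scale tokens with 16-dimensional embeddings.
scales = np.random.default_rng(1).normal(size=(4, 16))
fused, weights = fuse_scales(scales)
```

A real model would use multi-head attention with learned projections (e.g. a stack of Transformer encoder layers) rather than fixed random ones; the sketch only shows how per-scale tokens are mixed into a joint representation.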

📝 Abstract
Porous materials exhibit vast structural diversity and support critical applications in gas storage, separations, and catalysis. However, predictive modeling remains challenging due to the multiscale nature of structure-property relationships, where performance is governed by both local chemical environments and global pore-network topology. These complexities, combined with sparse and unevenly distributed labeled data, hinder generalization across material families. We propose the Interaction Topological Transformer (ITT), a unified data-efficient framework that leverages novel interaction topology to capture materials information across multiple scales and multiple levels, including structural, elemental, atomic, and pairwise-elemental organization. ITT extracts scale-aware features that reflect both compositional and relational structure within complex porous frameworks, and integrates them through a built-in Transformer architecture that supports joint reasoning across scales. Trained using a two-stage strategy, i.e., self-supervised pretraining on 0.6 million unlabeled structures followed by supervised fine-tuning, ITT achieves state-of-the-art, accurate, and transferable predictions for adsorption, transport, and stability properties. This framework provides a principled and scalable path for learning-guided discovery in structurally and chemically diverse porous materials.
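The two-stage strategy can be sketched schematically: a representation is first learned from plentiful unlabeled data via a reconstruction objective, then frozen while a small supervised head is fit on scarce labels. The code below is a toy stand-in, not ITT itself: a PCA-style reconstruction replaces the paper's self-supervised task, least squares replaces the fine-tuned head, and all data and dimensions are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stage 1: self-supervised "pretraining" on plentiful unlabeled data.
# A PCA-style reconstruction objective (truncated SVD) stands in for the
# paper's pretraining task; 600 rows stand in for 0.6M structures.
X_unlab = rng.normal(size=(600, 16))
_, _, Vt = np.linalg.svd(X_unlab, full_matrices=False)
encoder = Vt[:8].T                      # (16, 8) frozen representation

# Stage 2: supervised fine-tuning of a small head on scarce labels.
X_lab = rng.normal(size=(30, 16))       # only 30 labeled examples
y = X_lab @ rng.normal(size=16)         # synthetic property labels
Z = X_lab @ encoder                     # encode with the frozen features
head, *_ = np.linalg.lstsq(Z, y, rcond=None)

pred = (X_lab @ encoder) @ head         # property predictions
```

The design point the sketch captures is data efficiency: the 8-dimensional head has far fewer parameters to fit from the 30 labels than a model trained from scratch would, because the encoder was already shaped by the unlabeled corpus.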
Problem

Research questions and friction points this paper is trying to address.

Modeling porous materials' structure-property relationships across multiple scales
Addressing sparse and unevenly distributed labeled data for material families
Capturing both local chemical environments and global pore-network topology
Innovation

Methods, ideas, or system contributions that make the work stand out.

Interaction Topological Transformer (ITT) for multiscale materials learning
Two-stage training with self-supervised pretraining and fine-tuning
Built-in Transformer integrates compositional and relational features