FISFormer: Replacing Self-Attention with a Fuzzy Inference System in Transformer Models for Time Series Forecasting

πŸ“… 2026-03-23
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work proposes FISFormer, a novel Transformer architecture that integrates fuzzy inference systems (FIS) to address the limitations of conventional dot-product attention in time series forecasting. Traditional Transformers rely on deterministic attention mechanisms, which struggle to capture uncertainty and complex nonlinear dependencies. In contrast, FISFormer replaces self-attention with learnable membership functions and interpretable fuzzy rules, computing query-key weights through fuzzy interactions along the feature dimension. These weights are normalized via Softmax and used to aggregate value features, yielding contextually enriched representations. By inherently modeling uncertainty while maintaining high interpretability, FISFormer achieves state-of-the-art performance across multiple benchmark datasets, demonstrating significant improvements in prediction accuracy, robustness to noise, and model transparency compared to existing Transformer variants.

πŸ“ Abstract
Transformers have achieved remarkable progress in time series forecasting, yet their reliance on deterministic dot-product attention limits their capacity to model uncertainty and nonlinear dependencies across multivariate temporal dimensions. To address this limitation, we propose FISFormer, a Fuzzy Inference System-driven Transformer that replaces conventional attention with a FIS Interaction mechanism. In this framework, each query-key pair undergoes a fuzzy inference process for every feature dimension, where learnable membership functions and rule-based reasoning estimate token-wise relational strengths. These FIS-derived interaction weights capture uncertainty and provide interpretable, continuous mappings between tokens. A softmax operation is applied along the token axis to normalize these weights, which are then combined with the corresponding value features through element-wise multiplication to yield the final context-enhanced token representations. This design fuses the interpretability and uncertainty modeling of fuzzy logic with the representational power of Transformers. Extensive experiments on multiple benchmark datasets demonstrate that FISFormer achieves superior forecasting accuracy, noise robustness, and interpretability compared to state-of-the-art Transformer variants, establishing fuzzy inference as an effective alternative to conventional attention mechanisms.
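The abstract describes the FIS Interaction mechanism only at a high level, so the following is a minimal NumPy sketch of the idea, not the authors' implementation: the specific membership functions (Gaussian here), the rule-aggregation operator (averaged firing strengths), and the parameter shapes are all assumptions made for illustration. The sketch fuzzifies each query-key difference along every feature dimension, aggregates rule firings into per-feature relational strengths, normalizes them with a softmax over the key axis, and combines them element-wise with the value features.

```python
import numpy as np

def fis_interaction(Q, K, V, centers, widths):
    """Sketch of a FIS-style interaction layer (hypothetical details).

    Q, K, V: (n_tokens, d) arrays.
    centers, widths: (n_rules,) parameters of Gaussian membership
    functions (learnable in the actual model; fixed constants here).
    Returns context-enhanced token representations of shape (n_tokens, d).
    """
    # Fuzzify the query-key difference along each feature dimension:
    # membership degree of (q_if - k_jf) in each fuzzy set, via Gaussian MFs.
    diff = Q[:, None, :] - K[None, :, :]                      # (n, n, d)
    mu = np.exp(-((diff[..., None] - centers) ** 2)
                / (2.0 * widths ** 2))                        # (n, n, d, r)

    # Rule aggregation (assumed): average firing strength over the rules
    # gives a per-feature relational strength for every query-key pair.
    strength = mu.mean(axis=-1)                               # (n, n, d)

    # Softmax along the token (key) axis, as described in the abstract.
    w = np.exp(strength - strength.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)                      # (n, n, d)

    # Element-wise combination with the value features, summed over keys.
    return (w * V[None, :, :]).sum(axis=1)                    # (n, d)
```

Because the weights are computed independently per feature dimension, each output feature attends over tokens with its own fuzzy-derived distribution, which is one plausible reading of "fuzzy interactions along the feature dimension".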
Problem

Research questions and friction points this paper is trying to address.

time series forecasting
uncertainty modeling
nonlinear dependencies
self-attention
multivariate temporal data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fuzzy Inference System
Transformer
Time Series Forecasting
Uncertainty Modeling
Interpretable Attention
Bulent Haznedar
Computer Engineering Department, Gaziantep University, Gaziantep, TΓΌrkiye
Levent Karacan
Assistant Professor of Computer Engineering, Gaziantep University
Computer Vision, Machine Learning, Artificial Intelligence