Fusing Global and Local: Transformer-CNN Synergy for Next-Gen Current Estimation

📅 2025-04-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional signal-line current waveform prediction relies on fixed simplified models and SPICE-based iterative simulation, suffering from low accuracy and high computational cost. To address this, this paper proposes a model-free, simulation-free end-to-end deep learning method. Our approach innovatively integrates Transformer architectures—capable of capturing long-range temporal dependencies—with CNNs—effective at extracting local waveform features—and introduces hardware-aware temporal embeddings to directly predict high-fidelity current responses from input excitations. Crucially, the method eliminates Newton-Raphson iterations and equivalent-circuit modeling entirely. Evaluated across technology nodes from 40 nm to 3 nm, it achieves a mean prediction error of only 0.0098 and accelerates computation by over 1000× compared to SPICE. This enables efficient, high-accuracy power-timing co-analysis in modern VLSI design.

📝 Abstract
This paper presents a hybrid model combining Transformer and CNN for predicting the current waveform in signal lines. Unlike traditional approaches such as current source models, driver linear representations, waveform functional fitting, or equivalent load capacitance methods, our model does not rely on fixed simplified models of standard-cell drivers or RC loads. Instead, it replaces the complex Newton iteration process used in traditional SPICE simulations, leveraging the powerful sequence modeling capabilities of the Transformer framework to directly predict current responses without iterative solving steps. The hybrid architecture effectively integrates the global feature-capturing ability of Transformers with the local feature extraction advantages of CNNs, significantly improving the accuracy of current waveform predictions. Experimental results demonstrate that, compared to traditional SPICE simulations, the proposed algorithm achieves an error of only 0.0098. These results highlight the algorithm's superior capabilities in predicting signal line current waveforms, timing analysis, and power evaluation, making it suitable for a wide range of technology nodes, from 40 nm to 3 nm.
Problem

Research questions and friction points this paper is trying to address.

Fixed simplified driver and RC-load models limit the accuracy of current waveform prediction
SPICE-based Newton-Raphson iteration is accurate but computationally expensive
Timing and power analysis must stay accurate across technology nodes from 40 nm to 3 nm
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid Transformer-CNN model for direct current waveform prediction
Replaces SPICE Newton-Raphson iterations with end-to-end sequence modeling
Combines global Transformer context with local CNN feature extraction
Introduces hardware-aware temporal embeddings
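The fusion idea behind these contributions can be illustrated with a minimal NumPy sketch: a 1D convolution extracts local waveform features, single-head self-attention mixes global context across the full sequence, and the two feature streams are concatenated into a per-timestep current prediction. All layer sizes, parameter names, and the `predict_current` helper are illustrative assumptions, not the paper's actual architecture (which also includes hardware-aware temporal embeddings, omitted here), and the random weights are untrained.

```python
import numpy as np

def conv1d(x, w):
    # Local feature extractor (CNN branch).
    # x: (T, C_in), w: (k, C_in, C_out); "same" zero padding, ReLU activation.
    k, c_in, c_out = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros((x.shape[0], c_out))
    for t in range(out.shape[0]):
        out[t] = np.einsum("kc,kco->o", xp[t:t + k], w)
    return np.maximum(out, 0.0)

def self_attention(x, wq, wk, wv):
    # Global context mixer (Transformer branch): single-head
    # scaled dot-product attention over the whole sequence.
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ v

def predict_current(excitation, params):
    # excitation: (T, 1) input voltage samples -> (T,) predicted current,
    # produced in one forward pass, with no iterative solving.
    local = conv1d(excitation, params["w_conv"])
    glob = self_attention(local, params["wq"], params["wk"], params["wv"])
    fused = np.concatenate([local, glob], axis=-1)  # fuse local + global
    return (fused @ params["w_out"]).squeeze(-1)

rng = np.random.default_rng(0)
d = 8
params = {
    "w_conv": rng.normal(scale=0.1, size=(5, 1, d)),
    "wq": rng.normal(scale=0.1, size=(d, d)),
    "wk": rng.normal(scale=0.1, size=(d, d)),
    "wv": rng.normal(scale=0.1, size=(d, d)),
    "w_out": rng.normal(scale=0.1, size=(2 * d, 1)),
}
T = 64
excitation = np.sin(np.linspace(0, 4 * np.pi, T)).reshape(T, 1)
current = predict_current(excitation, params)
print(current.shape)  # (64,)
```

The design choice mirrored here is that attention gives every timestep a view of the entire excitation (long-range dependencies such as slow supply droop), while the convolution captures sharp local transitions; concatenating the two streams before the output projection is one simple way to fuse them.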