🤖 AI Summary
Existing EDA representation-learning research focuses primarily on gate-level digital circuits and cannot model the device-level characteristics essential to analog and mixed-signal design. Method: We propose DICE, the first self-supervised graph neural network tailored to device-level circuit schematics, unifying representation learning across the analog, digital, and mixed-signal domains. Our approach (1) constructs device-level circuit topology graphs, (2) introduces two SPICE-simulation-agnostic graph augmentation strategies, and (3) pretrains a message-passing neural network with contrastive learning, without requiring any circuit simulation. Contribution/Results: DICE achieves significant performance gains across three downstream EDA tasks, demonstrating robust cross-domain generalization. This work establishes the first self-supervised graph representation learning framework for device-level circuits, offering a scalable, simulation-free foundation-model paradigm for EDA.
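Step (1) of the method, building a device-level topology graph from a schematic, can be illustrated with a minimal sketch. The SPICE-style netlist format and the bipartite device/net node typing below are assumptions for illustration, not DICE's actual graph construction:

```python
# Hypothetical sketch: turn a SPICE-style netlist into a device-level
# topology graph with device nodes and net nodes; each edge connects a
# device to a net one of its terminals touches.

def netlist_to_graph(netlist):
    """Build a bipartite graph from element lines like 'M1 out in vdd vdd pmos'."""
    nodes, edges = {}, []
    for line in netlist.strip().splitlines():
        parts = line.split()
        device, nets = parts[0], parts[1:-1]  # last token: model/value
        nodes[device] = "device"
        for net in nets:
            nodes.setdefault(net, "net")
            edges.append((device, net))
    return nodes, edges

# A two-transistor inverter as a toy example.
netlist = """
M1 out in vdd vdd pmos
M2 out in gnd gnd nmos
"""
nodes, edges = netlist_to_graph(netlist)
```

The resulting node set and edge list are exactly what a message-passing neural network consumes, one embedding per node, with messages exchanged along device-net edges.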
📝 Abstract
Self-supervised graph representation learning has driven significant advancements in domains such as social network analysis, molecular design, and electronics design automation (EDA). However, prior works in EDA have mainly focused on the representation of gate-level digital circuits, failing to capture analog and mixed-signal circuits. To address this gap, we introduce DICE: Device-level Integrated Circuits Encoder, the first self-supervised pretrained graph neural network (GNN) model for any circuit expressed at the device level. DICE is a message-passing neural network (MPNN) trained through graph contrastive learning, and its pretraining process is simulation-free, incorporating two novel data augmentation techniques. Experimental results demonstrate that DICE achieves substantial performance gains across three downstream tasks, underscoring its effectiveness for both analog and digital circuits.
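The abstract does not detail the two augmentation techniques, but the general recipe of simulation-free graph contrastive learning can be sketched with a common generic augmentation, random edge dropping, which produces two perturbed "views" of the same circuit graph to serve as a positive pair. This is an illustrative stand-in, not DICE's actual augmentation scheme:

```python
import random

def drop_edges(edges, p, seed=0):
    """Randomly drop a fraction p of edges to create a corrupted 'view' of
    the graph; a standard, simulation-free contrastive augmentation.
    (Illustrative only; not DICE's actual augmentation.)"""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() >= p]

# Two stochastic views of the same device-level graph form a positive pair
# for contrastive pretraining; views of different circuits are negatives.
edges = [("M1", "out"), ("M1", "in"), ("M2", "out"), ("M2", "in")]
view_a = drop_edges(edges, p=0.5, seed=1)
view_b = drop_edges(edges, p=0.5, seed=2)
```

Because the augmentation only perturbs topology, no SPICE simulation is ever invoked during pretraining, which is what makes the pipeline scalable to large unlabeled schematic corpora.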