Self-Supervised Graph Contrastive Pretraining for Device-level Integrated Circuits

📅 2025-02-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing EDA research primarily targets gate-level digital circuits and cannot model the device-level characteristics essential for analog and mixed-signal design. Method: We propose DICE, the first self-supervised graph neural network tailored for device-level circuit schematics, unifying representation learning across the analog, digital, and mixed-signal domains. Our approach (1) constructs device-level circuit topology graphs, (2) introduces two SPICE-simulation-agnostic graph augmentation strategies, and (3) pretrains a message-passing neural network with contrastive learning, without requiring any circuit simulations. Contribution/Results: DICE achieves significant performance gains across three downstream EDA tasks, demonstrating robust cross-domain generalization. This work establishes the first self-supervised graph representation learning framework for device-level circuits, offering a scalable, simulation-free foundation-model paradigm for EDA.
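The pipeline above (graph construction, simulation-free augmentation, MPNN encoding, contrastive pretraining) can be sketched in a minimal NumPy example. Note the assumptions: the paper does not name its two augmentation strategies, so random edge dropping below is a generic stand-in, not DICE's actual method; the toy graphs, feature dimensions, and two-layer tanh encoder are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_feat, n_hid = 6, 4, 8

def message_pass(A, X, W):
    """One MPNN layer: sum neighbor features (with self-loops), then linear + tanh."""
    H = (A + np.eye(A.shape[0])) @ X
    return np.tanh(H @ W)

def encode(A, X, W1, W2):
    """Two message-passing layers, mean-pooled into a single graph embedding."""
    return message_pass(A, message_pass(A, X, W1), W2).mean(axis=0)

def drop_edges(A, p, rng):
    """Generic simulation-free augmentation: randomly drop a fraction p of edges
    (illustrative only -- not the paper's actual augmentation strategies)."""
    keep = np.triu(rng.random(A.shape) > p, 1)
    return A * (keep + keep.T)

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss: two views of the same graph are positives."""
    z = np.vstack([z1, z2])
    z = z / (np.linalg.norm(z, axis=1, keepdims=True) + 1e-9)
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)           # exclude self-similarity
    n = z1.shape[0]
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -logp[np.arange(2 * n), targets].mean()

def random_graph(rng, n):
    """Toy stand-in for a device-level circuit topology graph."""
    A = np.triu((rng.random((n, n)) < 0.4).astype(float), 1)
    return A + A.T

# A batch of 4 toy "circuits": adjacency + random device features.
graphs = [(random_graph(rng, n_nodes), rng.standard_normal((n_nodes, n_feat)))
          for _ in range(4)]
W1 = rng.standard_normal((n_feat, n_hid)) * 0.1
W2 = rng.standard_normal((n_hid, n_hid)) * 0.1

# Two independently augmented views per graph, encoded and contrasted.
z1 = np.stack([encode(drop_edges(A, 0.2, rng), X, W1, W2) for A, X in graphs])
z2 = np.stack([encode(drop_edges(A, 0.2, rng), X, W1, W2) for A, X in graphs])
loss = nt_xent(z1, z2)
print(f"contrastive loss: {loss:.4f}")
```

In actual pretraining the encoder weights would be trained by gradient descent on this loss; the sketch only shows one forward pass to make the positive-pair construction concrete.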

📝 Abstract
Self-supervised graph representation learning has driven significant advancements in domains such as social network analysis, molecular design, and electronics design automation (EDA). However, prior works in EDA have mainly focused on the representation of gate-level digital circuits, failing to capture analog and mixed-signal circuits. To address this gap, we introduce DICE: Device-level Integrated Circuits Encoder, the first self-supervised pretrained graph neural network (GNN) model for any circuit expressed at the device level. DICE is a message-passing neural network (MPNN) trained through graph contrastive learning, and its pretraining process is simulation-free, incorporating two novel data augmentation techniques. Experimental results demonstrate that DICE achieves substantial performance gains across three downstream tasks, underscoring its effectiveness for both analog and digital circuits.
Problem

Research questions and friction points this paper is trying to address.

Self-supervised graph representation learning
Device-level integrated circuits
Analog and mixed-signal circuits
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-supervised graph neural network
Message-passing neural network
Graph contrastive learning
Sungyoung Lee
Department of Electrical and Computer Engineering, The University of Texas at Austin
Ziyi Wang
Department of Computer Science and Engineering, The Chinese University of Hong Kong
Seunggeun Kim
Department of Electrical and Computer Engineering, The University of Texas at Austin
Taekyun Lee
The University of Texas at Austin
Deep Learning, Generative Model, Wireless Communication
David Z. Pan
Professor, Silicon Labs Endowed Chair, ECE Dept., University of Texas at Austin
Electronic Design Automation, Design for Manufacturing, VLSI, Hardware, Machine Learning