Characterization and Mitigation of ADC Noise by Reference Tuning in RRAM-Based Compute-In-Memory

📅 2025-02-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
In RRAM-based compute-in-memory (CIM), joint noise from analog-to-digital converters (ADCs) and RRAM devices degrades neural network accuracy, and the degradation is further exacerbated by read disturbance under high-voltage read operations. Method: This work establishes the first analytical ADC-RRAM joint noise model, based on measurements from a fabricated 40-nm chip; proposes a per-module dynamic reference voltage tuning mechanism and a low-voltage read scheme robust against read disturbance; and validates the approach end to end on both supervised learning and temporal reinforcement learning tasks. The methodology integrates fine-grained statistical modeling (fitting the mean and standard deviation of the high- and low-resistance states), reference voltage configuration optimization, low-voltage read circuit design, and CIM-aware evaluation of AI tasks. Results: Experiments demonstrate significant inference accuracy improvement, including a 37% reduction in reinforcement learning task error, with negligible increase in hardware overhead.

📝 Abstract
With the escalating demand for power-efficient neural network architectures, non-volatile compute-in-memory designs have garnered significant attention. However, owing to the nature of analog computation, susceptibility to noise remains a critical concern. This study confronts this challenge by introducing a detailed model that incorporates noise factors arising from both ADCs and RRAM devices. The experimental data is derived from a 40-nm foundry RRAM test-chip, wherein different reference voltage configurations are applied, each tailored to its respective module. The mean and standard deviation values of HRS and LRS cells are derived through a randomized vector, forming the foundation for noise simulation within our analytical framework. Additionally, the study examines the read-disturb effects, shedding light on the potential for accuracy deterioration in neural networks due to extended exposure to high-voltage stress. This phenomenon is mitigated through the proposed low-voltage read mode. Leveraging our derived comprehensive fault model from the RRAM test-chip, we evaluate CIM noise impact on both supervised learning (time-independent) and reinforcement learning (time-dependent) tasks, and demonstrate the effectiveness of reference tuning to mitigate noise impacts.
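The kind of statistical noise simulation the abstract describes (Gaussian-fitted HRS/LRS cell statistics feeding an analog accumulation that is then quantized against reference levels) can be sketched as follows. All numbers here, the conductance means/standard deviations, the ADC noise magnitude, and the reference spacing, are illustrative assumptions, not the paper's measured chip data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical state statistics (illustrative only, not the test-chip's values):
# mean/std conductance, arbitrary units, for low- and high-resistance states.
G_LRS_MEAN, G_LRS_STD = 40.0, 4.0
G_HRS_MEAN, G_HRS_STD = 2.0, 0.8

def noisy_mac(weights_binary, inputs, adc_sigma=0.5, refs=None):
    """Simulate one CIM column: binary weights stored as LRS/HRS cells,
    analog current summation along the bitline, then ADC quantization
    against a ladder of reference levels."""
    g = np.where(weights_binary == 1,
                 rng.normal(G_LRS_MEAN, G_LRS_STD, weights_binary.shape),
                 rng.normal(G_HRS_MEAN, G_HRS_STD, weights_binary.shape))
    current = inputs @ g                     # analog accumulation
    current += rng.normal(0.0, adc_sigma)    # lumped ADC input-referred noise
    if refs is None:
        # Evenly spaced references halfway between ideal MAC levels.
        refs = np.arange(0.5, len(weights_binary), 1.0) * G_LRS_MEAN
    return np.searchsorted(refs, current)    # quantized MAC code

w = rng.integers(0, 2, size=8)   # 8 binary weights in one column
x = rng.integers(0, 2, size=8)   # binary input vector
result = noisy_mac(w, x)
print("quantized MAC output:", result)
```

Shifting the `refs` array, or widening `adc_sigma` and the state spreads, shows how reference placement trades off against the joint ADC-RRAM noise in the digitized MAC output.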
Problem

Research questions and friction points this paper is trying to address.

Mitigate ADC noise in RRAM-based CIM
Analyze read-disturb effects on neural networks
Evaluate noise impact on learning tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

ADC noise mitigation model
RRAM reference voltage tuning
Low-voltage read mode
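The reference-tuning contribution can be illustrated with a toy sweep: given per-module readout distributions for the two resistance states, choose the sense reference that minimizes misreads. The Gaussian parameters and error metric below are assumptions for illustration, not the chip's measured characteristics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-module readout statistics (illustrative only).
hrs = rng.normal(2.0, 0.8, 10_000)   # high-resistance-state readouts
lrs = rng.normal(40.0, 4.0, 10_000)  # low-resistance-state readouts

def read_error(ref, hrs, lrs):
    """Fraction of cells misread: HRS at or above the reference,
    or LRS below it, averaged over both states."""
    return ((hrs >= ref).mean() + (lrs < ref).mean()) / 2

# Sweep candidate reference levels and keep the one with the fewest
# misreads, mimicking a per-module reference-tuning pass.
candidates = np.linspace(0.0, 50.0, 501)
errors = [read_error(r, hrs, lrs) for r in candidates]
best_ref = candidates[int(np.argmin(errors))]
print(f"tuned reference ~ {best_ref:.1f}, error = {min(errors):.4f}")
```

In a real chip the distributions drift with read disturbance and retention, so the same sweep would be repeated per module, which is the motivation for making the reference tuning dynamic and modular.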