MDM: Manhattan Distance Mapping of DNN Weights for Parasitic-Resistance-Resilient Memristive Crossbars

📅 2025-11-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Memristor-based bit-sliced compute-in-memory (CIM) suffers from parasitic resistance (PR) non-idealities, necessitating matrix partitioning across small crossbars—resulting in frequent analog-to-digital conversions, high I/O overhead, increased latency, and area inefficiency. To address this, we propose Manhattan Distance Mapping (MDM): a lightweight, hardware-transparent spatial-aware optimization that exploits bit-level structural sparsity and performs row-wise Manhattan distance reordering to steer highly active weight elements toward crossbar regions less affected by PR. Our method integrates post-training weight mapping, bit-sliced weight storage, activation flow optimization, and row-order reconfiguration. Evaluated on ImageNet-1k, MDM improves average top-1 accuracy by 3.6% across ResNet variants and reduces the non-ideality factor by up to 46%, significantly mitigating analog-domain distortion without hardware modification.

📝 Abstract
Manhattan Distance Mapping (MDM) is a post-training deep neural network (DNN) weight mapping technique for memristive bit-sliced compute-in-memory (CIM) crossbars that reduces parasitic resistance (PR) nonidealities. PR limits crossbar efficiency: large DNN matrices must be partitioned into small crossbar tiles, reducing CIM-based speedup. Each crossbar executes one tile and requires digital synchronization before the next layer. At this granularity, designers either deploy many small crossbars in parallel or reuse a few sequentially; both options increase analog-to-digital conversions, latency, I/O pressure, and chip area. MDM alleviates PR effects by optimizing active-memristor placement. Exploiting bit-level structured sparsity, it feeds activations from the denser low-order side and reorders rows by Manhattan distance, relocating active cells toward regions less affected by PR and thus lowering the nonideality factor (NF). Applied to DNN models on ImageNet-1k, MDM reduces NF by up to 46% and improves accuracy under analog distortion by an average of 3.6% in ResNets. Overall, it provides a lightweight, spatially informed method for scaling CIM DNN accelerators.
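The core idea of the row reordering can be sketched in a few lines. This is a minimal illustrative interpretation, not the paper's implementation: it assumes the crossbar origin (row 0, column 0, near the drivers) is the region least affected by PR, scores each row by its count of active (nonzero) cells in a bit slice, and moves the most active rows toward the origin. Under the Manhattan metric, the summed distance of active cells from the origin decomposes into a column part (invariant under row permutation) plus a row part, which the rearrangement inequality minimizes by pairing small row indices with high activity. The function names (`mdm_reorder`, `total_manhattan`) are hypothetical.

```python
import numpy as np

def total_manhattan(bit_slice):
    """Summed Manhattan distance of all active cells from the origin (0, 0),
    used here as a proxy for PR-induced distortion (an assumption)."""
    rows, cols = np.nonzero(bit_slice)
    return int((rows + cols).sum())

def mdm_reorder(bit_slice):
    """Reorder rows so the rows with the most active cells sit closest to
    the origin, minimizing the row component of the total Manhattan
    distance. Returns the reordered slice and the row permutation."""
    activity = (bit_slice != 0).sum(axis=1)        # active cells per row
    order = np.argsort(-activity, kind="stable")   # most active rows first
    return bit_slice[order], order
```

A quick check on a toy 3x3 bit slice: row activities (1, 3, 2) yield the permutation (1, 2, 0), and the total Manhattan distance drops from 15 to 12. A real mapper would also have to emit this permutation so the digital periphery can un-permute partial sums after the analog dot product.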
Problem

Research questions and friction points this paper is trying to address.

Reduces parasitic resistance effects in memristive crossbars for DNN acceleration
Optimizes memristor placement to mitigate analog nonidealities in compute-in-memory systems
Improves neural network accuracy under analog distortion through spatial weight mapping
Innovation

Methods, ideas, or system contributions that make the work stand out.

Optimizes memristor placement to reduce parasitic resistance
Uses Manhattan distance to relocate active cells
Exploits bit-level sparsity for improved analog accuracy