Wireless Memory Approximation for Energy-efficient Task-specific IoT Data Retrieval

📅 2025-10-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Dynamic Random-Access Memory (DRAM) refresh during standby incurs substantial energy overhead in Internet-of-Things (IoT) devices, particularly under machine learning (ML) inference workloads. Method: This paper proposes Wireless Memory Activation and Approximation (WMAA), a novel DRAM management scheme integrating task-correlation analysis, temporal access pattern prediction, and wireless-signal-triggered activation. WMAA dynamically identifies model parameter access patterns and transitions DRAM to a low-power approximate state during inactive periods, thereby suppressing unnecessary refresh operations. Contribution/Results: To the best of our knowledge, this is the first DRAM approximation framework tailored for ML inference that is wireless-driven, time-aware, and task-adaptive. Experimental evaluation demonstrates that, while preserving inference accuracy, WMAA reduces standby energy consumption by up to 62.3% compared to conventional always-on memory schemes. The approach establishes a new low-power memory optimization paradigm for edge AI systems.

📝 Abstract
The use of Dynamic Random Access Memory (DRAM) for storing Machine Learning (ML) models plays a critical role in accelerating ML inference tasks in the next generation of communication systems. However, periodic DRAM refresh wastes energy during standby periods, which is significant for resource-constrained Internet of Things (IoT) devices. To address this problem, this work proposes two novel approaches: 1) wireless memory activation and 2) wireless memory approximation. These enable wireless devices to manage the available memory efficiently by considering the timing and relevance of ML model usage, thereby reducing overall energy consumption. Numerical results show that the proposed scheme achieves lower energy consumption than the always-on approach while satisfying the retrieval accuracy constraint.
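The abstract's core mechanism can be sketched as a small state machine: DRAM holding model parameters is demoted to an approximate low-refresh state once the idle time exceeds a threshold, and a wireless signal re-activates full refresh ahead of the next inference task. This is an illustrative sketch only; the class and method names (`WMAAController`, `on_wireless_trigger`, the millisecond threshold) are assumptions, not the paper's actual design.

```python
from enum import Enum


class MemState(Enum):
    ACTIVE = "active"            # full refresh rate, exact contents
    APPROXIMATE = "approximate"  # reduced refresh, tolerable bit decay


class WMAAController:
    """Hypothetical sketch of wireless memory activation/approximation:
    demote DRAM after a long idle window, restore it on a wireless trigger."""

    def __init__(self, idle_threshold_ms: float):
        self.idle_threshold_ms = idle_threshold_ms
        self.state = MemState.ACTIVE
        self.last_access_ms = 0.0

    def on_access(self, now_ms: float) -> None:
        # An inference task touched the model parameters: stay/return active.
        self.last_access_ms = now_ms
        self.state = MemState.ACTIVE

    def tick(self, now_ms: float) -> None:
        # Periodic check: demote once the idle time exceeds the threshold,
        # suppressing most refresh operations during the inactive period.
        if (self.state is MemState.ACTIVE
                and now_ms - self.last_access_ms > self.idle_threshold_ms):
            self.state = MemState.APPROXIMATE

    def on_wireless_trigger(self, now_ms: float) -> None:
        # A wireless signal announces an imminent task: restore full refresh
        # before the parameters are actually read.
        self.state = MemState.ACTIVE
        self.last_access_ms = now_ms
```

For example, with a 100 ms threshold, a controller that saw its last access at t=0 demotes itself at t=150 and is restored by a trigger at t=200, before the next read.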
Problem

Research questions and friction points this paper is trying to address.

Reducing DRAM refresh energy waste in IoT devices
Optimizing memory usage for ML inference efficiency
Balancing energy consumption with retrieval accuracy constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wireless memory activation for energy management
Wireless memory approximation for relevance consideration
Reduces energy use while maintaining accuracy
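The energy-vs-accuracy trade-off above can be illustrated with a first-order refresh-energy model: standby savings depend on the fraction of standby time spent in the approximate state and the relative refresh power drawn there. Both parameters below are assumed illustrative values, not the paper's measurements.

```python
def refresh_savings(approx_fraction: float, approx_power_ratio: float) -> float:
    """Fractional standby refresh-energy savings versus an always-on baseline.

    approx_fraction: share of standby time spent in the approximate state
                     (assumed; driven by idle-window prediction accuracy).
    approx_power_ratio: refresh power in the approximate state relative to
                        full refresh (assumed; set by tolerable bit decay).
    """
    # Baseline draws full refresh power for the entire standby period.
    baseline = 1.0
    # WMAA-style scheme: full power while active, scaled power while approximate.
    wmaa = (1.0 - approx_fraction) * 1.0 + approx_fraction * approx_power_ratio
    return 1.0 - wmaa / baseline


# E.g. spending 80% of standby in an approximate state that draws 20% of
# full refresh power yields 64% standby refresh-energy savings.
print(refresh_savings(0.8, 0.2))  # → 0.64
```

Pushing `approx_power_ratio` lower saves more energy but risks more bit decay, which is exactly the retrieval-accuracy constraint the numerical results are evaluated against.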