🤖 AI Summary
Deploying lightweight distributed workloads on resource-constrained ultra-micro edge devices (e.g., the Raspberry Pi Zero) without centralized data centers poses significant challenges in achieving low latency, low power consumption, and localized execution.
Method: This paper proposes Pico-Cloud, a micro-edge cloud architecture integrating lightweight container virtualization, decentralized service discovery, and minimalist orchestration to jointly optimize computation, networking, storage, and energy efficiency on edge hardware.
Contribution/Results: Evaluated on single-board computer clusters, Pico-Cloud sustains stable distributed cloud services with end-to-end latency under 50 ms and a greater than 60% reduction in power consumption. Unlike conventional edge cloud approaches, it is the first fully decentralized micro-cloud solution with a per-node hardware cost under $15 and support for autonomous offline operation. It establishes a scalable, low-cost, and green infrastructure paradigm for applications including rural connectivity, educational computing clusters, and edge AI inference.
📝 Abstract
This paper introduces the Pico-Cloud, a micro-edge cloud architecture built on ultra-minimal hardware platforms such as the Raspberry Pi Zero and comparable single-board computers. The Pico-Cloud delivers container-based virtualization, service discovery, and lightweight orchestration directly at the device layer, enabling local operation with low latency and low power consumption without reliance on centralized data centers. We present its architectural model, outline representative use cases including rural connectivity, educational clusters, and edge AI inference, and analyze design challenges in computation, networking, storage, and power management. The results highlight the Pico-Cloud as a cost-effective, decentralized, and sustainable platform for lightweight distributed workloads at the network edge.