🤖 AI Summary
This paper addresses the online no-regret caching problem under partial observability of request histories, a realistic constraint in edge caching. Existing algorithms assume the full request history is observable and incur high computational overhead, rendering them impractical for resource-constrained edge devices such as cellular base stations. To bridge this gap, we propose the first adaptation of the Follow-the-Perturbed-Leader (FPL) framework to the partially observable setting, yielding a lightweight randomized caching policy. Our approach employs delayed updates and a compact state representation to achieve an asymptotically optimal regret bound of $O(\sqrt{T})$ while maintaining $O(1)$ amortized time complexity per request. Extensive experiments on both synthetic and real-world mobility traces demonstrate that our policy significantly outperforms classical baselines, including LRU and LFU, combining strong theoretical guarantees with practical efficiency.
📝 Abstract
Online learning algorithms have been used successfully to design caching policies whose regret is sublinear in the total number of requests, without any statistical assumptions about the request sequence. However, most existing algorithms involve computationally expensive operations and require knowledge of all past requests, which may not be feasible in practical scenarios such as caching at a cellular base station. We therefore study the caching problem in a more restrictive setting where only a fraction of past requests is observed, and we propose a randomized caching policy with sublinear regret based on the classic online learning algorithm Follow-the-Perturbed-Leader (FPL). Our policy is the first to attain the asymptotically optimal regret bound while ensuring asymptotically constant amortized time complexity in this partial-observability setting. An experimental evaluation compares the proposed policy against classic caching policies and validates it on synthetic and real-world request traces.
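To make the FPL idea concrete, the following is a minimal sketch of an FPL-style caching policy under partial observability. All names (`FPLCache`, `obs_prob`, etc.) are illustrative, not from the paper: it keeps perturbed request counts and caches the top-k items, observing each request only with some probability and reweighting observed counts to keep the estimate unbiased. For simplicity it re-ranks eagerly on every observation, whereas the paper's policy uses delayed updates and a compact state to reach O(1) amortized time per request.

```python
import heapq
import random


class FPLCache:
    """Sketch of a Follow-the-Perturbed-Leader caching policy under
    partial observability of requests (illustrative, not the paper's
    exact algorithm)."""

    def __init__(self, catalog_size, cache_size, horizon, obs_prob=0.5, seed=0):
        self.rng = random.Random(seed)
        self.counts = [0.0] * catalog_size   # estimated request counts
        self.k = cache_size
        self.obs_prob = obs_prob             # fraction of requests observed
        # Perturbation scale ~ sqrt(horizon), the usual choice behind
        # O(sqrt(T)) regret bounds for FPL (an assumption here).
        scale = horizon ** 0.5
        self.perturb = [self.rng.gauss(0.0, scale)
                        for _ in range(catalog_size)]
        self._refresh()

    def _refresh(self):
        # Cache the k items with the largest perturbed counts.
        scores = ((self.counts[i] + self.perturb[i], i)
                  for i in range(len(self.counts)))
        self.cache = {i for _, i in heapq.nlargest(self.k, scores)}

    def request(self, item):
        hit = item in self.cache
        # Partial observability: only a fraction of requests is seen.
        if self.rng.random() < self.obs_prob:
            # Inverse-propensity weighting keeps the count estimate unbiased.
            self.counts[item] += 1.0 / self.obs_prob
            self._refresh()
        return hit
```

A typical evaluation loop would feed a (possibly Zipf-distributed) request trace into `request()` and report the resulting hit ratio against LRU and LFU baselines.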