🤖 AI Summary
This paper addresses the time-slotted task-allocation problem in a Markov machine setting, aiming to jointly minimize the Age of Job Completion (AoJC) and the sampling cost while ensuring stability of the $N$ user queues. Recognizing that conventional age metrics fail to capture the timeliness of task completion, the paper introduces AoJC as a new performance metric. Building on this, it proposes two state-aware stable scheduling policies that combine stochastic arrival modeling, Markov-chain analysis of the machine state, and Lyapunov drift-plus-penalty theory. Numerical experiments show that these policies reduce AoJC by an average of 23.6% relative to baselines, maintain system stability under tight sampling budgets, and outperform existing approaches in both age reduction and queue-stability guarantees.
📝 Abstract
We consider a time-slotted job-assignment system with a central server, N users, and a machine that changes its state according to a Markov chain (hence called a Markov machine). The users submit their jobs to the central server according to a stochastic job arrival process. For each user, the server maintains a dedicated job queue: upon receiving a job from a user, the server stores that job in the corresponding queue. When the machine is not working on a job assigned by the server, it is either in an internally busy state or in a free state, and the dynamics of these states follow a binary symmetric Markov chain. Upon sampling the state of the machine, if the server finds that the machine is free, it schedules a user and submits a job to the machine from that user's queue. To capture the number of jobs completed per unit time, we introduce a new metric, referred to as the age of job completion. To jointly minimize the age of job completion and the sampling cost, we propose two policies and numerically evaluate their performance. For both policies, we find sufficient conditions under which the job queues remain stable.
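To make the system model concrete, here is a minimal toy simulation of the setup described in the abstract. The machine's internal busy/free state evolves as a binary symmetric Markov chain with an assumed flip probability; arrivals are Bernoulli per user per slot; and, as a stand-in for the paper's two policies (which are not specified here), the server simply assigns a job from the longest queue whenever it samples the machine and finds it free. All parameter values and the longest-queue rule are illustrative assumptions, not the authors' actual policies.

```python
import random

def simulate(num_users=3, horizon=10_000, arrival_prob=0.1,
             flip_prob=0.3, service_prob=0.5, seed=0):
    """Toy simulation of the time-slotted job-assignment system.

    NOTE: the longest-queue scheduling rule below is a hypothetical
    placeholder for the paper's policies; parameters are assumptions.
    """
    rng = random.Random(seed)
    queues = [0] * num_users  # one dedicated job queue per user
    machine_free = True       # internal state when not serving an assigned job
    serving = False           # machine currently working on an assigned job
    completed = 0
    for _ in range(horizon):
        # stochastic job arrivals: each user submits a job w.p. arrival_prob
        for u in range(num_users):
            if rng.random() < arrival_prob:
                queues[u] += 1
        if serving:
            # an assigned job finishes in this slot w.p. service_prob
            if rng.random() < service_prob:
                completed += 1
                serving = False
        else:
            # internal state evolves as a binary symmetric Markov chain
            if rng.random() < flip_prob:
                machine_free = not machine_free
            # server samples the state; if free, schedule the longest queue
            if machine_free and max(queues) > 0:
                u = max(range(num_users), key=lambda i: queues[i])
                queues[u] -= 1
                serving = True
    return completed, queues
```

With arrival rate well below the machine's effective service capacity, the queues stay short over the horizon; raising `arrival_prob` or `flip_prob` lets one observe the stability boundary that the paper's sufficient conditions characterize analytically.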