🤖 AI Summary
This work addresses the failure of Wormald's differential equation method when only an upper bound on the expected one-step change is available, with no matching lower bound or tight estimate, and establishes a theoretical framework for this one-sided setting. Methodologically, it combines martingale inequalities, coupling techniques, and the analysis of discrete dynamical systems to derive a general one-sided concentration theorem. Theoretically, it proves that an upper bound alone suffices to guarantee, with high probability, that the trajectory of the random process stays close to the solution of the associated deterministic differential equation; moreover, the standard Wormald method is recovered as a special case when tight two-sided estimates are assumed. Practically, this lowers the technical barrier to applying the method, providing a more flexible and robust analytical tool for settings that lack symmetric estimates, such as the analysis of greedy algorithms and random graph processes.
📝 Abstract
In this note, we formulate a "one-sided" version of Wormald's differential equation method. In the standard "two-sided" method, one is given a family of random variables which evolve over time and which satisfy some conditions, including a tight estimate of the expected change in each variable over one time step. These estimates for the expected one-step changes suggest that the variables ought to be close to the solution of a certain system of differential equations, and the standard method concludes that this is indeed the case. We give a result for the case where, instead of a tight estimate for each variable's expected one-step change, we have only an upper bound. Our proof is very simple, and is flexible enough that if we instead assume tight estimates on the variables, then we recover the conclusion of the standard differential equation method.
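As a toy illustration of the two-sided phenomenon the abstract describes (not the paper's actual proof or examples), consider a pure-death process on n items in which each step removes one item with probability X/n, so the expected one-step change is exactly -X/n. The differential equation method predicts that the rescaled trajectory X(tn)/n should track the solution of x'(t) = -x(t), namely x(t) = e^{-t}. The sketch below simulates this and compares the trajectory to the ODE solution; the process and all names here are hypothetical choices for the demonstration.

```python
import math
import random

def simulate_death_process(n, horizon=2.0, seed=0):
    """Pure-death process: at each step, X drops by 1 with probability X/n,
    so E[change in X] = -X/n.  The differential equation method predicts
    X(tn)/n ~ x(t) with x'(t) = -x(t), i.e. x(t) = exp(-t), w.h.p."""
    rng = random.Random(seed)
    X = n
    trajectory = [X]
    for _ in range(int(horizon * n)):  # run for time horizon * n steps
        if rng.random() < X / n:
            X -= 1
        trajectory.append(X)
    return trajectory

n = 100_000
traj = simulate_death_process(n)
# Compare the rescaled trajectory to the ODE solution at a few times.
for t in (0.5, 1.0, 2.0):
    print(f"t={t}: simulated {traj[int(t * n)] / n:.4f}, ODE {math.exp(-t):.4f}")
```

The simulated values agree with e^{-t} up to fluctuations of order n^{-1/2}, which is the kind of concentration the two-sided method guarantees; the paper's contribution is that an upper bound on the expected one-step change alone still yields a one-sided version of this conclusion.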