On the monotonicity of discrete entropy for log-concave random vectors on $\mathbb{Z}^d$

📅 2024-01-27
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper investigates the discrete entropy monotonicity of partial sums of isotropic, log-concave, integer-valued random vectors on $\mathbb{Z}^d$. For i.i.d. $X_1,\dots,X_{n+1}$, it establishes a quantitative lower bound: $$H\!\left(\sum_{i=1}^{n+1} X_i\right) \geq H\!\left(\sum_{i=1}^{n} X_i\right) + \frac{d}{2}\log\frac{n+1}{n} + o(1),$$ where $o(1) = O\bigl(H(X_1)e^{-H(X_1)/d}\bigr)$. Methodologically, the proof combines convex-geometric analysis, integral estimates for log-concave functions, the construction of isotropic positions, and moment-approximation techniques. Its contributions include: (i) the first extension of discrete entropy monotonicity from one dimension to higher-dimensional lattices; (ii) the introduction of an “almost isotropic” condition that relaxes strict isotropy; (iii) a refined approximation between discrete and differential entropy; and (iv) a tight upper bound on the discrete isotropic constant. The results advance the understanding of entropy growth in discrete, high-dimensional log-concave settings.
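To make the bound concrete, below is a minimal numerical sanity check of the $d=1$ case using a geometric distribution, which is log-concave on $\mathbb{Z}$. The parameter `p`, the truncation level `N`, and the helper `entropy` are illustrative choices for this sketch, not taken from the paper.

```python
import numpy as np

def entropy(pmf):
    """Shannon entropy (in nats) of a probability mass function."""
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log(pmf))

p = 0.05                           # small p => large H(X_1), so the o(1) term is small
N = 5000                           # truncation of the geometric support {0, ..., N-1}
pmf = p * (1 - p) ** np.arange(N)  # geometric pmf, log-concave on the integers
pmf /= pmf.sum()                   # renormalize after truncation

conv = pmf.copy()                  # pmf of S_1 = X_1
for n in range(1, 5):
    prev = entropy(conv)
    conv = np.convolve(conv, pmf)  # pmf of S_{n+1}
    gap = entropy(conv) - prev
    bound = 0.5 * np.log((n + 1) / n)  # (d/2) log((n+1)/n) with d = 1
    print(f"n={n}: H(S_{n+1}) - H(S_{n}) = {gap:.4f}  vs  bound {bound:.4f}")
```

With these choices $H(X_1) \approx 4$ nats, so the correction term $O(H(X_1)e^{-H(X_1)})$ is small and each printed entropy gap should exceed the $\frac{1}{2}\log\frac{n+1}{n}$ benchmark.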

📝 Abstract
We prove the following type of discrete entropy monotonicity for sums of isotropic, log-concave, independent and identically distributed random vectors $X_1,\dots,X_{n+1}$ on $\mathbb{Z}^d$: $$ H(X_1+\cdots+X_{n+1}) \geq H(X_1+\cdots+X_{n}) + \frac{d}{2}\log\Bigl(\frac{n+1}{n}\Bigr) + o(1), $$ where $o(1)$ vanishes as $H(X_1) \to \infty$. Moreover, for the $o(1)$-term, we obtain a rate of convergence $O\Bigl(H(X_1)\,e^{-\frac{1}{d}H(X_1)}\Bigr)$, where the implied constants depend on $d$ and $n$. This generalizes to $\mathbb{Z}^d$ the one-dimensional result of the second named author (2023). As in dimension one, our strategy is to establish that the discrete entropy $H(X_1+\cdots+X_{n})$ is close to the differential (continuous) entropy $h(X_1+U_1+\cdots+X_{n}+U_{n})$, where $U_1,\dots,U_n$ are independent and identically distributed uniform random vectors on $[0,1]^d$, and to apply the theorem of Artstein, Ball, Barthe and Naor (2004) on the monotonicity of differential entropy. In fact, we show this result under more general assumptions than log-concavity, which are preserved up to constants under convolution. In order to show that log-concave distributions satisfy our assumptions in dimension $d\ge2$, more involved tools from convex geometry are needed, because a suitable position is required. We show that, for a log-concave function on $\mathbb{R}^d$ in isotropic position, its integral, barycenter and covariance matrix are close to their discrete counterparts. Moreover, in the log-concave case, we weaken the isotropicity assumption to what we call almost isotropicity. One of our technical tools is a discrete analogue of the upper bound on the isotropic constant of a log-concave function, which extends to dimensions $d\ge1$ a result of Bobkov, Marsiglietti and Melbourne (2022).
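For orientation, the bridge between the two entropies rests on a standard exact identity in the single-summand case (a routine computation, not specific to this paper): if $X$ takes values in $\mathbb{Z}^d$ and $U$ is uniform on $[0,1]^d$ and independent of $X$, then $X+U$ has the piecewise-constant density $f(x)=\mathbb{P}(X=\lfloor x\rfloor)$ (floor taken coordinatewise), whence $$ h(X+U) = -\int_{\mathbb{R}^d} f\log f = -\sum_{k\in\mathbb{Z}^d}\mathbb{P}(X=k)\log\mathbb{P}(X=k) = H(X). $$ For $n\ge 2$ summands the identity is only approximate, since $X_1+U_1+\cdots+X_n+U_n$ perturbs $X_1+\cdots+X_n$ by an $n$-fold sum of uniform vectors rather than a single cube, and controlling this gap under (almost) isotropic log-concavity is where the convex-geometric tools enter.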
Problem

Research questions and friction points this paper is trying to address.

Proving discrete entropy monotonicity for log-concave random vectors
Generalizing one-dimensional entropy results to higher dimensions
Establishing connections between discrete and differential entropy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proves discrete entropy monotonicity for log-concave vectors
Links discrete and differential entropy via uniform vectors
Extends one-dimensional results to higher dimensions
Matthieu Fradelizi
Univ Gustave Eiffel, Univ Paris Est Creteil, CNRS, LAMA UMR8050 F-77447 Marne-la-Vallée, France
Lampros Gavalakis
University of Cambridge
Information Theory · Probability
Martin Rapaport
Univ Gustave Eiffel, Univ Paris Est Creteil, CNRS, LAMA UMR8050 F-77447 Marne-la-Vallée, France