🤖 AI Summary
This paper studies Bayesian information design in settings with a continuous state space and a discrete signal space, focusing on characterizing optimal signal structures under convex and S-shaped value functions. Methodologically, it combines convex analysis, Bayesian updating, and curvature-based analysis of the value function to build an analytical framework that quantitatively captures signal coarseness. The main contributions are threefold: (i) it establishes that under a convex value function the optimal information structure partitions the state space into intervals; (ii) it identifies a "dual expectations" property of this interval structure: each signal equals the conditional mean of its interval taken under the prior density, and each interior cutoff equals the conditional mean, weighted by the curvature of the value function, of the interval formed by the two neighboring signals; and (iii) it extends the analysis to general value functions and to coarse mechanism design, delivering a foundational structural theorem for information design and a computationally tractable modeling pathway.
📝 Abstract
We study an information design problem with a continuous state space and a discrete signal space. Under convex value functions, the optimal information structure is interval-partitional and exhibits a dual expectations property: each induced signal is the conditional mean (taken under the prior density) of its interval, and each interval cutoff is the conditional mean (taken under the curvature of the value function) of the interval formed by the neighboring signals. This property enables an examination of which parts of the state space are partitioned more finely. The analysis extends to general value functions and can be adapted to study coarse mechanism design.
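The dual expectations property suggests a natural fixed-point iteration, reminiscent of Lloyd's algorithm for quantization: alternately set each signal to the prior-conditional mean of its interval and each cutoff to the curvature-weighted mean of its two neighboring signals. The sketch below is illustrative only (it is not the paper's algorithm) and assumes a uniform prior on [0, 1] and the convex value function v(x) = x², whose curvature v'' = 2 is constant, so both conditional means reduce to midpoints; all function names are hypothetical.

```python
import numpy as np

def dual_expectation_partition(n_signals, n_iter=200, seed=0):
    """Illustrative Lloyd-style iteration for the dual expectations
    property, assuming a uniform prior on [0, 1] and v(x) = x**2
    (constant curvature). Returns (cutoffs, signals)."""
    # Start from an arbitrary interior cutoff grid on [0, 1].
    rng = np.random.default_rng(seed)
    interior = np.sort(rng.uniform(0.0, 1.0, n_signals - 1))
    cutoffs = np.concatenate(([0.0], interior, [1.0]))
    signals = 0.5 * (cutoffs[:-1] + cutoffs[1:])
    for _ in range(n_iter):
        # Signal = conditional mean of its interval under the prior
        # (uniform prior => midpoint of the interval).
        signals = 0.5 * (cutoffs[:-1] + cutoffs[1:])
        # Cutoff = curvature-weighted conditional mean of the interval
        # between neighboring signals (constant v'' => midpoint).
        cutoffs[1:-1] = 0.5 * (signals[:-1] + signals[1:])
    return cutoffs, signals

cutoffs, signals = dual_expectation_partition(4)
```

Under these symmetric assumptions the iteration converges to the equally spaced partition, so finer partitioning elsewhere in the state space would have to come from a non-uniform prior or non-constant curvature, which is exactly the comparative question the property is meant to address.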