🤖 AI Summary
This paper investigates the identification capacity of molecular mixture communication over a discrete affine Poisson channel—modeling receptor–ligand affinity-based recognition—under task-oriented scenarios (e.g., distinguishing only “danger” vs. “food” events). Using deterministic coding and information-theoretic analysis, the authors establish that the number of reliably identifiable mixtures grows super-exponentially in the rank $T$ of the affinity matrix, as $\sim 2^{(T \log T)R}$ with coding rate $R$, even when the number of receptor types grows sublinearly in the number of molecule types. They derive lower and upper bounds on the identification capacity in terms of structural features of the affinity matrix, as well as bounds on the rate $R$, unifying several existing capacity results under a single framework. The analysis quantifies the fundamental impact of affinity matrix structure on recognition capability. This work provides a rigorous capacity characterization for affinity-driven molecular identification, revealing how structural properties of biochemical interactions govern scalable, task-specific sensing performance.
📝 Abstract
Identification capacity has been established as a relevant performance metric for various goal-/task-oriented applications, where the receiver may be interested in only a particular message that represents an event or a task. For example, in olfactory molecular communications (MCs), odors or pheromones, which are often a mixture of various molecule types, may signal nearby danger, food, or a mate. In this paper, we examine the identification capacity with a deterministic encoder for the discrete affine Poisson channel, which can be used to model MC systems with molecule-counting receivers. We establish lower and upper bounds on the identification capacity in terms of features of the affinity matrix between the released molecules and the receptors at the receiver. As a key finding, we show that even when the number of receptor types scales sub-linearly in the number of molecule types $N,$ the number of reliably identifiable mixtures can grow super-exponentially with the rank of the affinity matrix, $T,$ i.e., $\sim 2^{(T \log T)R},$ where $R$ denotes the coding rate. We further derive lower and upper bounds on $R,$ and show that the proposed capacity theorem includes several known results in the literature as its special cases.
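The super-exponential scaling $\sim 2^{(T \log T)R}$ can be contrasted numerically with ordinary exponential scaling $2^{TR}$. The sketch below is purely illustrative: the base of the logarithm and the sample coding rate `R = 0.5` are assumptions for demonstration, not values from the paper.

```python
import math

def identifiable_mixtures(T: int, R: float) -> float:
    """Illustrative count of reliably identifiable mixtures,
    ~ 2^{(T log T) R}, with T the rank of the affinity matrix
    and R the coding rate (log base 2 assumed here)."""
    return 2 ** (T * math.log2(T) * R)

def exponential_baseline(T: int, R: float) -> float:
    """Ordinary exponential scaling 2^{T R}, for comparison."""
    return 2 ** (T * R)

R = 0.5  # hypothetical coding rate, chosen only for illustration
for T in (4, 8, 16, 32):
    print(f"T={T:2d}  super-exp={identifiable_mixtures(T, R):.3e}  "
          f"exp={exponential_baseline(T, R):.3e}")
```

Even at moderate rank, the $T \log T$ exponent dominates: at $T = 32$ the super-exponential count exceeds the plain exponential one by many orders of magnitude, which is the sense in which mixture identification scales far beyond classical transmission.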