Sequential Adversarial Hypothesis Testing

📅 2024-07-07
🏛️ International Symposium on Information Theory
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper studies sequential adversarial binary hypothesis testing: each hypothesis corresponds to a closed convex set of distributions, and an adversary, aware of the observation history, dynamically selects the generating distribution from the corresponding set, while the detector uses variable-length sampling under a constraint on the expected sample size. Using tools from information theory, large deviations, and convex optimization, the paper gives the first characterization of the closure of the set of achievable error-exponent pairs (type-I and type-II) in this adversarial setting, i.e., the optimal error-exponent trade-off region under a given expected-sample-size constraint. This reveals the fundamental limits that adversarial distribution selection imposes on detection performance and establishes a theoretical benchmark and design principle for robust sequential detection.

📝 Abstract
We study the adversarial binary hypothesis testing problem [1] in the sequential setting. Associated with each hypothesis is a closed, convex set of distributions. Given the hypothesis, each observation is generated according to a distribution chosen (from the set associated with the hypothesis) by an adversary who has access to past observations. In the sequential setting, the number of observations the detector uses to arrive at a decision is variable; however, there is a constraint on the expected number of observations used. We characterize the closure of the set of achievable pairs of error exponents.
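To make the objects in the abstract concrete, one standard formalization of achievable error-exponent pairs under an expected-sample-size constraint is sketched below; the notation here is ours, not necessarily the paper's.

```latex
% Hypothesis H_i corresponds to a closed convex set \mathcal{P}_i of
% distributions, i \in \{0,1\}; under H_i, the adversary picks each
% observation's distribution from \mathcal{P}_i based on past observations.
% A sequential test is a stopping time \tau and decision \delta, subject to
% the expected-sample-size constraint \mathbb{E}[\tau] \le n. Writing the
% worst-case (over adversary strategies) error probabilities as
%   \alpha_n = \sup_{\text{adversary under } H_0} \Pr[\delta = 1], \quad
%   \beta_n  = \sup_{\text{adversary under } H_1} \Pr[\delta = 0],
% a pair (E_0, E_1) of error exponents is achievable if there exist tests with
\[
  \liminf_{n \to \infty} -\tfrac{1}{n} \log \alpha_n \;\ge\; E_0,
  \qquad
  \liminf_{n \to \infty} -\tfrac{1}{n} \log \beta_n \;\ge\; E_1 .
\]
% The paper characterizes the closure of the set of such pairs.
```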
Problem

Research questions and friction points this paper is trying to address.

Sequential adversarial binary hypothesis testing with convex distribution sets
Characterizing achievable error exponent pairs in sequential detection
Analyzing performance under observation and error probability constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adversarial hypothesis testing with sequential observations
Convex distribution sets chosen by adversary
Variable-length sampling under an expected-sample-size constraint to optimize error exponents
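As a toy illustration of the ingredients above (and not the paper's scheme): the sketch below runs an SPRT-style robust sequential test for two convex classes of Bernoulli distributions, given here as parameter intervals, accumulating a log-likelihood ratio computed against the least-favourable pair of interval endpoints. All numerical values and names are illustrative assumptions.

```python
import math
import random

# Hypothesis classes: intervals of Bernoulli parameters (convex sets).
# H0: p in [0.1, 0.2], H1: p in [0.7, 0.8] -- illustrative values only.
P0 = (0.1, 0.2)
P1 = (0.7, 0.8)

def robust_llr(x: int) -> float:
    """Robust log-likelihood ratio increment for one Bernoulli sample x,
    computed against the least-favourable pair: the closest endpoints
    of the two intervals."""
    p0, p1 = P0[1], P1[0]  # closest endpoints of the two classes
    q1 = p1 if x == 1 else 1 - p1
    q0 = p0 if x == 1 else 1 - p0
    return math.log(q1 / q0)

def sequential_test(sample, a: float = 5.0, b: float = -5.0,
                    max_n: int = 10_000):
    """SPRT-style stopping rule: stop as soon as the cumulative robust
    LLR leaves (b, a); returns (decision, number of samples used)."""
    s = 0.0
    for n in range(1, max_n + 1):
        s += robust_llr(sample())
        if s >= a:
            return 1, n  # decide H1
        if s <= b:
            return 0, n  # decide H0
    return (1 if s > 0 else 0), max_n

# Adversary under H0: plays the distribution in P0 closest to the H1
# class (the endpoint 0.2), trying to slow down the detector.
random.seed(0)
adversarial_h0 = lambda: 1 if random.random() < P0[1] else 0
decision, n = sequential_test(adversarial_h0)
print(f"decision = H{decision} after {n} samples")
```

The thresholds `a` and `b` trade off the two error exponents against the expected number of samples, which is the trade-off the paper characterizes in its full adversarial generality.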
Eeshan Modak
Tata Institute of Fundamental Research, Mumbai, India
Mayank Bakshi
Research Scientist, Arizona State University
Information Theory, Network Security, Network Coding, Sparse Recovery
B. Dey
Indian Institute of Technology Bombay, Mumbai, India
V. Prabhakaran
Tata Institute of Fundamental Research, Mumbai, India