Learning with Incomplete Context: Linear Contextual Bandits with Pretrained Imputation

📅 2025-10-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper studies the online linear contextual bandit problem under partial context observability and proposes PULSE-UCB, the first algorithm to leverage surrogate features from large-scale pretrained models for missing-feature imputation. The method integrates pretrained-feature imputation, UCB-style online decision-making, and error analysis under a Hölder smoothness assumption. Theoretical contributions include: (1) a precise decomposition of the regret bound into a standard linear bandit term and an additional term governed by pretrained model quality; (2) attainment of a near-optimal regret upper bound under i.i.d. contexts, accompanied by a matching lower bound; and (3) quantification of how prediction uncertainty impacts decision performance, explicitly characterizing the auxiliary data scale required to improve downstream learning. Experiments demonstrate the efficacy of pretrained priors in nonstationary, partially observable settings.
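The loop described above (impute missing features with a pretrained model, then make a UCB-style choice on the completed context) can be sketched roughly as follows. This is a minimal illustration, not the paper's specification: the linear stand-in imputer `W_imputer`, the dimensions, and the exploration constant `alpha` are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a large pretrained model: a fixed linear map predicting
# the 2 missing feature dims from the 3 observed ones (an assumption).
W_imputer = rng.normal(size=(2, 3))

def impute(x_obs):
    """Complete a partially observed context with pretrained predictions."""
    return np.concatenate([x_obs, W_imputer @ x_obs])

d, K, T, alpha = 5, 4, 500, 1.0
theta_star = rng.normal(size=d)   # unknown reward parameter
A = np.eye(d)                     # ridge-regularized Gram matrix
b = np.zeros(d)

for t in range(T):
    # Each of K arms presents a partially observed context (3 of 5 dims).
    obs = rng.normal(size=(K, 3))
    contexts = np.stack([impute(x) for x in obs])

    theta_hat = np.linalg.solve(A, b)
    # UCB score on the imputed context: estimate + exploration bonus.
    A_inv = np.linalg.inv(A)
    bonus = np.sqrt(np.einsum('ij,jk,ik->i', contexts, A_inv, contexts))
    arm = int(np.argmax(contexts @ theta_hat + alpha * bonus))

    x = contexts[arm]
    reward = x @ theta_star + 0.1 * rng.normal()
    A += np.outer(x, x)
    b += reward * x
```

Note that any error in `impute` feeds directly into both the reward estimate and the exploration bonus, which is the mechanism behind the extra regret term governed by pretrained-model quality.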

📝 Abstract
The rise of large-scale pretrained models has made it feasible to generate predictive or synthetic features at low cost, raising the question of how to incorporate such surrogate predictions into downstream decision-making. We study this problem in the setting of online linear contextual bandits, where contexts may be complex, nonstationary, and only partially observed. In addition to bandit data, we assume access to an auxiliary dataset containing fully observed contexts, which is common in practice since such data are collected without adaptive interventions. We propose PULSE-UCB, an algorithm that leverages pretrained models trained on the auxiliary data to impute missing features during online decision-making. We establish regret guarantees that decompose into a standard bandit term plus an additional component reflecting pretrained model quality. In the i.i.d. context case with Hölder-smooth missing features, PULSE-UCB achieves near-optimal performance, supported by matching lower bounds. Our results quantify how uncertainty in predicted contexts affects decision quality and how much historical data is needed to improve downstream learning.
Problem

Research questions and friction points this paper is trying to address.

Incorporating pretrained model predictions into online decision-making systems
Addressing incomplete contextual information in linear bandit problems
Quantifying how imputation uncertainty impacts downstream learning performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Imputes missing features using pretrained models
Leverages auxiliary data for context completion
Provides regret guarantees with model quality decomposition
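The regret decomposition claimed above can be written schematically as follows; the notation is illustrative and does not reproduce the paper's exact bound:

```latex
% Schematic decomposition (illustrative symbols, not the paper's notation):
% d = context dimension, T = horizon, \varepsilon_{\mathrm{imp}} = imputation error
R(T) \;\lesssim\;
\underbrace{\tilde{O}\!\left(d\sqrt{T}\right)}_{\text{standard linear bandit term}}
\;+\;
\underbrace{C \cdot T \cdot \varepsilon_{\mathrm{imp}}}_{\text{pretrained-model quality term}}
```

Under this reading, the second term shrinks as the auxiliary dataset grows and the pretrained imputer improves, which is how the paper quantifies the historical data scale needed to help downstream learning.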
Hao Yan
Department of Statistics, University of Wisconsin-Madison

Heyan Zhang
Department of Statistics, University of Wisconsin-Madison

Yongyi Guo
Assistant Professor of Statistics, University of Wisconsin-Madison