Efficient Logistic Regression with Mixture of Sigmoids

📅 2026-04-03
🤖 AI Summary
This work addresses the challenges of high computational complexity and poor geometric adaptivity in online logistic regression under large-norm comparators. The authors propose an exponential weights algorithm with an isotropic Gaussian prior, which reduces the computational complexity from $O(B^{18}n^{37})$ to $O(B^3 n^5)$ while achieving the near-optimal non-asymptotic regret bound of $O(d \log(Bn))$. In the separable large-margin regime, the algorithm's prediction direction asymptotically converges to that of the hard-margin support vector machine, effectively implementing a voting mechanism over the solid angle of separating hyperplanes. Notably, the regret grows only logarithmically in the inverse margin, thereby unifying computational efficiency with geometric adaptivity.
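The EW predictor described above integrates the sigmoid prediction of each linear hypothesis against a Gaussian posterior over weights. A minimal Monte Carlo sketch (not the paper's actual efficient algorithm, and the function name, sampling scheme, and parameters are illustrative assumptions) approximates this by self-normalized importance sampling from the prior:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ew_predict(X_past, y_past, x_new, sigma=1.0, n_samples=10000, rng=None):
    """Monte Carlo sketch of an Exponential Weights prediction with an
    isotropic Gaussian prior N(0, sigma^2 I): sample weights from the prior,
    weight each sample by its logistic likelihood of the past labels, and
    average the resulting sigmoid predictions for the new point.
    This is an illustrative approximation, not the paper's algorithm."""
    rng = np.random.default_rng(rng)
    d = x_new.shape[0]
    W = rng.normal(0.0, sigma, size=(n_samples, d))      # samples from the prior
    margins = (X_past @ W.T) * y_past[:, None]           # y_i * <w, x_i> per sample
    log_lik = np.sum(np.log(sigmoid(margins)), axis=0)   # log-likelihood of past data
    weights = np.exp(log_lik - log_lik.max())            # numerically stable EW weights
    weights /= weights.sum()
    return float(weights @ sigmoid(W @ x_new))           # posterior-averaged probability
```

The self-normalization step is what makes this an exponential-weights average: each hypothesis votes with weight proportional to the exponential of its cumulative log loss.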
📝 Abstract
This paper studies the Exponential Weights (EW) algorithm with an isotropic Gaussian prior for online logistic regression. We show that the near-optimal worst-case regret bound $O(d\log(Bn))$ for EW, established by Kakade and Ng (2005) against the best linear predictor of norm at most $B$, can be achieved with total worst-case computational complexity $O(B^3 n^5)$. This substantially improves on the $O(B^{18}n^{37})$ complexity of prior work achieving the same guarantee (Foster et al., 2018). Beyond efficiency, we analyze the large-$B$ regime under linear separability: after rescaling by $B$, the EW posterior converges as $B\to\infty$ to a standard Gaussian truncated to the version cone. Accordingly, the predictor converges to a solid-angle vote over separating directions and, on every fixed-margin slice of this cone, the mode of the corresponding truncated Gaussian is aligned with the hard-margin SVM direction. Using this geometry, we derive non-asymptotic regret bounds showing that once $B$ exceeds a margin-dependent threshold, the regret becomes independent of $B$ and grows only logarithmically with the inverse margin. Overall, our results show that EW can be both computationally tractable and geometrically adaptive in online classification.
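The abstract's large-$B$ limit has a simple geometric reading: the predictor becomes a vote over the version cone, weighted by solid angle. A hedged sketch (the function name and rejection-sampling scheme are assumptions for illustration, not the paper's construction) samples directions uniformly on the sphere, keeps those that separate the data, and reports the fraction voting $+1$:

```python
import numpy as np

def solid_angle_vote(X, y, x_new, n_samples=20000, rng=None):
    """Illustrative sketch of the large-B limit: sample directions uniformly
    on the unit sphere (a normalized standard Gaussian), keep those inside
    the version cone (directions separating all labeled points), and return
    the fraction that classify x_new as +1 -- a solid-angle vote.
    Assumes the data are separable so the cone is nonempty; a rejection
    sampler like this is only practical when the cone's angle is not tiny."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.normal(size=(n_samples, d))
    W /= np.linalg.norm(W, axis=1, keepdims=True)         # uniform on the sphere
    in_cone = np.all((X @ W.T) * y[:, None] > 0, axis=0)  # y_i * <w, x_i> > 0 for all i
    votes = W[in_cone] @ x_new > 0                        # each surviving direction votes
    return float(votes.mean())                            # vote share for label +1
```

On a fixed-margin slice of the cone, the abstract notes that the mode of the truncated Gaussian aligns with the hard-margin SVM direction, which is why this vote inherits SVM-like behavior as $B\to\infty$.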
Problem

Research questions and challenges this paper addresses.

Online Logistic Regression
Exponential Weights Algorithm
Computational Complexity
Regret Bound
Linear Separability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Exponential Weights
Online Logistic Regression
Computational Complexity
Version Cone
Hard-Margin SVM