Quantization vs Pruning: Insights from the Strong Lottery Ticket Hypothesis

📅 2025-08-14
🤖 AI Summary
Existing Strong Lottery Ticket Hypothesis (SLTH) theory is confined to continuous-weight settings and does not characterize the interplay between pruning and representational capacity in quantized neural networks, i.e., those with finite-precision weights. Method: leveraging results on the Number Partitioning Problem together with probabilistic and combinatorial analysis, the paper extends SLTH to the quantized setting; pruning is modeled as the selection of a random subnetwork, and the partition bounds of Borgs et al. are used to rigorously establish exact representability of discrete-weight networks. Contribution: the paper proves that any target quantized network, including binary networks, can be exactly realized by pruning a sufficiently over-parameterized random network, and derives an optimal upper bound on the required over-parameterization. This establishes the first theoretical framework for a quantized SLTH, revealing the intrinsic expressive power of low-precision networks under discrete weight constraints.
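The Random Subset Sum intuition behind this result can be illustrated with a toy simulation. This is an illustrative sketch only, not the paper's construction or proof technique: it simply brute-forces, on a small integer grid standing in for quantized weights, how often a random target is *exactly* a subset sum of a modest pool of random quantized values (all sizes here are arbitrary choices).

```python
import random

def achievable_sums(samples):
    """All values exactly representable as a subset sum of `samples`."""
    sums = {0}  # the empty subset
    for s in samples:
        sums |= {t + s for t in sums}
    return sums

def trial(n, levels, rng):
    """One quantized Random Subset Sum trial on the grid {-levels, ..., levels}.

    Draw n random quantized values and one random target; report whether
    the target is exactly a subset sum (no approximation error allowed).
    """
    samples = [rng.randint(-levels, levels) for _ in range(n)]
    target = rng.randint(-levels, levels)
    return target in achievable_sums(samples)

rng = random.Random(0)
hits = sum(trial(n=12, levels=16, rng=rng) for _ in range(200))
print(f"exact representation rate: {hits}/200")
```

Even a small pool (here 12 samples against a 33-level grid) hits the target exactly in nearly every trial, which is the discrete analogue of the approximation guarantees used in continuous SLTH proofs.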

📝 Abstract
Quantization is an essential technique for making neural networks more efficient, yet our theoretical understanding of it remains limited. Previous works demonstrated that extremely low-precision networks, such as binary networks, can be constructed by pruning large, randomly-initialized networks, and showed that the ratio between the size of the original and the pruned networks is at most polylogarithmic. The specific pruning method they employed inspired a line of theoretical work known as the Strong Lottery Ticket Hypothesis (SLTH), which leverages insights from the Random Subset Sum Problem. However, these results primarily address the continuous setting and cannot be applied to extend SLTH results to the quantized setting. In this work, we build on foundational results by Borgs et al. on the Number Partitioning Problem to derive new theoretical results for the Random Subset Sum Problem in a quantized setting. Using these results, we then extend the SLTH framework to finite-precision networks. While prior work on SLTH showed that pruning allows approximation of a certain class of neural networks, we demonstrate that, in the quantized setting, the analogous class of target discrete neural networks can be represented exactly, and we prove optimal bounds on the necessary overparameterization of the initial network as a function of the precision of the target network.
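As a rough numerical companion to the abstract's claim that the necessary over-parameterization grows with the precision of the target network, the following sketch (again purely illustrative; the grid sizes, seed, and trial counts are arbitrary assumptions, not the paper's bounds) measures how many random quantized samples are typically needed before a random target on a 2^b-level grid becomes an exact subset sum.

```python
import random

def min_samples_for_exact(target, levels, rng, n_max=40):
    """Grow a pool of random values from {-levels, ..., levels} until
    `target` is an exact subset sum; return the pool size that first
    succeeds (or n_max if none does, which is negligibly rare here)."""
    sums = {0}
    for n in range(1, n_max + 1):
        s = rng.randint(-levels, levels)
        sums |= {t + s for t in sums}  # extend the set of achievable subset sums
        if target in sums:
            return n
    return n_max

rng = random.Random(1)
results = {}
for bits in (2, 4, 6, 8):
    levels = 2 ** bits  # finer grid = higher-precision target weights
    need = [min_samples_for_exact(rng.randint(-levels, levels), levels, rng)
            for _ in range(50)]
    results[bits] = sum(need) / len(need)
    print(f"{bits}-bit grid: avg pool size for an exact hit ~ {results[bits]:.1f}")
```

The average pool size grows slowly as the grid gets finer, consistent with the polylogarithmic flavor of the over-parameterization bounds discussed in the abstract (though this toy experiment proves nothing by itself).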
Problem

Research questions and friction points this paper is trying to address.

Can the Strong Lottery Ticket Hypothesis be extended to quantized neural networks?
Can discrete neural networks be represented exactly, rather than only approximated, via pruning?
What is the optimal over-parameterization required as a function of the target network's precision?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends the SLTH framework to finite-precision (quantized) networks
Derives new quantized Random Subset Sum results from Borgs et al.'s Number Partitioning bounds
Proves exact representability and optimal over-parameterization bounds
Aakash Kumar
Department of Physical Sciences, IISER Kolkata, West Bengal, India 741246
Emanuele Natale
CNRS, Université Côte d'Azur, I3S, INRIA
Machine Learning · Neuroscience · Theoretical Computer Science