🤖 AI Summary
Existing Lipschitz constant estimators for ReLU networks (e.g., ResNet, VGG, U-Net) lack scale invariance and architectural universality. To address this, we propose a novel path-metric-based parametric Lipschitz bound that is exactly invariant under the rescaling symmetries of the parameters and, to our knowledge, the first to apply uniformly across mainstream modern architectures. By analyzing the geometric structure and symmetry-invariant properties of the ReLU network parameter space, our method yields significantly tighter Lipschitz estimates than prior work, delivering more precise, architecture-agnostic robustness guarantees. The resulting bound provides theoretically grounded, practical upper limits for tasks including model generalization analysis, weight quantization, and neural network pruning.
📝 Abstract
Lipschitz bounds on neural network parameterizations are important to establish generalization, quantization, or pruning guarantees, as they control the robustness of the network with respect to parameter changes. Yet, there are few Lipschitz bounds with respect to parameters in the literature, and existing ones only apply to simple feedforward architectures, while also failing to capture the intrinsic rescaling symmetries of ReLU networks. This paper proves a new Lipschitz bound in terms of the so-called path-metrics of the parameters. Since this bound is intrinsically invariant with respect to the rescaling symmetries of the networks, it sharpens previously known Lipschitz bounds. It is also, to the best of our knowledge, the first bound of its kind that is broadly applicable to modern networks such as ResNets, VGGs, U-nets, and many more.
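The rescaling symmetry at the heart of the paper can be illustrated with a minimal two-layer sketch. Because ReLU is positively homogeneous (ReLU(d·z) = d·ReLU(z) for d > 0), scaling a hidden neuron's incoming weights by d and its outgoing weights by 1/d leaves the network function unchanged; a naive bound built from products of layer norms is not invariant under this reparameterization, whereas a path-norm-style quantity is. The tiny network and the simple sum-over-paths quantity below are illustrative assumptions, not the paper's exact path-metric:

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda z: np.maximum(z, 0)

# Toy two-layer ReLU network f(x) = W2 @ relu(W1 @ x)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)
f = lambda A, B, v: B @ relu(A @ v)

# Rescaling symmetry: scale hidden unit i's in-weights by d_i > 0
# and its out-weights by 1/d_i; positive homogeneity of ReLU keeps
# the network function identical.
d = np.array([0.1, 2.0, 5.0, 0.5])
W1s = np.diag(d) @ W1
W2s = W2 @ np.diag(1.0 / d)
assert np.allclose(f(W1, W2, x), f(W1s, W2s, x))

# Product of spectral norms (a common naive Lipschitz bound) is
# generally NOT invariant under the rescaling:
naive = lambda A, B: np.linalg.norm(B, 2) * np.linalg.norm(A, 2)
print("naive bound before/after:", naive(W1, W2), naive(W1s, W2s))

# A path-norm-style quantity (sum over input->output paths of the
# product of absolute weights) IS invariant: the d_i factors cancel
# along every path.
path = lambda A, B: (np.abs(B) @ np.abs(A)).sum()
assert np.isclose(path(W1, W2), path(W1s, W2s))
```

This cancellation along every path is exactly why a bound expressed in path-metrics can be tighter than layer-wise norm products: it assigns the same value to all rescaled versions of the same function.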