A rescaling-invariant Lipschitz bound based on path-metrics for modern ReLU network parameterizations

📅 2024-05-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing Lipschitz bounds for ReLU networks with respect to their parameters are not scale-invariant and apply only to simple feedforward architectures. To address this, the paper proposes a Lipschitz bound based on path-metrics of the parameters that is exactly invariant under the rescaling symmetries of ReLU networks and, to the authors' knowledge, is the first such bound that applies uniformly to mainstream modern architectures (e.g., ResNet, VGG, U-Net). By exploiting the symmetry-invariant geometric structure of the ReLU parameter space, the bound sharpens previously known Lipschitz estimates, yielding tighter, architecture-agnostic robustness guarantees with applications to generalization analysis, weight quantization, and neural network pruning.

📝 Abstract
Lipschitz bounds on neural network parameterizations are important to establish generalization, quantization or pruning guarantees, as they control the robustness of the network with respect to parameter changes. Yet, there are few Lipschitz bounds with respect to parameters in the literature, and existing ones only apply to simple feedforward architectures, while also failing to capture the intrinsic rescaling-symmetries of ReLU networks. This paper proves a new Lipschitz bound in terms of the so-called path-metrics of the parameters. Since this bound is intrinsically invariant with respect to the rescaling symmetries of the networks, it sharpens previously known Lipschitz bounds. It is also, to the best of our knowledge, the first bound of its kind that is broadly applicable to modern networks such as ResNets, VGGs, U-nets, and many more.
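To make the rescaling symmetry concrete, here is a minimal sketch (not the paper's exact path-metric) of the classical ℓ1 path-norm of a two-layer ReLU network: the sum over all input-to-output paths of the product of absolute weights. Because ReLU(c·t) = c·ReLU(t) for c > 0, scaling a hidden neuron's incoming weights by c and its outgoing weights by 1/c leaves the network function unchanged, and the path-norm is invariant under exactly this symmetry. The function and matrix names below are illustrative.

```python
def path_norm(W1, W2):
    """l1 path-norm: sum over paths i -> j -> k of |W2[k][j]| * |W1[j][i]|."""
    return sum(
        abs(W2[k][j]) * abs(W1[j][i])
        for j in range(len(W1))        # hidden neurons
        for i in range(len(W1[0]))     # inputs
        for k in range(len(W2))        # outputs
    )

def rescale(W1, W2, j, c):
    """ReLU rescaling symmetry at hidden neuron j: the network function is
    unchanged because ReLU(c * t) = c * ReLU(t) for c > 0."""
    W1r = [[w * c if row == j else w for w in ws] for row, ws in enumerate(W1)]
    W2r = [[w / c if col == j else w for col, w in enumerate(ws)] for ws in W2]
    return W1r, W2r

W1 = [[1.0, -2.0], [0.5, 3.0]]   # hidden x input
W2 = [[2.0, -1.0]]               # output x hidden

# Rescale hidden neuron 0 by a factor of 10: every path through neuron 0
# picks up c * (1/c) = 1, so the path-norm is exactly preserved.
W1r, W2r = rescale(W1, W2, j=0, c=10.0)
assert abs(path_norm(W1, W2) - path_norm(W1r, W2r)) < 1e-9
```

Norm-based bounds such as products of layer operator norms change under this rescaling even though the network function does not, which is the looseness the paper's rescaling-invariant bound avoids.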
Problem

Research questions and friction points this paper is trying to address.

Lipschitz bounds
neural networks
parameter sensitivity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lipschitz bound
ReLU neural networks
scale-invariant path metric
Antoine Gonon
Post-doctoral Researcher, EPFL
Deep learning
Nicolas Brisebarre
CNRS, ENS de Lyon, Université Claude Bernard Lyon 1, Inria, LIP, UMR 5668, 69342, Lyon cedex 07, France
E. Riccietti
ENS de Lyon, CNRS, Université Claude Bernard Lyon 1, Inria, LIP, UMR 5668, 69342, Lyon cedex 07, France
Rémi Gribonval
Inria & ENS de Lyon
signal processing, machine learning, sparsity, inverse problems, dimension reduction