Orthogonium : A Unified, Efficient Library of Orthogonal and 1-Lipschitz Building Blocks

📅 2026-01-20
🤖 AI Summary
Existing implementations of orthogonal and 1-Lipschitz neural network layers are fragmented, functionally limited, and computationally expensive, hindering their adoption in robust deep learning. This work proposes the first unified and efficient PyTorch library that integrates a comprehensive suite of orthogonal and 1-Lipschitz building blocks, supporting standard convolutional operations—including striding, dilation, grouping, and transposed convolutions—while employing optimized parameterizations to rigorously enforce orthogonality and Lipschitz constraints. The library corrects critical errors present in prior implementations, substantially lowers the barrier to entry, enhances training stability on large-scale benchmarks such as ImageNet, reduces computational overhead, and demonstrates correctness and reliability through systematic validation.
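The core property the library enforces can be illustrated independently of its API: an orthogonal weight matrix preserves the norm of its input exactly, which makes the corresponding linear layer 1-Lipschitz by construction. The sketch below (plain numpy, not Orthogonium code) demonstrates this with a QR-orthogonalized random matrix.

```python
import numpy as np

# An orthogonal matrix Q satisfies Q.T @ Q = I, so ||Q x|| = ||x||:
# the layer is exactly norm-preserving and hence 1-Lipschitz.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((64, 64)))  # orthogonalize a random matrix

x = rng.standard_normal(64)
y = Q @ x

print(np.allclose(np.linalg.norm(y), np.linalg.norm(x)))  # norms match
print(np.allclose(Q.T @ Q, np.eye(64)))                   # Q is orthogonal
```

Optimized parameterizations such as those in the library maintain this property throughout training, rather than re-orthogonalizing after the fact.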

📝 Abstract
Orthogonal and 1-Lipschitz neural network layers are essential building blocks in robust deep learning architectures, crucial for certified adversarial robustness, stable generative models, and reliable recurrent networks. Despite significant advancements, existing implementations remain fragmented, limited, and computationally demanding. To address these issues, we introduce Orthogonium, a unified, efficient, and comprehensive PyTorch library providing orthogonal and 1-Lipschitz layers. Orthogonium provides access to standard convolution features, including support for strides, dilation, grouping, and transposed convolutions, while maintaining strict mathematical guarantees. Its optimized implementations reduce overhead on large-scale benchmarks such as ImageNet. Moreover, rigorous testing within the library has uncovered critical errors in existing implementations, emphasizing the importance of standardized and reliable tools. Orthogonium thus significantly lowers adoption barriers, enabling scalable experimentation and integration across diverse applications requiring orthogonality and robust Lipschitz constraints. Orthogonium is available at https://github.com/deel-ai/orthogonium.
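The link between 1-Lipschitz layers and certified adversarial robustness mentioned in the abstract follows a standard argument: if a classifier is 1-Lipschitz in the L2 sense, the difference of any two logits is at most sqrt(2)-Lipschitz, so no perturbation smaller than the logit margin divided by sqrt(2) can flip the prediction. A minimal sketch (the logit values are illustrative, not taken from the paper):

```python
import numpy as np

# Certified L2 radius for a 1-Lipschitz classifier: a prediction cannot
# change under perturbations smaller than (top1 - top2) / (sqrt(2) * L).
def certified_radius(logits: np.ndarray, lipschitz_const: float = 1.0) -> float:
    top2 = np.sort(logits)[-2:]   # the two largest logits
    margin = top2[1] - top2[0]    # top-1 minus top-2
    return margin / (np.sqrt(2.0) * lipschitz_const)

logits = np.array([3.1, 0.4, 1.9])  # hypothetical network output
print(round(certified_radius(logits), 4))  # margin 1.2 -> radius ~0.8485
```

This is why enforcing the Lipschitz constraint exactly, rather than approximately, matters: the certificate is only valid if the constant is a true upper bound.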
Problem

Research questions and friction points this paper is trying to address.

orthogonal layers
1-Lipschitz
robust deep learning
fragmented implementations
computational overhead
Innovation

Methods, ideas, or system contributions that make the work stand out.

orthogonal layers
1-Lipschitz networks
robust deep learning
efficient implementation
PyTorch library
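One common ingredient behind efficient Lipschitz-constrained implementations is power iteration, which estimates a weight matrix's spectral norm (its Lipschitz constant as a linear map) far more cheaply than a full SVD. A generic numpy sketch of the technique (not the library's own code):

```python
import numpy as np

# Power iteration on W.T @ W converges to the leading right singular
# vector, so ||W u|| approaches the spectral norm of W.
def spectral_norm(W: np.ndarray, n_iter: int = 100) -> float:
    u = np.random.default_rng(0).standard_normal(W.shape[1])
    for _ in range(n_iter):
        v = W @ u
        u = W.T @ v
        u /= np.linalg.norm(u)
    return float(np.linalg.norm(W @ u))

W = np.diag([0.5, 2.0, 1.0])  # toy weight matrix with known singular values
print(np.isclose(spectral_norm(W), 2.0))  # largest singular value is 2.0
```

In training loops, one or two iterations per step with a persistent `u` vector usually suffice, which is what keeps the overhead low.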
Thibaut Boissin
IRT Saint-Exupéry
Neural Networks, Robustness, Computer Vision, Machine Learning
Franck Mamalet
Senior Expert in Artificial Intelligence, IRT St Exupery
Robustness, Optimal Transport, Neural Networks, Deep Learning, Image Processing
Valentin Lafargue
Institut de Recherche Technologique Saint-Exupéry, Toulouse, France; Artificial and Natural Intelligence Toulouse Institute, France; now at IMT, Toulouse, and INRIA, Bordeaux, France
Mathieu Serrurier
IRIT, Toulouse, France; Artificial and Natural Intelligence Toulouse Institute, France