🤖 AI Summary
To address the limited expressivity and gradient instability of static activation functions (e.g., ReLU) in deep neural networks, this work introduces a novel family of learnable activations constructed from orthogonal function bases (Hermite, Fourier) and tropical polynomials—unifying polynomial, trigonometric, and tropical algebraic forms within a differentiable framework for the first time. To ensure training stability in deep CNNs and Transformers, we propose a variance-preserving initialization scheme. All components are implemented natively in PyTorch and publicly released as the open-source library *torchortho*. Extensive experiments on ImageNet-1K image classification and OpenWebText language modeling demonstrate consistent improvements over GPT-2 and ConvNeXt, achieving higher accuracy and lower perplexity. These results validate the effectiveness, generality, and scalability of learnable activations across large-scale vision and language tasks.
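The core idea — an activation that is a learnable linear combination of orthogonal basis functions — can be sketched as a small PyTorch module. This is a hypothetical illustration using the probabilists' Hermite recurrence, not the actual torchortho API; the class name, degree, and initialization are assumptions.

```python
import torch
import torch.nn as nn

class HermiteActivation(nn.Module):
    """Hypothetical sketch of a learnable activation as a truncated Hermite
    series f(x) = sum_n c_n * He_n(x); torchortho's real API may differ."""

    def __init__(self, degree: int = 4):
        super().__init__()
        self.degree = degree
        # One learnable coefficient per basis function He_0 .. He_degree.
        self.coeffs = nn.Parameter(torch.zeros(degree + 1))
        with torch.no_grad():
            self.coeffs[1] = 1.0  # start near the identity: f(x) ≈ x

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Probabilists' Hermite recurrence: He_{n+1} = x*He_n - n*He_{n-1}
        h_prev, h_curr = torch.ones_like(x), x
        out = self.coeffs[0] * h_prev + self.coeffs[1] * h_curr
        for n in range(1, self.degree):
            h_prev, h_curr = h_curr, x * h_curr - n * h_prev
            out = out + self.coeffs[n + 1] * h_curr
        return out
```

Because every coefficient is an `nn.Parameter`, the shape of the nonlinearity is updated by the same optimizer step as the network weights, which is what makes the activation "learnable" rather than static.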
📝 Abstract
This paper investigates scalable neural networks with learnable activation functions based on orthogonal function bases and tropical polynomials, targeting ImageNet-1K classification and next-token prediction on OpenWebText. Traditional activations such as ReLU are static; learnable activations, in contrast, let the network adapt its nonlinearities dynamically during training. However, stability issues such as vanishing or exploding gradients arise from improper variance management in deeper networks. To remedy this, we propose an initialization scheme that by itself preserves unit variance in transformers and convolutional networks, ensuring stable gradient flow even in deep architectures. Extensive experiments demonstrate that networks with Hermite-, Fourier-, and Tropical-based learnable activations significantly improve over GPT-2 and ConvNeXt baselines in accuracy and perplexity on both training and test sets, highlighting the viability of learnable activations in large-scale tasks. The activation functions developed here are released as torchortho, a library implemented entirely in pure PyTorch, available at https://github.com/K-H-Ismail/torchortho.
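The abstract's variance-preserving initialization can be illustrated for the Hermite case. Under a standard Gaussian input, the probabilists' Hermite polynomials satisfy E[He_m(x)·He_n(x)] = n!·δ_mn, so rescaling coefficients such that Σ c_n² n! = 1 (with c_0 = 0) yields approximately unit output variance. The snippet below is a minimal numerical check of that identity, assuming this particular normalization; it is not the paper's actual initialization code.

```python
import torch

torch.manual_seed(0)

def hermite_features(x: torch.Tensor, degree: int) -> torch.Tensor:
    """Stack probabilists' Hermite polynomials He_0..He_degree at x."""
    feats = [torch.ones_like(x), x]
    for n in range(1, degree):
        feats.append(x * feats[-1] - n * feats[-2])  # He_{n+1} = x*He_n - n*He_{n-1}
    return torch.stack(feats[: degree + 1])

degree = 4
# Hypothetical variance-preserving init: random coefficients rescaled so that
# sum_n c_n^2 * n! = 1. Since E[He_m He_n] = n! * delta_mn for x ~ N(0, 1),
# this makes Var[f(x)] ≈ 1, keeping activation variance stable across layers.
c = torch.randn(degree + 1)
c[0] = 0.0  # drop the constant term so the output stays zero-mean
factorials = torch.tensor([1.0, 1.0, 2.0, 6.0, 24.0])  # 0!..4!
c = c / torch.sqrt((c**2 * factorials).sum())

x = torch.randn(1_000_000)
f_x = (c[:, None] * hermite_features(x, degree)).sum(dim=0)
print(f_x.var().item())  # empirically close to 1.0
```

Without such a rescaling, higher-degree coefficients inflate the output variance layer after layer, which is exactly the vanishing/exploding-gradient failure mode the abstract attributes to improper variance management.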