Generalized Exponentiated Gradient Algorithms Using the Euler Two-Parameter Logarithm

📅 2025-02-21
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
This paper addresses the instability of returns and insufficient robustness in Online Portfolio Selection (OPS). To this end, the authors propose the Generalized Exponentiated Gradient (GEG) algorithm, which incorporates an Euler-type two-parameter logarithmic link function within the mirror descent framework. GEG employs the associated Bregman divergence as a regularizer and, uniquely among gradient-based OPS methods, integrates a learnable two-parameter deformation of the logarithmic/exponential function directly into the gradient update, enabling adaptivity to the data distribution and controllable optimization dynamics. Compared to conventional single-parameter entropy-regularized approaches, GEG extends the theoretical frontier of generalized entropy optimization, offering greater parametric flexibility and structural interpretability. Empirical evaluation demonstrates that GEG significantly improves cumulative return stability and robustness against market disturbances in OPS tasks, validating the efficacy of multi-parameter functional deformation in regulating online optimization dynamics.
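As a sketch of the update family described above (a minimal illustration: ln_{a,b} stands in for the paper's Euler two-parameter logarithm and exp_{a,b} for its approximate inverse; the specific functional forms are defined in the paper and not reproduced here), a mirror-descent step with a deformed-logarithm link function takes the form:

```latex
% Deformed exponentiated-gradient step: an additive update in the mirror
% (deformed-log) domain, mapped back by the deformed exponential and
% renormalized onto the simplex of portfolio weights.
w_{t+1,i} = \frac{\exp_{a,b}\!\left( \ln_{a,b}(w_{t,i}) - \eta \, \nabla_i L_t(\mathbf{w}_t) \right)}
                 {\sum_{j=1}^{N} \exp_{a,b}\!\left( \ln_{a,b}(w_{t,j}) - \eta \, \nabla_j L_t(\mathbf{w}_t) \right)}
```

With ln_{a,b} = ln and exp_{a,b} = exp, this reduces to the classical Exponentiated Gradient (EG) update.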

📝 Abstract
In this paper we propose and investigate a new class of Generalized Exponentiated Gradient (GEG) algorithms within the Mirror Descent (MD) framework, using as a regularization function the Bregman divergence generated by a two-parameter deformation of the logarithm as the link function. This link function (referred to as the Euler logarithm) is associated with a wide class of generalized entropies. In order to derive novel GEG/MD updates, we estimate a generalized exponential function that closely approximates the inverse of the Euler two-parameter logarithm. The shape and properties of the Euler logarithm and of its inverse, the deformed exponential function, are tuned by two or more hyperparameters. By learning these hyperparameters, the algorithms can adapt to the distribution of the training data, and the hyperparameters can be adjusted to achieve desired properties of the gradient descent updates. The concept of generalized entropies and the associated deformed logarithms provides deeper insight into these novel updates. More than fifty mathematically well-defined entropic functionals and associated deformed logarithms exist in the literature, so it is impossible to investigate all of them in a single paper; we therefore focus on a wide class of trace-form entropies and their associated generalized logarithms. We apply the developed algorithms to Online Portfolio Selection (OPS) in order to improve its performance and robustness.
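Below is a minimal NumPy sketch of one GEG-style update for portfolio weights. It is illustrative only: `deformed_log` uses the Sharma-Taneja-Mittal/Borges-Roditi two-parameter logarithm (x^a - x^b)/(a - b) as a stand-in for the paper's Euler logarithm, the inverse is computed by generic bisection rather than the closed-form generalized exponential approximation the authors derive, and the function names and the default values of `eta`, `a`, and `b` are hypothetical.

```python
import numpy as np

def deformed_log(x, a, b):
    """Two-parameter deformed logarithm (illustrative stand-in for the
    Euler logarithm); requires a != b, tends to ln(x) as a, b -> 0."""
    return (x**a - x**b) / (a - b)

def deformed_exp(y, a, b, iters=60):
    """Invert deformed_log numerically by bisection in log-space (the
    paper instead uses a closed-form generalized exponential that
    approximates this inverse)."""
    lo = np.full_like(y, 1e-12, dtype=float)
    hi = np.full_like(y, 1e12, dtype=float)
    for _ in range(iters):
        mid = np.sqrt(lo * hi)                 # geometric midpoint
        below = deformed_log(mid, a, b) < y    # root lies above mid?
        lo = np.where(below, mid, lo)
        hi = np.where(below, hi, mid)
    return np.sqrt(lo * hi)

def geg_update(w, x, eta=0.05, a=0.3, b=-0.3):
    """One GEG-style step: w are current portfolio weights (on the
    simplex), x is the vector of price relatives for the period."""
    grad = -x / np.dot(w, x)                   # gradient of -log(w . x)
    z = deformed_exp(deformed_log(w, a, b) - eta * grad, a, b)
    return z / z.sum()                         # renormalize onto the simplex

# Example: one update step starting from the uniform portfolio.
w0 = np.ones(4) / 4
w1 = geg_update(w0, np.array([1.02, 0.99, 1.05, 0.97]))
```

Bisection is used here only to keep the sketch self-contained; any monotone two-parameter deformed logarithm can be plugged in, and the pair (a, b) plays the role of the learnable deformation hyperparameters discussed in the abstract.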
Problem

Research questions and friction points this paper is trying to address.

Develop Generalized Exponentiated Gradient algorithms
Use Euler two-parameter logarithm
Enhance Online Portfolio Selection performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Euler logarithm as link function
Bregman divergence for regularization (see the formula after this list)
Hyperparameter tuning for gradient descent
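For reference, the regularizer named above is the Bregman divergence generated by a differentiable, strictly convex function F; in GEG, F is built from a generalized trace-form entropy (the exact construction is given in the paper):

```latex
% Bregman divergence generated by a strictly convex function F.
% With F the negative Shannon entropy this becomes the relative
% (KL) entropy, recovering classical EG; deformed entropies yield
% the generalized GEG updates.
D_F(\mathbf{w} \,\|\, \mathbf{v}) = F(\mathbf{w}) - F(\mathbf{v}) - \langle \nabla F(\mathbf{v}),\; \mathbf{w} - \mathbf{v} \rangle
```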