🤖 AI Summary
To address performance degradation in multi-objective deep learning caused by gradient conflicts between primary and auxiliary tasks, this paper proposes Bloop, a gradient surgery method derived from a constrained bilevel formulation: the auxiliary objective is minimized over the set of minimizers of the training loss. At each step, Bloop combines the training-loss gradient with the component of the auxiliary gradient that is orthogonal to it, decoupling the update directions of the two tasks. Because mini-batch gradients are noisy, Bloop maintains an exponential moving average (EMA) of the training-loss gradients and projects against this averaged direction, preserving the critical orthogonality property and stabilizing training. The method is fully compatible with mini-batch stochastic gradient updates. Extensive experiments on NLP and vision multi-task benchmarks show that Bloop consistently outperforms existing gradient surgery approaches, achieving more stable convergence and better generalization.
📝 Abstract
Beyond minimizing a single training loss, many deep learning estimation pipelines rely on an auxiliary objective to quantify and encourage desirable properties of the model (e.g., performance on another dataset, robustness, agreement with a prior). Although the simplest approach to incorporating an auxiliary loss is to sum it with the training loss as a regularizer, recent works have shown that one can improve performance by blending the gradients beyond a simple sum; this is known as gradient surgery. We cast the problem as a constrained minimization problem where the auxiliary objective is minimized among the set of minimizers of the training loss. To solve this bilevel problem, we follow a parameter update direction that combines the training loss gradient and the orthogonal projection of the auxiliary gradient to the training gradient. In a setting where gradients come from mini-batches, we explain how, using a moving average of the training loss gradients, we can carefully maintain this critical orthogonality property. We demonstrate that our method, Bloop, can lead to much better performance on NLP and vision experiments than other gradient surgery methods without EMA.
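The update rule described in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration under stated assumptions: the function name, the auxiliary weight `lam`, the EMA decay `beta`, and the learning rate are illustrative choices, not the paper's exact implementation. It shows the two key ingredients: an EMA of the training-loss gradients, and the removal of the auxiliary gradient's component along that averaged direction before combining.

```python
import numpy as np

def bloop_update(params, g_train, g_aux, ema, lam=0.1, beta=0.9, lr=0.01, eps=1e-12):
    """One Bloop-style step (sketch, not the authors' code).

    params  : current parameter vector
    g_train : mini-batch gradient of the training loss
    g_aux   : mini-batch gradient of the auxiliary loss
    ema     : running EMA of past training-loss gradients
    """
    # Moving average of the training-loss gradients; under mini-batch noise
    # this is a more reliable direction to project against than g_train alone.
    ema = beta * ema + (1.0 - beta) * g_train
    # Orthogonal projection: strip the component of g_aux along the EMA
    # direction, so the auxiliary update does not fight the training descent.
    g_aux_perp = g_aux - (g_aux @ ema) / (ema @ ema + eps) * ema
    # Combined direction: training gradient plus the projected auxiliary one.
    direction = g_train + lam * g_aux_perp
    return params - lr * direction, ema
```

For example, with `g_train = [1, 0, 0]` and `g_aux = [1, 1, 0]`, the projection removes the conflicting first component of the auxiliary gradient, leaving only its orthogonal part `[0, 1, 0]` to be added with weight `lam`.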