Hierarchical Zeroth-Order Optimization for Deep Neural Networks

📅 2026-02-11
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work proposes Hierarchical Zeroth-Order Optimization (HZO), a novel approach that overcomes the poor scalability of conventional zeroth-order methods, whose query complexity scales as $O(ML^2)$, when applied to deep neural networks. By introducing a divide-and-conquer strategy along the network depth, HZO departs from the standard layer-wise gradient propagation paradigm and reduces the query complexity to $O(ML \log L)$. The method combines hierarchical decomposition, a rigorous error analysis, and Lipschitz constant control to ensure numerical stability, particularly in the near-unitary regime. Empirical evaluations on CIFAR-10 and ImageNet show that HZO achieves accuracy comparable to backpropagation, substantially improving the scalability and practicality of zeroth-order optimization for deep models.

๐Ÿ“ Abstract
Zeroth-order (ZO) optimization has long been favored for its biological plausibility and its capacity to handle non-differentiable objectives, yet its computational complexity has historically limited its application in deep neural networks. Challenging the conventional paradigm that gradients propagate layer-by-layer, we propose Hierarchical Zeroth-Order (HZO) optimization, a novel divide-and-conquer strategy that decomposes the depth dimension of the network. We prove that HZO reduces the query complexity from $O(ML^2)$ to $O(ML \log L)$ for a network of width $M$ and depth $L$, representing a significant leap over existing ZO methodologies. Furthermore, we provide a detailed error analysis showing that HZO maintains numerical stability by operating near the unitary limit ($L_{lip} \approx 1$). Extensive evaluations on CIFAR-10 and ImageNet demonstrate that HZO achieves competitive accuracy compared to backpropagation.
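The core idea in the abstract, a divide-and-conquer decomposition of the depth dimension with per-layer zeroth-order gradient estimates, can be sketched in a few lines. The following is a minimal, hypothetical NumPy toy: it assumes a standard two-point ZO estimator and a simple recursive split over the list of layer parameters. The estimator choice, the recursion, and the query schedule are all assumptions; the paper's actual algorithm (and the schedule achieving $O(ML \log L)$ queries) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def zo_grad(loss, params, i, eps=1e-3, n=8):
    # Two-point zeroth-order gradient estimate for layer i, with all other
    # layers held fixed. This is the generic ZO estimator, not necessarily
    # the one used in the paper.
    g = np.zeros_like(params[i])
    for _ in range(n):
        u = rng.standard_normal(params[i].shape)
        plus  = [p + eps * u if j == i else p for j, p in enumerate(params)]
        minus = [p - eps * u if j == i else p for j, p in enumerate(params)]
        g += (loss(plus) - loss(minus)) / (2 * eps) * u
    return g / n

def hzo_update(loss, params, lo, hi, lr=0.05):
    # Hypothetical divide-and-conquer over depth: recursively split the
    # block of layers [lo, hi) and update each half. Only the recursive
    # structure mirrors the paper's idea.
    if hi - lo == 1:
        params[lo] = params[lo] - lr * zo_grad(loss, params, lo)
        return
    mid = (lo + hi) // 2
    hzo_update(loss, params, lo, mid, lr)
    hzo_update(loss, params, mid, hi, lr)

# Toy usage: a separable quadratic "loss" over three layer-parameter vectors.
params = [np.full(4, v) for v in (1.0, -0.5, 0.8)]
loss = lambda ps: float(sum((p ** 2).sum() for p in ps))
before = loss(params)
for _ in range(20):
    hzo_update(loss, params, 0, len(params))
after = loss(params)
```

On a convex toy like this, the recursion steadily drives the loss down using only function evaluations, which is the appeal of ZO methods for non-differentiable objectives noted in the abstract.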
Problem

Research questions and friction points this paper is trying to address.

Zeroth-order optimization
Deep neural networks
Computational complexity
Query complexity
Non-differentiable objectives
Innovation

Methods, ideas, or system contributions that make the work stand out.

Zeroth-order optimization
Hierarchical optimization
Query complexity reduction
Divide-and-conquer
Numerical stability
Sansheng Cao
College of Physics, Peking University
Zhengyu Ma
Pengcheng Laboratory
Neuroscience, Neural Network Dynamics, Computational Physics
Yonghong Tian
School of Computer Science, Peking University