Antonio Silveti-Falls
Google Scholar ID: t8GKm5kAAAAJ
CentraleSupélec, Université Paris-Saclay
Nonsmooth Optimization · Stochastic Optimization · Deep Learning
Citations & Impact (all-time)
  • Citations: 222
  • H-index: 7
  • i10-index: 7
  • Publications: 11
  • Co-authors: 12
Resume (English only)
Academic Achievements
  • Publications:
  • - 'Adaptive Conditional Gradient Descent', which proposes an improved backtracking line search for adaptive Frank-Wolfe and steepest descent methods.
  • - 'Generalized Gradient Norm Clipping & Non-Euclidean (L0,L1)-Smoothness' accepted at NeurIPS 2025 as an oral presentation.
  • - 'Training Large Neural Networks with Norm-Constrained Linear Minimization Oracles' accepted at ICML 2025 as a spotlight.
  • Projects: Scion, a state-of-the-art optimizer for neural networks.
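To illustrate the kind of technique behind the gradient clipping work listed above, here is a minimal sketch of standard global (Euclidean) gradient norm clipping. This is the textbook special case only, written generically; it is not taken from the paper's generalized non-Euclidean method or from the Scion optimizer, and the function name and parameters are illustrative.

```python
import numpy as np

def clip_grad_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their global L2 norm
    is at most max_norm; gradients below the threshold pass unchanged."""
    total = np.sqrt(sum(float((g ** 2).sum()) for g in grads))
    scale = min(1.0, max_norm / (total + 1e-12))  # small eps avoids division by zero
    return [g * scale for g in grads], total

# Toy example: two parameter groups with global norm sqrt(9 + 16 + 144) = 13
grads = [np.array([3.0, 4.0]), np.array([12.0])]
clipped, norm = clip_grad_norm(grads, max_norm=1.0)
```

After clipping, the direction of the stacked gradient vector is preserved while its global norm is capped at `max_norm`.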
Research Experience
  • Since September 2022, associate professor (maître de conférences) at CentraleSupélec/Université Paris-Saclay in the Centre pour la Vision Numérique laboratory. Also a member of the INRIA team OPIS (OPtImization for large Scale biomedical data) and a proud member of the Fédération de Mathématiques de CentraleSupélec.
Background
  • Research interests: The intersection of optimization theory and machine learning, with a focus on developing scalable algorithms for large-scale problems. Specializes in conditional gradient (Frank-Wolfe) methods and their applications, such as deep learning and training large neural networks. Broader interests include nonsmooth, stochastic, and non-Euclidean optimization, using Bregman divergences or relaxed notions of smoothness, in both convex and nonconvex settings. Also very interested in developing the theory of path-differentiable functions and conservative calculus to study automatic differentiation, especially for implicitly defined functions.
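The conditional gradient (Frank-Wolfe) methods mentioned above can be sketched in their simplest form: each iteration calls a linear minimization oracle (LMO) over the constraint set instead of projecting. A minimal sketch on the probability simplex, where the LMO just picks the vertex with the most negative gradient coordinate; the function names and step-size rule are the classic textbook choices, not any specific method from the publications listed here.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iter=500):
    """Conditional gradient (Frank-Wolfe) method on the probability simplex.

    The LMO over the simplex returns the vertex e_i with
    i = argmin_i grad(x)_i, so each iterate stays a convex
    combination of vertices (projection-free)."""
    x = x0.copy()
    for t in range(n_iter):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0          # LMO output: a simplex vertex
        gamma = 2.0 / (t + 2.0)        # classic open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy example: minimize f(x) = 0.5 * ||x - c||^2 over the simplex,
# where c itself lies in the simplex, so the minimizer is c.
c = np.array([0.1, 0.6, 0.3])
x_star = frank_wolfe_simplex(lambda x: x - c, np.ones(3) / 3)
```

Because every iterate is a convex combination of simplex vertices, feasibility is maintained for free, which is the key appeal of LMO-based methods when projection is expensive.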