NeurIPS 2025: 'Sparse MeZO: Less Parameters for Better Performance in Zeroth-Order LLM Fine-Tuning'
ICML 2025: Two papers: 'SeedLoRA: A Fusion Approach to Efficient LLM Fine-Tuning' and 'MERIT: Maximum-normalized Element-wise Ratio for Language Model Large-batch Training'