Yuki Takezawa
Scholar

Google Scholar ID: eaKQb8IAAAAJ
Kyoto University / OIST
Machine Learning · Optimization · Optimal Transport
Citations & Impact (all-time)
  • Citations: 184
  • H-index: 8
  • i10-index: 8
  • Publications: 20
  • Co-authors: 23
Academic Achievements
  • Conference Papers:
    - “Any-stepsize Gradient Descent for Separable Data under Fenchel–Young Losses”, NeurIPS 2025
    - “Revisiting 1-peer Exponential Graph for Enhancing Decentralized Learning Efficiency”, NeurIPS 2025
    - “Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization”, ICML 2025
    - “Scalable Decentralized Learning with Teleportation”, ICLR 2025
    - “PhiNets: Brain-inspired Non-contrastive Learning Based on Temporal Prediction Hypothesis”, ICLR 2025
    - “Parameter-free Clipped Gradient Descent Meets Polyak”, NeurIPS 2024
    - “Large-scale Similarity Search with Optimal Transport”, EMNLP 2023
    - “Beyond Exponential Graph: Communication-Efficient Topologies for Decentralized Learning via Finite-time Convergence”, NeurIPS 2023
    - “Improving the Robustness to Variations of Objects and Instructions with A Neuro-Symbolic Approach for Interactive Instruction Following”, ICMM 2023
    - “Fixed Support Tree-Sliced Wasserstein Barycenter”, AISTATS 2022
    - “Supervised Tree-Wasserstein Distance”, ICML 2021
  • Journal Papers:
    - “A Bias Correction Mechanism for Distributed Asynchronous Optimization”, Transactions on Machine Learning Research 2025
    - “Necessary and Sufficient Watermark for Large Language Models”, Transactions on Machine Learning Research 2025
    - “An Empirical Study of Simplicial Representation Learning with Wasserstein Distance”, Entropy 2024
    - “A Localized Primal-Dual Method for Centralized/Decentralized Federated Learning Robust to Data Heterogeneity”, IEEE Transactions on Signal and Information Processing over Networks 2023
    - “Momentum Tracking: Momentum Acceleration for Decentralized Deep Learning on Heterogeneous Data”, Transactions on Machine Learning Research 2023
    - “Communication Compression for Decentralized Learning with Operator Splitting Methods”, IEEE Transactions on Signal and Information Processing over Networks 2023
Research Experience
  • Work Experience:
    - Visiting Research Student, Yamada Unit, Okinawa Institute of Science and Technology
    - Ph.D. Student, Kashima Laboratory, Kyoto University
Education
  • Degree: Ph.D.
  • University: Kyoto University
  • Advisor: Not specified
  • Time: Ongoing
  • Field: Not specified
Background
  • Research interests: Machine Learning, Optimization, Optimal Transport. Third-year Ph.D. student at the Kashima Laboratory, Kyoto University, and a visiting research student in the Yamada Unit, Okinawa Institute of Science and Technology.