Hugues Van Assel
Google Scholar ID: 9Lf9wq8AAAAJ
Genentech
Optimal Transport
Representation Learning
Homepage
Google Scholar
Citations & Impact
All-time
Citations: 53
h-index: 4
i10-index: 3
Publications: 14
Co-authors: 0
Contact
Email: huguesva@gmail.com
Twitter
GitHub
Publications
3 of 14 shown
stable-pretraining-v1: Foundation Model Research Made Simple (2025). Cited: 0
Joint Embedding vs Reconstruction: Provable Benefits of Latent Space Prediction for Self-Supervised Learning (2025). Cited: 0
Ditch the Denoiser: Emergence of Noise Robustness in Self-Supervised Learning from Data Curriculum (2025). Cited: 0
Resume (English only)
Academic Achievements
Developed TorchDR: a modular, GPU-friendly toolbox for dimensionality reduction offering a unified interface for state-of-the-art methods.
Developed stable-pretraining: a PyTorch library for foundation model pretraining with real-time monitoring.
Published 'Joint Embedding vs Reconstruction: Provable Benefits of Latent Space Prediction for Self-Supervised Learning' at NeurIPS 2025 (Spotlight).
Published 'Distributional Reduction: Unifying Dimensionality Reduction and Clustering with Gromov-Wasserstein' in TMLR 2024.
Published 'SNEkhorn: Dimension Reduction with Symmetric Entropic Affinities' at NeurIPS 2023.
Published 'A Probabilistic Graph Coupling View of Dimension Reduction' at NeurIPS 2022.
Background
Currently a Postdoctoral Fellow at Genentech, working with Aviv Regev and Tommaso Biancalani.
Interested in how machines learn rich and reliable representations of complex data.
Research focuses on representation learning, self-supervised and multi-modal methods, optimal transport, and dimensionality reduction.
Develops computational approaches to uncover data structure, motivated by challenges in the life sciences.
Enjoys building and sharing open-source tools.
Co-authors
0 (list not available)