Publications
Improving Diversity in Language Models: When Temperature Fails, Change the Loss (ICML 2025); SCOPE: A Self-supervised Framework for Improving Faithfulness in Conditional Text Generation (ICLR 2025); Exploring Precision and Recall to assess the quality and diversity of LLMs (ACL 2024); LOCOST: State-Space Models for Long Document Abstractive Summarization (EACL 2024, Best Paper Award).
Awards
Best Paper Award, EACL 2024.
Research Experience
May 2022–October 2022: SIPGA awardee and research intern at A*STAR (I2R) in Singapore, under the supervision of Nancy F. Chen and Mathieu Ravaut. Worked on long-document abstractive summarization.
Education
2023–Today: PhD student in Machine Learning for NLP at Université Paris Dauphine-PSL; 2021–2022: MSc in Applied Mathematics at École Normale Supérieure Paris-Saclay; 2018–2022: Engineering degree (BSc + MSc equivalent) from CentraleSupélec, Université Paris-Saclay, specializing in mathematics with a focus on machine learning and statistics.
Background
Research Interests: Probabilistic machine learning, Large Language Models, Efficient generative architectures, Evaluation of Large Language Models, Faithfulness of LLMs. Short Bio: Currently a PhD student in Machine Learning for NLP at Université Paris Dauphine-PSL, affiliated with the LAMSADE (MILES Team) and ISIR (MLIA Team) labs.
Miscellany
Current Teaching Activities: Large Language Models (M2 IASD), Natural Language Processing (Executive Master IASD).