- From Data to Knowledge: Evaluating How Efficiently Language Models Learn Facts
- Towards a Principled Evaluation of Knowledge Editors
- DIN SPEC 91526:2025-05, Knowledge Graphs for Language Models and Language Models for Knowledge Graphs - Hybrid Applications of Symbolic and Subsymbolic AI; Text in English
- TransformerRanker: A Tool for Efficiently Finding the Best-Suited Language Models for Downstream Classification Tasks
- Familiarity: Better Evaluation of Zero-Shot Named Entity Recognition by Quantifying Label Shifts in Synthetic Training Data
Research Experience
Involved in multiple research projects, including TransformerRanker, a tool for quickly finding the best-suited language model for a given NLP classification task.
Education
PhD student at Humboldt-Universität zu Berlin (machine learning group).
Background
Interested in continual and transfer learning, knowledge probing, and growing neural networks. Currently a PhD student at Humboldt-Universität zu Berlin (machine learning group) and a member of the DFG Cluster of Excellence 'Science of Intelligence (SCIoI)'.