Publications
'Procedural Environment Generation for Tool-Use Agents' (EMNLP 2025)
'Playpen: An Environment for Exploring Learning Through Conversational Interaction' (EMNLP 2025)
'Evaluating Spatiotemporal Consistency in Automatically Generated Sewing Instructions' (EMNLP 2025)
'Exploring Graph Representations of Logical Forms for Language Modeling' (ACL 2025)
'Language Modeling over Logical Forms' (Doctoral dissertation, University at Buffalo)
'It is not True that Transformers are Inductive Learners: Probing NLI Models with External Negation' (EACL 2025)
'Positional Transformers for Claim Span Identification' (Forum for Information Retrieval Evaluation, 2023)
'Hate Speech Detection in Low Resource Indo-Aryan Languages' (Forum for Information Retrieval Evaluation, 2023)
'University at Buffalo at SemEval-2023 Task 11: MASDA–Modelling Annotator Sensibilities through DisAggregation' (SemEval-2023), nominated for Best System Paper Award at SemEval 2023
'Formal-Logical Distributional Semantics: Applications to Property Inference' (Workshop on Knowledge Augmented Methods for Natural Language Processing at AAAI 2023)
Research Experience
Postdoctoral Researcher, Computational Linguistics Group, Saarland University; research projects include procedural environment generation for tool-use agents.
Education
PhD in Linguistics, University at Buffalo (2020-2025), advisor: JP Koenig
MSc in Computer Science and Engineering, University at Buffalo (2023-2024), advisor: Rohini K Srihari
BA in Linguistics, The Ohio State University (2016-2019), minors: Spanish, German
Background
Research interests: logical reasoning and tool use with LLMs. Past projects include probing the shallow heuristics that NLI models exploit, language modeling over logical-form representations, and the automatic generation of tool-use environments for training LLM agents with reinforcement learning.