2022: Contributed to the development of gaBERT - an Irish Language Model, presented at LREC 2022.
2021: Co-authored a paper revisiting tri-training of dependency parsers, presented at EMNLP 2021; also part of the team that developed the DCU-EPFL Enhanced Dependency Parser for the IWPT 2021 Shared Task; co-wrote a paper on Jupyter notebook assignments for an introductory NLP course, presented at the Fifth Workshop on Teaching NLP.
2020: Co-authored a review of the state of the art in automatic post-editing, published in the journal Machine Translation; also contributed to a roadmap for neural automatic post-editing in the same journal; participated in the ADAPT Enhanced Dependency Parser project for the IWPT 2020 Shared Task; co-authored a study on treebank embedding vectors for out-of-domain dependency parsing, presented at ACL 2020.
Research Experience
Currently a Research Fellow at the ADAPT Centre, School of Computing, Dublin City University. Actively collaborating with Jennifer Foster on interpretability, parsing, and domain adaptation. Supervising PhD student James Barry, who works on universal dependency parsing. Former student Utsab Barman worked on code-switching in social media content.
Background
Research interests include interpretability of neural networks for natural language processing, parsing, corpus pre-processing and cleaning, distributed representations, domain adaptation, and semi-supervised learning.