Released 'From RAG to Memory: Non-Parametric Continual Learning for Large Language Models' (Feb 2025), introducing a system that outperforms state-of-the-art graph-augmented and standard RAG systems.
Paper 'Attention in Large Language Models Yields Efficient Zero-Shot Re-Rankers' accepted to ICLR 2025.
Paper 'HippoRAG: Neurobiologically Inspired Long-Term Memory for Large Language Models' accepted to NeurIPS 2024.
Multiple papers accepted to ACL, EMNLP, and the BioNLP Workshop, covering clinical reading comprehension, UMLS vocabulary insertion, relation extraction via QA alignment, and the limitations of GPT-3 for biomedical information extraction.
Awarded an Accelerator Grant from the Translational Data Analytics Institute (March 2021) for social media pharmacovigilance research.