Her paper 'TinyTrain: Resource-Aware Task-Adaptive Sparse Training of DNNs at the Data-Scarce Edge' was presented at ICML 2024 and won the Silver Samsung Best Paper Award; 'Meta-Learned Kernel For Blind Super-Resolution Kernel Estimation' was accepted at IEEE/CVF WACV 2024; 'A Channel Coding Benchmark for Meta-Learning' was accepted at the NeurIPS'21 Benchmarks Track. She won Best Student Paper at MLN'18 and received the Brendan Murphy Memorial Prize.
Research Experience
She works at Samsung AI and gave a guest lecture for the Federated Learning: Theory and Practice course at the Department of Computer Science and Technology, University of Cambridge. She is involved in several research projects, including a collaboration with Imperial College on hardware-aware parallel prompt decoding.
Education
She obtained her PhD from the School of Informatics, University of Edinburgh, in 2019, advised by Prof. Paul Patras.
Background
She is a Research Scientist in Artificial Intelligence working within the DistributedAI group and GenAI team at the Samsung AI Centre in Cambridge, UK. Her research currently focuses on efficient inference and fine-tuning of large language models and stable diffusion models in resource-constrained environments. In past years, she has worked on a number of machine learning paradigms, including meta-learning, few-shot learning, and federated learning, and on their applications in domains such as visual recognition and wireless mobile communications.