Published several papers, including 'External Validation of Predictive Models for Diagnosis, Management and Severity of Pediatric Appendicitis' (medRxiv) and 'What Does Evaluation of Explainable Artificial Intelligence Actually Tell Us? A Case for Compositional and Contextual Validation of XAI Building Blocks' (Extended Abstracts of CHI). Additionally, designed and led the development of the FAT Forensics open-source toolkit.
Research Experience
Joined the Medical Data Science group as a Research Fellow in May 2023. Prior to this, held a Research Fellow position at the ARC Centre of Excellence for Automated Decision-Making and Society, affiliated with RMIT University in Melbourne, Australia. Also held numerous research positions at the University of Bristol, working on projects such as REFrAMe, SPHERE, and the European Union's AI Research Excellence Centre TAILOR.
Education
Holds a Master's degree in Mathematics and Computer Science and a doctorate in Computer Science from the University of Bristol, United Kingdom.
Background
Main research focus is the transparency (interpretability and explainability) of data-driven predictive systems based on artificial intelligence and machine learning algorithms. Previous work includes enhancing the transparency of predictive models with feasible and actionable counterfactual explanations and robust, modular surrogate explainers.