- Unsilencing colonial archives via automated entity recognition
- A map of Digital Humanities research across bibliographic data sources
- Archives and AI: An Overview of Current Debates and Future Perspectives
- Crypto art: A decentralized view
- On the Value of Wikipedia as a Gateway to the Web
- A Digital Reconstruction of a Large Plague Outbreak During 1630-1631
- The Citation Advantage of Linking Publications to Research Data
- Assessing the Impact of OCR Quality on Downstream NLP Tasks
Research Experience
- Launched the Centre for Digital and Computational Humanities at the University of Copenhagen.
- Applying quantitative methods to humanities research.
- Developing machine learning applications for the automatic extraction of information from historical collections.
- Understanding how knowledge develops and is exchanged in the Arts and Humanities.
- Studying Wikipedia as the largest encyclopedia to date: What sources is it based on? How is it used by the public? How can it be improved?
- Assessing how noise in digitized historical collections and information extracted from them can impact downstream tasks and applications.
Background
Professor of Digital and Computational Humanities at the University of Copenhagen, and Associate Professor of Computer Science at the University of Bologna. My work focuses on researching and developing Artificial Intelligence (AI) applications in the Arts and Humanities. I also specialise in research engineering and in quantitative methods for Humanities and Social Science research.