🤖 AI Summary
This study investigates the state of research transparency and reproducibility in human-subject and technical-implementation papers published at the Hawaii International Conference on System Sciences (HICSS) from 2017 to 2024—5,579 papers in total—with a focus on public sharing of data, code, and other supplementary materials.
Method: We employed a hybrid approach, using automated text mining for initial screening followed by manual verification, to assess the availability and accessibility of external repositories (e.g., GitHub) for the 2,028 target papers.
Contribution/Results: Only 3% of examined papers provided fully functional, publicly accessible repositories—revealing a severe deficit in transparency practices. To our knowledge, this is the first comprehensive, longitudinal, and census-level empirical audit of resource availability at HICSS. The audit framework and analytical tools developed herein are fully open-sourced, establishing a methodological foundation and benchmark evidence to advance reproducibility in the information systems discipline.
📝 Abstract
Every day, new discoveries are made by researchers from across the globe and across disciplines. HICSS is a flagship venue to present and discuss such scientific advances. Yet, the activities carried out for any given research project can hardly be fully contained in a single document of a few pages: the "paper." Indeed, any given study entails data, artifacts, or other material that is crucial to truly appreciate the contributions claimed in the corresponding paper. External repositories (e.g., GitHub) are a convenient tool to store all such resources so that future work can freely observe and build upon them, thereby improving transparency and promoting the reproducibility of research as a whole. In this work, we scrutinize the extent to which papers recently accepted to HICSS leverage such repositories to provide supplementary material. To this end, we collect all 5,579 papers included in HICSS proceedings from 2017 to 2024. Then, we identify those entailing either human subject research (850) or technical implementations (737), or both (147). Finally, we review their text, examining how many include a link to an external repository and, if so, inspect its contents. Overall, out of 2,028 papers, only 3% have a functional and publicly available repository that is usable by downstream research. We release all our tools.
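The automated screening step described above, scanning paper text for links to external repositories, could be sketched roughly as follows. The host list, regex, and function name are illustrative assumptions, not the authors' actual tooling:

```python
import re

# Hypothetical sketch of the link-screening step: scan a paper's text
# for URLs pointing at common artifact-hosting platforms. The set of
# hosts and the URL pattern are assumptions for illustration only.
REPO_HOSTS = r"(?:github\.com|gitlab\.com|bitbucket\.org|zenodo\.org|osf\.io)"
LINK_RE = re.compile(r"https?://" + REPO_HOSTS + r"/[\w\-./%]+", re.IGNORECASE)

def extract_repo_links(paper_text: str) -> list[str]:
    """Return candidate repository URLs found in the paper text."""
    # Strip trailing punctuation that often clings to URLs in prose.
    return [m.rstrip(".,);") for m in LINK_RE.findall(paper_text)]

sample = "Our code is available at https://github.com/example/hicss-audit."
print(extract_repo_links(sample))  # → ['https://github.com/example/hicss-audit']
```

A manual-verification pass would then visit each candidate URL to check that the repository is public, functional, and actually contains the promised material; an automated matcher alone cannot confirm that.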