Implicit reporting standards in bibliometric research: what can reviewers' comments tell us about reporting completeness?

📅 2025-08-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bibliometric studies vary widely in the completeness of their reporting, which undermines their reliability, reproducibility, and transparency. To make implicit reporting norms explicit, the authors inductively coded 968 open peer-review comments drawn from 182 reviews of 85 bibliometric studies, classifying them into 11 thematic categories and 68 subcategories and finding that reviewers focus chiefly on the completeness and clarity of reported data, methods, and results. From this analysis they derived 49 reporting recommendations and compared them with the GLOBAL, PRIBA, and BIBLIO guidelines: the recommendations covered 60-80% of the guidelines' items, while the guidelines covered only 45-65% of the recommendations, indicating that the peer-review-derived set offers greater range and specificity. The study empirically derives reporting norms directly from peer-review practice, providing both an evidence base for developing future guidelines and a practical aid for authors seeking complete and accurate reporting of bibliometric research.

📝 Abstract
The recent surge in bibliometric studies published has been accompanied by increasing diversity in the completeness of reporting these studies' details, affecting reliability, reproducibility, and robustness. Our study systematises the reporting of bibliometric research using open peer reviews. We examined 182 peer reviews of 85 bibliometric studies published in library and information science (LIS) journals and conference proceedings, and non-LIS journals. We extracted 968 reviewer comments and inductively classified them into 11 broad thematic categories and 68 sub-categories, determining that reviewers largely focus on the completeness and clarity of reporting data, methods, and results. We subsequently derived 49 recommendations for the details authors should report and compared them with the GLOBAL, PRIBA, and BIBLIO reporting guidelines to identify (dis)similarities in content. Our recommendations addressed 60-80% of the guidelines' items, while the guidelines covered 45-65% of our recommendations. Our recommendations provided greater range and specificity, but did not incorporate the functions of guidelines beyond addressing academic content. We argue that peer reviews provide valuable information for the development of future guidelines. Further, our recommendations can be read as the implicit community standards for reporting bibliometric studies and could be used by authors to aid complete and accurate reporting of their manuscripts.
Problem

Research questions and friction points this paper is trying to address.

Systematizing reporting completeness in bibliometric research
Identifying implicit standards through peer review analysis
Developing recommendations to improve study reproducibility
Innovation

Methods, ideas, or system contributions that make the work stand out.

Systematized reporting using open peer reviews
Derived recommendations from reviewer comments analysis
Compared recommendations with existing reporting guidelines
Dimity Stephen
German Centre for Higher Education Research and Science Studies
Psychology, research on research, scientometrics, peer review
Alexander Schniedermann
German Centre for Higher Education Research and Science Studies
Sociology of Science, Bibliometrics, Scientometrics, Systematic Reviews, Reporting Guidelines
Andrey Lovakov
German Centre for Higher Education Research and Science Studies, Berlin, Germany
Marion Schmidt
German Centre for Higher Education Research and Science Studies, Berlin, Germany
Matteo Ottaviani
German Centre for Higher Education Research and Science Studies, Berlin, Germany
Nikita Sorgatz
German Centre for Higher Education Research and Science Studies, Berlin, Germany; Robert K. Merton Center for Science Studies, Humboldt-Universität zu Berlin, Berlin, Germany
Roberto Cruz Romero
German Centre for Higher Education Research and Science Studies, Berlin, Germany
Torger Möller
German Centre for Higher Education Research and Science Studies, Berlin, Germany
Valeria Aman
German Centre for Higher Education Research and Science Studies, Berlin, Germany
Stephan Stahlschmidt
German Centre for Higher Education Research and Science Studies, Berlin, Germany; Unit of Computational Humanities and Social Sciences (U-CHASS), EC3 Research Group, University of Granada, Granada, Spain