
Reliability and the Effect of Bibliometrics

Issue #52

Impacts of Technology Adoption on Various Users in a Digital Era

Ivan Oransky, Adam Marcus, and Alison Abritis published an article in the BMJ on 17 August 2023 titled “How bibliometrics and school rankings reward unreliable science.”¹ No one should doubt the impact of these statistics: “Bibliometrics and school rankings are largely based on publications and citations. Take the Times Higher Education rankings, for example, in which citations and papers count for more than a third of the total score.² Or the Shanghai Ranking, 60% of which is determined by publications and highly cited researchers.³ The QS Rankings count citations per faculty as a relatively low 20%.⁴ But the US News Best Global Universities ranking counts publication and citation related metrics as 60%.⁵” ¹

The authors argue that citations are “simple to game” and give examples.¹ Some universities pay bonuses for articles in ranked journals, or hire productive authors who will “count toward the universities’ rankings.”¹ “Journals have been found to encourage, or even require, authors to cite other work in the same periodical.”⁶ Finding a solution is not simple and “cannot succeed without tackling the incentives themselves. A good place to start is by deflating the importance of citations in the promotion, funding, and hiring of scientists.”¹ The authors are not alone in suggesting this. “The Declaration on Research Assessment (DORA)⁷ and the Leiden Manifesto for research metrics⁸ recommend not considering impact factors when conducting such assessments—and while thousands of institutions have signed on, very few walk the walk.”⁹

Bibliometrics is a product of the library and information science world that non-scholars have co-opted as a clear and simple basis for funding judgments. Eugene Garfield, the founder of bibliometrics, warned about its misuse in an essay published on 28 October 1985.¹⁰ For administrators, bibliometrics have become a way to avoid reading papers. In the past, many notable scholars had irregular publication records because they used the time to think without the pressure to publish. Thinking is ideally what an academic is paid to do, but only those with unassailable positions at serious universities can take the risk. For many academics, taking the time just to think can be hazardous to their future.


1: Ivan Oransky, Adam Marcus, and Alison Abritis, ‘How Bibliometrics and School Rankings Reward Unreliable Science’, BMJ 382 (17 August 2023): p1887.

2: World University Rankings 2023: methodology. Times Higher Educ 2022 Oct 5.

3: ShanghaiRanking. Shanghai Ranking’s Academic Ranking of World Universities Methodology 2022. 2023.

4: QS Quacquarelli Symonds. 2024 Rankings Cycle. 2023.

5: Morse R, Wellington S. How US News Calculated the 2022-2023 Best Global Universities Rankings. US News 2022 Oct 24.

6: Ferguson C. Journal stops asking authors to stack citations following Retraction Watch post. Retraction Watch 2015 Feb 23.

7: Declaration on Research Assessment. What is DORA?

8: Hicks D, Wouters P, Waltman L, de Rijcke S, Rafols I. Bibliometrics: The Leiden Manifesto for research metrics. Nature 2015; 520:429-31. doi:10.1038/520429a. PMID: 25903611

9: Curry S. Let’s move beyond the rhetoric: it’s time to change how we judge research. Nature 2018;554:147. doi:10.1038/d41586-018-01642-w

10: Eugene Garfield. Essays of an Information Scientist, Vol. 8, pp. 403-409, 1985; Current Comments, #43, pp. 3-9, 28 October 1985.

