Abstract

Altmetrics are increasingly popular bibliometric methods intended to reveal the impact of, and public attention to, scientific research. In addition to traditional bibliometric scores (such as article citation counts, the h-index, or the journal impact factor), any research paper published online can have algorithm-generated Altmetric scores linked to a digital identifier (for example, a DOI, Digital Object Identifier). In this article, we analyzed five papers withdrawn from publication: the four papers with the highest citation counts up to December 2020 according to the Retractionwatch.com database, and one COVID-19-related paper with the highest Altmetric score. By contrasting traditional bibliometrics from the Web of Science database (such as article citation counts before and after retraction) with Altmetric details (such as the Attention score and mentions in Mendeley, Twitter, and news outlets), we show that the highest citation counts in traditional bibliometrics were not reflected in Altmetric scores, and vice versa. We observed that all the retracted papers with the highest citation counts came from journals in the top 25% of their fields. Moreover, these articles continued to attract citations long after their retraction, and if such articles were selected for the creation and application of secondary scientific literature, we fear this could lead to inaccurate results. The Attention score, also known as the Altmetric score, is meant to complement traditional bibliometrics and support a more comprehensive evaluation of scientific work (a scientific assessment combined with an assessment of social interest and impact).

Keywords

Altmetric, Bibliometric, Retracted article, New score