Peer reviewers more likely to approve articles that cite their own work, study finds

The study examined 37,000 reviews of 18,400 articles; almost 5,000 articles cited a reviewer and about 2,300 reviews explicitly requested a citation

Reviewers are more inclined to approve scientific articles that cite their own research, according to a new study of thousands of peer-reviewed manuscripts, reported by the journal Nature.

The study, which examined 18,400 articles from four open-access platforms, found that “reviewers who were cited were more likely to approve the article after the first review than were reviewers who were not cited.”

The work, posted earlier this month as a preprint, was led by Adrian Barnett, who researches peer review and meta-research at Queensland University of Technology in Brisbane, Australia. He said the project was inspired by “anecdotes from authors who cited articles only because reviewers asked them to.”

“Sometimes, these requests are fine,” Barnett explained, as quoted by Nature. “But if reviewers ask for too many citations or the reason to cite their work is not justified, the peer-review process can become transactional.” Citations, he noted, boost a researcher’s h-index, a key metric of academic impact.

Almost 5,000 articles cited a reviewer

The analysis drew on reviews from F1000Research, Wellcome Open Research, Gates Open Research and Open Research Europe, platforms that make all versions of their articles and reviewer comments public. Of the 37,000 reviews studied, almost 5,000 concerned articles that cited a reviewer, and about 2,300 reviews explicitly requested a citation.

The data showed that in cases where a reviewer asked to be cited, “92% of reviewers who were cited in version 2 recommended approval compared with 76% for reviewers who were not cited.” However, Barnett added, reviewers requesting citations were also “about half as likely to approve the article” as to reject it or express reservations on the first pass.

To explore whether coercive language was being used, Barnett analysed 2,700 reviewer comments. He found that reviewers requesting citations were more likely to use words such as “need” or “please” when rejecting an article, a pattern he says “suggests that coercive language was used.”

Not everyone agrees. Jan Feld, a metascience researcher at Victoria University of Wellington, New Zealand, questioned whether such language necessarily signals coercion. “That seems like a bit of a stretch,” Nature quotes him as saying. “I cannot recommend publication if the paper has not improved or I still have concerns.” Feld acknowledged that reviewers sometimes request citations that are not warranted, but added that they “can recommend citations, including of their own work, to address issues they’ve identified.”

Barnett says the findings highlight how subtle pressures in peer review may shape what gets published. “When a reviewer rejects a paper,” he explained, “authors might opt for the path of least resistance and include the citation to get their paper accepted.”

Balazs Aczel, a psychologist at Eötvös Loránd University in Budapest, said the study stands out for its scale. “The number of peer reviews included and level of analysis is novel,” he said, noting that a lack of data sharing by publishers has made the practice difficult to study.



