CHICAGO, ILLINOIS—Once you’ve submitted your paper to a journal, how important is it that the reviewers know who wrote it?
Surveys have suggested that many researchers would prefer anonymity because they think it would result in a more impartial assessment of their manuscript. But a new study by the Nature Publishing Group (NPG) in London shows that only one in eight authors actually chose to have their reviewers blinded when given the option. The study, presented here at the Eighth International Congress on Peer Review, also found that papers submitted for double-blind review are far less likely to be accepted.
Most papers are reviewed in single-blind fashion—that is, the reviewers know who the authors are, but not vice versa. In theory, that knowledge allows reviewers to exercise a conscious or unconscious bias against women, ethnic minorities, or researchers from certain countries, and to be kinder to people who are already well known in their field. Double-blind review, the argument goes, would remove those prejudices. A 2007 study of the journal Behavioral Ecology found that it published more articles by female authors after switching to double-blind review. In a survey of more than 4000 researchers published in 2013, three-quarters said they thought double-blind review is “the most effective method.”
But that approach also has drawbacks. Journals have checklists for authors on how to make a manuscript anonymous by avoiding phrases like “we previously showed” and by removing certain types of meta-information from computer files—but some researchers say they find it almost impossible to ensure complete anonymity.
“If I am going to remove every trace that could identify myself and my coauthors there wouldn’t be much left of the paper,” music researcher Alexander Jensenius from the University of Oslo wrote on his blog. Indeed, experience shows that reviewers can sometimes tell who wrote a paper, based on previous work or other information. At Conservation Biology, which switched to double-blind reviews in 2014, reviewers who make a guess get it right about half of the time, says the journal’s editor, Mark Burgman of Imperial College London. “But that’s not the end of the world,” he says. Double-blind review, he says, “sends a message that you’re determined to try and circumvent any unconscious bias in the review process.”
In 2013 NPG began offering its authors anonymous peer review as an option for two journals, Nature Geoscience and Nature Climate Change. Only one in five authors requested it, Nature reported 2 years later—far fewer than editors had expected. But the authors’ reactions were so positive that NPG decided to expand the option to all of its journals.
At the peer review congress last week, NPG’s Elisa De Ranieri presented data on 106,373 submissions to the group’s 25 Nature-branded journals between March 2015 and February 2017. In only 12% of cases did the authors opt for double-blind review. They chose double-blind reviews most often for papers in the group’s most prestigious journal, Nature (14%), compared to 12% for Nature “sister journals” and 9% for the open-access journal Nature Communications.
The data suggest that concerns about possible discrimination may have been a factor. Some 32% of Indian authors and 22% of Chinese authors opted for double-blind review, compared with only 8% of authors from France and 7% from the United States. The option was also more popular among researchers from less prestigious institutes, based on their 2016 Times Higher Education rankings. There was no difference in the choices of men and women, De Ranieri noted, a finding that she called surprising.
Burgman suspects that the demand for double-blind review is suppressed by fears that it could backfire on the author. “There’s the idea that if you go double blind, you have something to hide,” he says. That may also explain why women were not more likely than men to request double-blind review, he says. Burgman thinks making double-blind review the standard, as Conservation Biology has done, is the best course. “It has not markedly changed the kind or numbers of submissions we receive,” he says. “But we do get informal feedback from a lot of people who say: ‘This is a great thing.’”
Authors choosing double-blind review in the hope of improving their chances of success will be disappointed by the Nature study. Only 8% of those papers were actually sent out for review after being submitted, compared with 23% of those opting for single-blind review. (Nature’s editors decide whether to send a paper out for review or simply reject it, and the editors know the identity of the authors.) And only 25% of papers under double-blind review were eventually accepted, versus 44% for papers that went the single-blind route.