
Some Online Journals Will Publish Fake Science, For A Fee


Many online journals are ready to publish bad research in exchange for a credit card number.

That's the conclusion of an elaborate sting carried out by Science, a leading mainline journal. The result should trouble doctors, patients, policymakers and anyone who has a stake in the integrity of science (and who doesn't?).

The business model of these "predatory publishers" is a scientific version of those phishes from Nigerians who want help transferring a few million dollars into your bank account.

To find out just how common predatory publishing is, Science contributor John Bohannon sent a deliberately faked research article 305 times to online journals. More than half the journals that supposedly reviewed the fake paper accepted it.

"This sting operation," Bohannan writes, reveals "the contours of an emerging Wild West in academic publishing."

Online scientific journals are springing up at a great rate. There are thousands out there. Many, such as PLoS One, are totally respectable. This "open access" model is making good science more accessible than ever before, without making users pay the hefty subscription fees of traditional print journals.

(It should be noted that Science is among these legacy print journals, charging subscription fees and putting much of its online content behind a pay wall.)

But the Internet has also opened the door to clever imitators who collect fees from scientists eager to get published. "It's the equivalent of paying someone to publish your work on their blog," Bohannon tells Shots.

These sleazy journals often look legitimate. They bear titles like the American Journal of Polymer Science that closely resemble titles of respected journals. Their mastheads often contain the names of respectable-looking experts. But often it's all but impossible to tell who's really behind them or even where in the world they're located.

Bohannon says his experiment shows that many of these online journals failed to notice fatal flaws in the paper, flaws that should be spotted by "anyone with more than high-school knowledge of chemistry." And in some cases, even when one of their reviewers pointed out mistakes, the journal accepted the paper anyway — and then asked the author for hundreds or thousands of dollars in publication fees.

A journalist with an Oxford University PhD in molecular biology, Bohannon fabricated a paper claiming that a chemical extracted from lichen kills cancer cells. Its authors were fake too – nonexistent researchers with African-sounding names based at the fictitious Wassee Institute of Medicine in Asmara, the capital of Eritrea.

With help from collaborators at Harvard, Bohannon made the paper look as science-y as possible – but larded it with fundamental errors in method, data and conclusions.

For starters, the purported new cancer drug was tested on cancer cells – but not on healthy cells. So there's no way to tell whether its effect is cancer-specific or whether it's simply toxic to all cells.

A graph in the paper purports to show that the more of the lichen drug was added to test tubes of cancer cells, the more effective it was at killing them. But the actual data show no such dose-dependent effect.

Bohannon says it wasn't easy to write a convincing fake. Initially he made the data "too crazy," he says. His Harvard collaborators worried it made the paper look too interesting. "So we rewrote it, making boring rookie mistakes," he says.

The final touch was to make the paper read as though it had been written by someone whose first language is not English. To do that, Bohannon used Google Translate to put it into French, then translated that version back into English.

In the end, the paper's fictitious authors got 157 acceptance letters and 98 rejections – an acceptance rate of 61 percent. "That's way higher than I expected," Bohannon says. "I was expecting 10 or 15 percent, or worst case, a quarter accepted."

For the privilege of being published, the paper's authors were asked to send along a publishing fee of up to $3,100.

The highest density of acceptances came from journals based in India, where academics are under intense pressure to publish in order to get promotions and bonuses. To see where the journals that accepted or rejected Bohannon's paper are located, see this interactive global map.

Bohannon says the exercise is a damning indictment of the way peer review works (or doesn't) at many online journals. Peer review is the time-honored system of having outside experts comb through submissions to identify flaws in method, data or conclusions. It's the way scientific journals do quality control.

"Peer review is in a worse state than anyone guessed," he says.

Bohannon says he doesn't mean to suggest that the whole business model of online open-access journals is a failure. "You can't conclude that from my experiment, because I didn't do the right control – submitting a paper to paid-subscription journals," he says.

As he acknowledges, it's not as if peer review is always up to snuff at subscription journals – even the top subscription journals have been embarrassed by lapses in their peer review processes. But he says online publishing makes poor-quality journals easier to set up. And the sheer volume of online publications these days makes it harder to distinguish between legitimate and shady journals.

Jeffrey Beall of the University of Colorado wasn't surprised in the least by the outcome of Bohannon's sting. "He basically found what I've been saying for years," he tells Shots.

A growing number of online open-access journals "are accepting papers just to earn publishing fees, and as a result science is being poisoned by a lot of bad articles," Beall says.

Beall, a research librarian, is a self-appointed watchdog over open-access publishing. He maintains a list of what he calls "predatory publishers" – those who "exploit the open-access model of publishing for their own profit."

He points out that online publishers operate under an incentive that's just the opposite of traditional scientific journals. Print journals have rigid constraints on how many articles they can publish, so they have to screen out all but the best. And they have subscribers to keep happy, so they have to cultivate reputations as curators of high-quality research.

But online journals don't have to worry about subscribers; they make their money by charging contributors – who have a strong incentive to get published. So "the more papers they publish the more money they make," Beall says.

Two big questions arise out of all this: What damage is done by publish-anything journals? And what can be done about it?

The potential damage is both far-reaching and difficult to quantify. Bohannon points out that universities and government agencies, particularly in developing countries, may hire researchers based on resumes packed with citations in sleazy journals. Determining which of those CV entries are high-quality and which aren't is no easy task.

Beall notes that lawyers often use scientific citations in briefs and trials. Government officials draw on published research to set policy. Drug companies have a strong incentive to manipulate research to bolster their claims. And researchers may be led down futile paths on the basis of poor research.

As to what can be done, Beall says poor-quality research can probably only be driven out by naming and shaming.

Bohannon thinks there may be a role for a sort of Consumer Reports that surveys the quality of online journals and calls out those that fall short. Such an enterprise, he suggests, might regularly carry out stings like his to keep everyone in the field on their toes.