Academic journal deja vu

Mounir Errami and Harold Garner, both of the University of Texas Southwestern Medical Center, report in a Nature commentary this week that duplication and plagiarism appear to be on the rise in biomedical publishing. After running a text analyzer over a sample of Medline abstracts, they estimate that as many as 200,000 articles in the Medline database could be duplicates. (They've posted a large group of the potential duplicates in a database so that others can pick over them.) Errami and Garner report that many of these duplicates appear to be articles that were submitted to different journals at the same time, and that the people who appear to be ripping off other academics are often serial offenders.
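
For a rough sense of what this kind of text analysis involves, here is a minimal sketch (in Python, and emphatically not the authors' actual tool): it breaks two abstracts into overlapping word "shingles" and scores their overlap with a Jaccard index, so that near-identical wording produces a score near 1.0. Real duplicate-detection engines are far more sophisticated, but the underlying idea is similar.

```python
# Toy illustration of duplicate detection by text similarity.
# Not the system used by Errami and Garner; just a sketch of the idea:
# split each abstract into overlapping word "shingles" and measure overlap.

import re


def shingles(text, size=3):
    """Return the set of overlapping word n-grams (shingles) in a text."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if len(words) < size:
        return set(words)
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}


def jaccard_similarity(a, b, size=3):
    """Jaccard index of two texts' shingle sets; 1.0 means identical wording."""
    sa, sb = shingles(a, size), shingles(b, size)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)


if __name__ == "__main__":
    abstract_1 = ("We evaluated the efficacy and safety of drug X "
                  "in a randomized trial of 200 patients.")
    abstract_2 = ("The efficacy and safety of drug X were evaluated "
                  "in a randomized trial of 200 patients.")
    # Scores close to 1.0 suggest possible duplication worth a human look.
    print(f"similarity: {jaccard_similarity(abstract_1, abstract_2):.2f}")
```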

This may seem like a problem that only concerns academia, but as Errami and Garner note, there are implications for the general public:

In general, duplicates are often published in journals with lower impact factors (undoubtedly at least in part to minimize the odds of detection) but this does not prevent negative consequences especially in clinical research. Duplication, particularly of the results of patient trials, can negatively affect the practice of medicine, as it can instill a false sense of confidence regarding the efficacy and safety of new drugs and procedures. There are very good reasons why multiple independent studies are required before a new medical practice makes it into the clinic, and duplicate publication subverts that crucial quality control (not to mention defrauding the original authors and journals).

So, what should be done about this problem? Errami and Garner call for more vigilance by journal editors and better use of technology:

If journal editors were to use more frequently the new computational tools to detect incidents of duplicate publication and advertise that they will do so, much of the problem is likely to take care of itself. We find it odd that automated text-matching systems are used regularly by high schools and universities, thereby enabling us to hold our children up to a higher standard than we do our scientists. In our view, it would be fairly simple to fold these tools into electronic-manuscript submission systems, making it a ubiquitous aspect of the publication process.

Although text-comparison algorithms have come a long way in the last decade, they are still in their infancy, and experience with student software shows that as tools to detect duplicate publication improve, determined and skilled cheats will find ways to defeat them. But as in any arms race, the winners are usually determined by the cost-benefit balance, and the costs entailed in unethical duplication practices will quickly rise to a level that makes them prohibitively expensive to all but the most desperate (or most skilled) practitioners.
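
To make the idea of folding these tools into electronic-manuscript submission systems a bit more concrete, here is a hypothetical sketch of what such a check might look like at submission time: a new abstract is compared against an index of previously published abstracts, and anything above a similarity threshold is flagged for an editor to review rather than rejected automatically. The identifiers, threshold, and use of Python's standard difflib matcher are all illustrative assumptions; real systems rely on specialized search engines and far larger corpora.

```python
# Hypothetical sketch of a submission-time duplicate screen (illustration only).
# A newly submitted abstract is compared against indexed abstracts; matches above
# a threshold are flagged for editorial review, not automatically rejected.

from difflib import SequenceMatcher


def screen_submission(new_abstract, indexed_abstracts, threshold=0.6):
    """Return (identifier, similarity) pairs above the threshold, highest first."""
    flagged = []
    for identifier, existing in indexed_abstracts.items():
        ratio = SequenceMatcher(None, new_abstract.lower(), existing.lower()).ratio()
        if ratio >= threshold:
            flagged.append((identifier, ratio))
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    # The identifiers and abstracts below are made up for the example.
    index = {
        "PMID-0000001": "We evaluated the efficacy and safety of drug X in 200 patients.",
        "PMID-0000002": "A survey of informed-consent practices in pediatric oncology trials.",
    }
    submission = "The efficacy and safety of drug X were evaluated in 200 patients."
    for pmid, score in screen_submission(submission, index):
        print(f"possible duplicate of {pmid}: similarity {score:.2f}")
```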

(via)

-Greg Dahlmann

Earlier on blog.bioethics.net:
+ Is your professor juicing?
+ Glenn McGee in The Scientist: Me First!
