by David B. Resnik, JD, PhD
Since 1969, the New England Journal of Medicine has maintained a policy, known as the Ingelfinger Rule, against publishing articles that have already been published or reported to the media. Editor Franz Ingelfinger announced the policy to ensure that articles appearing in the journal would be original and to promote high-quality science by deterring researchers from bypassing peer review and sharing their results directly with the public. Several editors have revisited and modified the policy since then. Notable exceptions to the rule include: posters or abstracts reporting results at scientific meetings; information disclosed during testimony before government committees; research conducted or overseen by public health authorities, such as the National Institutes of Health or Centers for Disease Control and Prevention, that in their judgment warrants release to the media prior to submission for peer review because it addresses an urgent public health issue; and, under special arrangements, other research that addresses an urgent public health matter. The journal has also adopted a media embargo policy that complements the Ingelfinger Rule. Under the policy, journalists can have early access to forthcoming articles on the condition that they will not publish news stories on those articles until after they appear in the journal. Other biomedical journals, such as the Journal of the American Medical Association, have policies similar to the Ingelfinger Rule.
Since the late 1980s, physicists, mathematicians, computer scientists, statisticians, and other researchers have posted preprints of articles on publicly available websites, such as arXiv, prior to submitting them for publication. The chief advantage of posting articles on preprint servers is that this practice facilitates rapid communication of new findings. It usually takes several months or more for an article submitted to a scientific journal to be published. Revisions and journal backlogs can add to this delay. Posting articles on preprint servers also allows researchers to get feedback from peers, who may comment on methodology, experimental design, data analysis and interpretation, and other aspects of the research. Comments are posted alongside the articles. Posting an article on a preprint server is generally recognized as establishing priority for scientific discoveries and innovations. Biomedical scientists, who have lagged behind their counterparts in the physical and mathematical sciences, have begun to take advantage of this form of exchange.
Although most scientific journals will now accept articles that have already been shared on preprint servers, journals that publish clinical research have been reluctant to adopt this stance, in part, because it conflicts with the Ingelfinger Rule. Likewise, clinical researchers have been wary of using preprint servers because they are concerned that this might preclude them from publishing in peer-reviewed journals.
In 1995, the editors of the New England Journal of Medicine considered publication issues related to the rapidly developing internet and concluded that posting an article on a preprint server would disqualify one from publishing it in the journal. The editors decided not to make an exception to the Ingelfinger Rule for posting preprints of articles because they believed this practice would bypass rigorous peer review and encourage the dissemination of information that is biased, erroneous, or even fraudulent. They were also concerned that patients might make unwise medical decisions, such as not taking prescribed medications, as a result of accessing difficult-to-interpret information posted on preprint servers. Although the New England Journal of Medicine has not revised its policy regarding internet publication, it is worth noting that The Lancet now allows researchers to publish articles that have been previously posted on preprint servers.
Since scientific publishing and electronic communication have changed considerably since the 1990s, it is worth reexamining the arguments against making an exception to the Ingelfinger Rule for preprints posted online.
Discouraging scientists from bypassing peer review and reporting their results directly to the media is still an important concern. Peer review helps to validate research by minimizing bias, error, methodological flaws, over-interpretation of data, and other problems. Though peer review is far from perfect—erroneous, biased, or irreproducible research is sometimes published—it is far superior to no review at all.
In some instances, bypassing peer review has increased the risk of sharing erroneous or biased research with the public. For example, in 1985 three French scientists held a press conference in which they claimed that cyclosporine was an effective treatment for acquired immunodeficiency syndrome (AIDS). The scientists never published evidence for this claim, which was soon discredited. In 1987, ICN Pharmaceuticals, Inc. held a press conference in which representatives of the company claimed that its antiviral drug ribavirin was effective at slowing the progression of the human immunodeficiency virus (HIV). Although stock in the company rose significantly, the Food and Drug Administration (FDA) found that the company’s claim was unwarranted.
Communicating erroneous or biased results to the public via the media can cause considerable harm to society and science. Patients may make poor decisions based on misinformation, latch onto false hopes, or become needlessly alarmed about baseless health risks. Scientists may waste time and money attempting to reproduce poorly described experiments. Fiascos involving claims of spectacular discoveries subsequently proven wrong can erode the public’s confidence in the research enterprise and support for science.
Although the argument against making an exception to the Ingelfinger Rule still has considerable merit, it is important to realize that scientific journals have very little power to prevent scientists from bypassing rigorous peer review. Scientists can publish press releases or post papers on their blogs, discussion forums, preprint servers, and various electronic outlets. For a fee, scientists can also disseminate their work in open-access journals that guarantee rapid review and publication and have minimal standards of peer review, i.e., so-called “predatory journals.” Additionally, the internet includes numerous websites containing ill-founded or fraudulent claims concerning medical products, services, treatments, and theories. No group of journals, let alone a single journal, can stop this flood of medical misinformation.
Also, preprint servers are not a dumping ground for low-quality science but offer a valuable form of peer review that can minimize errors and biases in research. Indeed, according to one study, 73% of articles posted on preprint servers were eventually published in peer-reviewed journals. Although the additional review provided by journals likely enhanced the quality of this research, this study indicates that the vast majority of articles posted on preprint servers probably represent high-quality research.
Given these critiques of the argument against making an exception to the Ingelfinger Rule, perhaps the best way to move forward would be for journals that publish clinical research to allow investigators to post their studies on validated preprint servers prior to submission. A validated server would be a website, such as arXiv, which is known to contain high-quality science. Validated servers should also include a disclaimer for patients looking for clinically relevant information that would warn them that the research has not been reviewed by a scientific journal and should not serve as a basis for medical decision-making. The disclaimer would also advise patients to talk with their doctors about studies posted on the site that interest them.
Posting articles on preprint servers is the wave of the future, and journals that publish clinical research should consider how they can accommodate this trend while promoting rigorous peer review.
This research was supported by the intramural program of the National Institute of Environmental Health Sciences (NIEHS), National Institutes of Health (NIH). It does not represent the views of the NIEHS, NIH, or US government.