Open Science, Counterscience, and the Limits of Trust

Author: Sabrina Derrington, MD, MA, HEC-C

Tag(s): Editor's pick
Topic(s): Ethics, Research Ethics, Social Justice

A recent New York Times investigative report by Mike McIntire describes how genetic and brain imaging data from thousands of U.S. children were used by a small group of researchers to advance a race science agenda. The data came from two large, federally funded studies—the Philadelphia Neurodevelopmental Cohort (PNC) and the Adolescent Brain Cognitive Development (ABCD) study—and were made available by the National Institutes of Health (NIH) to outside researchers in 2018 and 2020.

Dr. Bryan J. Pesta and colleagues analyzed data on home computers, outside of any official university role and without IRB oversight, and published numerous papers promoting the idea that there are biologically based, inherited differences in intelligence between races and ethnicities. The conclusions and figures from those papers spread through message boards, internet blogs, and various forms of social media to support white supremacist arguments. Even after these researchers faced sanctions and substantial ethical and scientific criticism, their work continued to spread and was cited by artificial intelligence platforms responding to queries about race and intelligence. This case is an example of successful Citizen Scientific Racism (CSR): the use of scientific methods and language to lend legitimacy to racist ideologies outside established systems of accountability. CSR is a particularly malignant example of counterscience that exposes a fundamental vulnerability of Open Science: it is a system built on assumptions of shared values, mutual accountability, and trustworthiness, but it can be misused to promote false and discriminatory ideas.

Open Science is a Vulnerable System

NIH strategic plans and policies strongly endorse Open Science and FAIR (findability, accessibility, interoperability, and reusability) principles for data stewardship. Researchers who collect large‑scale genomic data with federal funding are generally required to share de‑identified data through NIH‑managed repositories. The aim is to accelerate scientific discovery and improve health and medical care.

Study teams may request specific data use limits to ensure that future uses of acquired data align with the original informed consent. Researchers requesting data from controlled-access databases must explain their research goals and must promise to avoid stigmatizing research and adhere to responsible research practices.

In this case, those safeguards proved insufficient. The New York Times investigation points out significant weaknesses in NIH oversight, including inadequate scrutiny of applications to access data from NIH-controlled databases, overreliance on “good faith” compliance with policies prohibiting stigmatizing research, and failure to respond to reported violations in a timely manner. Dr. Pesta and colleagues were able to exploit these weaknesses to obtain access and use the data in ways that directly conflicted with the values and expectations of the families who had agreed to participate, families who trusted the study team, their institution, and the NIH to protect their data and use it for the greater good.

What those families may not have realized is that they were also placing their trust in the broader scientific community, and in society more generally. The NIH is working to strengthen protections against inappropriate data release, but those protections exist in tension with data accessibility. If a major consideration in allowing access is determining whether the researcher is legitimate and whether their research question is appropriate, then the system is vulnerable to subjectivity, bias, and political change.

Distrust Fuels Counterscience

One of the most concerning aspects of the misuse of PNC and ABCD data is the way that Dr. Pesta and colleagues continue to justify their actions, framing their work as the courageous pursuit of scientific knowledge in a battle against “woke condemnation,” ideological oppression, and constraints on academic freedom. These arguments are eerily similar to those used to justify the disruptive overhaul and politicization of government health agencies like the NIH, the Centers for Disease Control and Prevention, and the Food and Drug Administration. The very structures meant to safeguard scientific integrity and protect patients and research subjects from harm have been characterized by the Trump administration and the Make America Healthy Again (MAHA) coalition as classist institutions prioritizing liberal agendas, silencing outsiders, impeding scientific advancement, and preventing citizens from accessing innovative therapies.

In this same vein, in April 2025 a federal prosecutor at the Department of Justice sent a series of letters to top U.S. medical journals accusing them of biased publishing practices and of being “partisans in various scientific debates,” and inquiring how the journals handle articles with “competing viewpoints.” Dr. Pesta claims that his research was censored because he was asking unpopular, “dangerous” questions, not because of any flaws in the research. And as the New York Times article points out, Dr. Pesta has reason to be encouraged: the current political administration is enabling the active rejection of established scientific knowledge (e.g., the safety and efficacy of immunizations) and redirecting funding and regulatory protections toward research that aligns with its priorities, which include eliminating diversity, equity, and inclusion policies, research, and education.

Research priorities for federal funding often shift from one presidential administration to the next. But the answers to questions about the proper aims of science, what research questions should be asked, and whether some types of research ought not be pursued should not depend on which political party is in power. These are normative ethical questions that ought to be considered within their historical and current social context, informed by an understanding of scientific rigor and validity as well as by bioethics and moral philosophy. The answers will have a significant impact on human flourishing and will ultimately reflect the values of society itself.

Towards a Shared Moral Vision for Open Science

In this era of increasing political polarization and diminishing social cohesion, a just Open Science can only be achieved through broader, more inclusive conversations that acknowledge the root causes of distrust in our fellow citizens and institutions, that seek to articulate shared values, and that center the inherent human dignity and value of all people. Achieving a shared moral vision of Open Science requires societal and institutional reforms in partnership with historically marginalized populations.

Until then, the only mechanism that can ensure that a research participant’s data is used in accordance with their values is for the participant to retain ownership and control of that data. This is Genomic Dignity: the idea that all people have the right to own their genomic data; the right to control when, for how long, with whom, and for what purposes to share it; and the right to an auditable trail of its use. This level of ownership and control is possible through encryption and watermarking techniques that are currently being developed and tested. Other safeguards include requiring researchers to conduct analyses within a centralized computational environment, such as the All of Us researcher workbench, and limiting or eliminating the ability to download data from the system.

There is much work to be done to earn the trust of research subjects, particularly those from marginalized populations. Genetics researchers must exercise social responsibility by considering and preventing the potential for harmful misuse of their data and analyses. Yet they cannot and should not bear that burden alone. Institutions, funders, policymakers, and the broader public must share this responsibility if Open Science is to advance the common good rather than contribute to harm.
