Clinical Decision Support Systems: Pop-ups And Prescribing Practices

Author

Anne S. Zimmerman, JD, MS

Publish date

Topic(s): Artificial Intelligence, Clinical Ethics

Clinicians increasingly rely on AI-driven clinical decision support systems. One important facet of decision support systems is the pop-up notification. Clinicians see pop-ups in their electronic health record (EHR) systems. Pop-up alerts may add consistency to patient care and even promote safety, but they may also contribute to medicalization, overprescribing, and unsafe prescribing, and they can be manipulated by fraudulent actors. The HITECH Act incentivized using EHR systems and called for efficiency, improved care, better patient engagement, better coordination of care, and improved population health.

Product landscape

EHR developers design user interfaces that appeal to busy clinicians. Clinical decision support software products have elements specific to prescribing, including reducing script writing to a few clicks, finding low prices, and simplifying insurance preauthorization. Safety pop-ups identify potential drug interactions and often alert the clinician to other potential hazards, although some apply only to a small patient population. The alerts range from helpful to irrelevant and lead to alert fatigue. One study found that “Clinicians ignore safety notifications between 49 percent and 96 percent of the time.” 

The pop-ups of concern here are prescribing alerts, which recommend prescription drugs based on AI models trained on big data to match patient symptoms or conditions with drugs. AI uses large datasets and machine learning for predicting, categorizing, and determining risk. 
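In schematic terms, a prescribing alert reduces to a ranking function over interventions given a patient's recorded conditions. The following is a minimal, hypothetical sketch; the condition-to-intervention scores are invented for illustration and stand in for weights a model would learn from prescribing data, not for any real product's values:

```python
# Hypothetical scores standing in for learned model weights.
# Whoever sets these numbers controls what the clinician sees first.
CONDITION_SCORES = {
    "chronic pain": {"opioid_er": 0.72, "nsaid": 0.65, "physical_therapy": 0.40},
    "mild anxiety": {"ssri": 0.58, "cbt_referral": 0.44},
}

def prescribing_alert(conditions, top_n=2):
    """Aggregate scores across a patient's recorded conditions and
    return the top-ranked options, as a pop-up might display them."""
    totals = {}
    for condition in conditions:
        for option, score in CONDITION_SCORES.get(condition, {}).items():
            totals[option] = totals.get(option, 0.0) + score
    ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_n]

print(prescribing_alert(["chronic pain"]))  # → [('opioid_er', 0.72), ('nsaid', 0.65)]
```

The sketch makes the vulnerability discussed below concrete: inflating a single score, whether by a sponsor's payment or by skewed training data, silently reorders what appears in the pop-up.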

Fraud 

Practice Fusion (now Veradigm) used pop-ups to recommend drugs. It provides its EHR platform to physicians for free and generates income through advertising. Practice Fusion told pharmaceutical companies its pop-ups could create financial benefits: clinicians would see the pop-ups, prescribe more, and drive an overall increase in drug sales. Investigations revealed that Purdue Pharma paid Practice Fusion to create opioid-related alerts specific to Purdue’s product, extended-release OxyContin. Practice Fusion’s software produced 230 million such alerts between 2016 and 2019. The US Attorney’s Office found that Practice Fusion accepted money for designing algorithms to trigger more frequent pop-ups. It even allowed pharmaceutical companies to take part in AI development, wording the pop-ups to encourage prescribing.

The payment from Purdue to Practice Fusion violated the Anti-Kickback Statute, which generally prohibits payments for referrals for any product or service reimbursable under a federal healthcare program. The violation led to a $145 million settlement against Practice Fusion in 2020. The US Attorney’s Office noted that the pop-ups did not always comply with the standard of care. The settlement agreement covered the Anti-Kickback felony counts; civil charges brought under the False Claims Act; civil charges brought by Health and Human Services agencies, including the Centers for Medicare and Medicaid Services and the Office of the National Coordinator for Health Information Technology; and charges related to Practice Fusion’s similar arrangements with thirteen other companies. Additionally, in 2024, the US Attorney’s Office personally fined Practice Fusion employee Steven Mack $20,000 for obstructing the Practice Fusion investigation. He allegedly removed from his computer documents related to his role in persuading Purdue to pay the kickback in exchange for allowing Purdue to design the clinician-facing user interface. He was also assigned community service benefiting those impacted by opioid use, connecting defrauding the federal government to the harm of overprescribing dangerous drugs. In a twist, Practice Fusion’s successor now points out the role of EHR systems in limiting overdose through prescription drug monitoring.

How do pop-ups relate to medicalization?

Medicalization is the application of a medical lens or approach to personal, behavioral, social, and emotional issues. Using a medical model for issues once clearly outside the purview of medicine often leads to prescription drug use and boosts sales of prescription drugs. When obesity was understood in terms of eating and lifestyle, with their strong correlation to social and economic conditions, it was not treated with expensive drugs. Drugs may crowd out diet, exercise, and public policy, including tax policy, subsidies, education, and wages. Similarly, mild to moderate depression, anxiety, and attention issues were once considered benign characteristics, not medical conditions.

The opioid epidemic is highly relevant to the Practice Fusion case. However, arguably, there is also an ADHD epidemic, a depression epidemic, an anxiety epidemic, and many other psychiatric disorders for which drugs are prescribed. There is also an obesity epidemic, with new drugs on the market. Pharmaceutical companies have strong incentives to influence pop-ups in favor of their brand or a drug they sell (even without the brand). In the Practice Fusion case, “[t]he alert presented opioid prescription as an option on the same footing with other, evidence-based options such as exercise, cognitive behavioral therapy, and non-opioid analgesics for pain.” Due to automation bias, in rather unproven spaces, pop-ups designed to direct clinicians to prescribe may outweigh clinicians’ confidence in their own expertise or even overwhelm their gut feeling that a drug is not appropriate for a situation.

Prescribing drugs for conditions once considered nonmedical, now prevalent, leads to a feedback loop. Data derived from people who have addressed or even resolved conditions without drugs may be largely absent, especially if they did not report their symptoms to doctors. Their data footprint may be smaller, and their successful non-pharmaceutical alleviation of conditions may not be properly accounted for in decision support systems.

The kickbacks are not the only reason that decision-support systems steer clinicians toward prescribing. The software would direct clinicians to drugs (but not particular brands) even if pharmaceutical companies did not pay or bribe the software designers to create pop-ups that favor their products. A medicalized society has medicalized data. It is self-fulfilling. There is no way to elevate nonmedical data to the same status as drug data because the systems’ development, features, and marketing reflect medicalization. For example:

  • The systems’ designers/software developers mine or purchase medical data.
  • The systems use machine learning; as a drug is prescribed more, prescribing may be recommended more often as algorithms “self-correct.”
  • The people designing decision support systems may themselves see drugs as first-line solutions or as easier to incorporate into the AI system than other types of data, for example, diet data.
  • Providing the systems to physicians for free promotes ads, many of which are for pharmaceutical products rather than healthy lifestyles. 
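The self-reinforcing dynamic described above, in which prescriptions become training data while non-drug resolutions go unrecorded, can be illustrated with a toy simulation. All numbers here are invented for illustration: the model assumes every prescription is logged in the EHR, while a non-drug outcome is logged only some fraction of the time.

```python
import random

def simulate_feedback(initial_drug=50, initial_nondrug=50,
                      rounds=2000, record_rate=0.3, seed=0):
    """Toy feedback loop: the system recommends in proportion to past
    logged outcomes. Prescriptions are always logged; non-drug outcomes
    are logged only with probability record_rate, mimicking the smaller
    data footprint of people who resolve conditions without drugs.
    Returns the final share of drug recommendations in the data."""
    random.seed(seed)
    drug, nondrug = initial_drug, initial_nondrug
    for _ in range(rounds):
        p_drug = drug / (drug + nondrug)
        if random.random() < p_drug:
            drug += 1                          # prescription always logged
        elif random.random() < record_rate:
            nondrug += 1                       # non-drug outcome rarely logged
    return drug / (drug + nondrug)

print(round(simulate_feedback(), 2))
```

Starting from an even 50/50 split, the drug share climbs well past half, not because drugs work better in the model but because the non-drug path is under-recorded, which is precisely the data asymmetry the bullets describe.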

Clinical decision support systems are not designed to promote healthy habits. They will not address the social determinants of health, lifestyle, or wellness effectively. However, they can incorporate health data from wearables. In the non-clinical context, apps use AI to positively influence diet and lifestyle, decreasing the risk of chronic disease. 

Pop-ups to prevent overprescribing and to promote reasonable alternatives

Pop-ups should include reminders to reevaluate whether a patient really needs to start or continue medication. Decision support system designers could build in features to mitigate the tendency to prescribe when conditions are not exclusively medical and to note better alternatives, including safer drugs, as in the opioid case. Because the pop-up results from AI rather than physician expertise or patient request, clinicians should not carelessly follow the recommendation. A wall between pharma and decision support system design – and continued enforcement of the prohibition on allowing pharma to word pop-ups – could improve patient trust in a highly automated medical environment.
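As one hedged illustration of the reevaluation reminder proposed above, the feature could be as simple as a rule on prescription age that lists evidence-based non-drug alternatives ahead of continuation. The threshold, field names, and alternatives table here are hypothetical; a real system would source alternatives from clinical guidelines, not from sponsor input.

```python
from datetime import date, timedelta

REVIEW_AFTER_DAYS = 90  # hypothetical reevaluation threshold

# Hypothetical guideline-derived alternatives, echoing the options the
# Practice Fusion alert improperly put on equal footing with opioids.
NON_DRUG_ALTERNATIVES = {
    "opioid_er": ["exercise program", "cognitive behavioral therapy",
                  "non-opioid analgesics"],
}

def reevaluation_popup(drug, start_date, today=None):
    """Return pop-up text once a prescription is old enough to revisit,
    listing non-drug options before continuation; None means no alert."""
    today = today or date.today()
    if today - start_date < timedelta(days=REVIEW_AFTER_DAYS):
        return None
    lines = [f"{drug}: prescribed {(today - start_date).days} days ago.",
             "Consider before continuing:"]
    lines += [f"  - {alt}" for alt in NON_DRUG_ALTERNATIVES.get(drug, [])]
    lines.append(f"  - continue {drug} only if still indicated")
    return "\n".join(lines)

print(reevaluation_popup("opioid_er", date(2024, 1, 1), today=date(2024, 6, 1)))
```

The design choice worth noting is the ordering: the pop-up leads with alternatives and frames continuation as the option requiring justification, inverting the default that the kickback-funded alerts exploited.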

Anne S. Zimmerman, JD, MS is the Founder and President of Modern Bioethics, Chair of the New York City Bar Association Bioethical Issues Committee, and Editor-in-Chief of Voices in Bioethics.
