Applying clinical decision support design best practices with the practical robust implementation and sustainability model versus reliance on commercially available clinical decision support tools: Randomized controlled trial

Katy E. Trinkley, Miranda E. Kroehl, Michael G. Kahn, Larry A. Allen, Tellen D. Bennett, Gary Hale, Heather Haugen, Simeon Heckman, David P. Kao, Janet Kim, Daniel M. Matlock, Daniel C. Malone, L. Robert, Jessica Stine, Krithika Suresh, Lauren Wells, Chen Tan Lin

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Limited consideration of clinical decision support (CDS) design best practices, such as user-centered design, is often cited as a key barrier to CDS adoption and effectiveness. The application of CDS best practices is resource intensive; thus, institutions often rely on commercially available CDS tools that are created to meet the generalized needs of many institutions and are not user centered. Beyond resource availability, insufficient guidance on how to address key aspects of implementation, such as contextual factors, may also limit the application of CDS best practices. An implementation science (IS) framework could provide the needed guidance and increase the reproducibility of CDS implementations. Objective: This study aims to compare the effectiveness of an enhanced CDS tool informed by CDS best practices and an IS framework with a generic, commercially available CDS tool. Methods: We conducted an explanatory sequential mixed methods study. An IS-enhanced and a commercial CDS alert were compared in a cluster randomized trial across 28 primary care clinics. Both alerts aimed to improve beta-blocker prescribing for heart failure. The enhanced alert was informed by CDS best practices and the Practical, Robust Implementation and Sustainability Model (PRISM) IS framework, whereas the commercial alert followed vendor-supplied specifications. Following PRISM, the enhanced alert was informed by iterative, multilevel stakeholder input and the dynamic interactions of the internal and external environment. Outcomes aligned with PRISM's evaluation measures, including patient reach, clinician adoption, and changes in prescribing behavior. Clinicians exposed to each alert were interviewed to identify design features that might influence adoption. The interviews were analyzed using a thematic approach.
Results: Between March 15 and August 23, 2019, the enhanced alert fired for 61 patients (106 alerts, 87 clinicians) and the commercial alert fired for 26 patients (59 alerts, 31 clinicians). The adoption and effectiveness of the enhanced alert were significantly higher than those of the commercial alert (62% vs 29% alerts adopted, P<.001; 14% vs 0% changed prescribing, P=.006). Of the 21 clinicians interviewed, most stated that they preferred the enhanced alert. Conclusions: The results of this study suggest that applying CDS best practices with an IS framework to create CDS tools improves implementation success compared with a commercially available tool.
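The adoption comparison above (62% of 106 enhanced alerts vs 29% of 59 commercial alerts, P<.001) can be reproduced in rough form with a naive two-proportion z-test. This is only an illustrative sketch: the counts are back-calculated from the reported percentages, and the trial's actual analysis presumably accounted for clustering by clinic, which this simple alert-level test ignores.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided, pooled two-proportion z-test; returns (z, p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Alert-level adoption counts implied by the abstract:
# enhanced: ~62% of 106 alerts adopted; commercial: ~29% of 59
z, p = two_proportion_z(round(0.62 * 106), 106, round(0.29 * 59), 59)
print(f"z = {z:.2f}, p < .001: {p < 0.001}")
```

Even this crude test yields p well below .001, consistent with the significance level reported in the abstract.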

Original language: English (US)
Article number: e24359
Journal: JMIR Medical Informatics
Volume: 9
Issue number: 3
DOIs
State: Published - Mar 2021
Externally published: Yes

Keywords

  • Clinical decision support systems
  • Congestive heart failure
  • Implementation science
  • PRISM
  • RE-AIM

ASJC Scopus subject areas

  • Health Informatics
  • Health Information Management

