You've Purchased an AI Model. Now What?


Practical considerations for implementing AI technology into clinical workflow were the focus of the closing session of the 2020 ACR Imaging Informatics Summit, “You’ve Purchased an AI Model. Now What?” Presenters from varied practice settings offered several workable strategies for implementing AI and ensuring a successful program.

Work with Vendors to Develop Implementation Solutions

Kicking off the presentations at the Summit, ACR Informatics Commission Chair Christoph Wald, MD, MBA, PhD, FACR, joked that “all of the data scientists in Boston had already been hired by MGH and Brigham.” So, the Lahey Hospital & Medical Center team — without a data scientist onsite — relied heavily on working relationships with a single AI vendor and a third-party workflow orchestrator to integrate AI algorithms into their clinical workflow. This approach aided in developing a context-sensitive widget to be deployed within the PACS viewer to alert radiologists to the presence of AI results on a given study. Dr. Wald emphasized the importance of collecting user (i.e., radiologist) feedback that is aligned with specific tools and conveyed to the internal quality assurance team as well as to the vendor.

Communicate with Radiologists and Staff

These points were further supported by the complementary presentations by Arun Krishnaraj, MD, MPH, and Cree Gaskin, MD, on their experience implementing AI models at the University of Virginia (UVA). Dr. Krishnaraj described UVA’s approach to implementing AI: first focus on how a particular tool maps onto its intended use case, then determine how it should integrate into the clinical workflow. (For more information on radiology AI use cases, please refer to the ACR Data Science Institute’s (DSI) Define-AI Directory of carefully curated use cases.)

According to Dr. Gaskin, working with UVA’s AI vendor was important to tune the implementation and presentation of results, including decisions about the precise timing of when images are exposed to an algorithm and when results are presented to the interpreting radiologist. Ultimately, UVA’s system was set up to alert radiologists to the arrival of new AI results after a report has been finalized and to facilitate integrated review of those results post hoc.
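To make that timing decision concrete, here is a minimal sketch of the post-hoc flow Dr. Gaskin described: AI results that arrive after a report is finalized are queued as alerts for later integrated review rather than injected into active interpretation. All names here are hypothetical; this illustrates the concept, not UVA’s or any vendor’s actual implementation.

```python
# Minimal sketch (hypothetical names): route an incoming AI result
# based on whether the study's report has already been finalized.
from dataclasses import dataclass, field

@dataclass
class Study:
    accession: str
    report_finalized: bool = False
    ai_results: list = field(default_factory=list)

def on_ai_result(study, result, alert_queue):
    """Attach an incoming AI result and, if the report is already
    finalized, queue an alert so the radiologist can review post hoc."""
    study.ai_results.append(result)
    if study.report_finalized:
        alert_queue.append((study.accession, result))

alerts = []
study = Study("ACC123", report_finalized=True)
on_ai_result(study, {"finding": "nodule", "score": 0.91}, alerts)
print(alerts)  # [('ACC123', {'finding': 'nodule', 'score': 0.91})]
```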

The UVA experience underscored a critical point: clear communication with radiologists and staff throughout the process of developing and implementing AI in the clinical workflow is necessary to ensure success. Dr. Krishnaraj focused on the implementation of a lung CT de-noising algorithm to expand CT lung cancer screening to underserved populations in rural Virginia — a fantastic example of AI helping facilitate a public health initiative within their department. He found the biggest challenge in this project was “keeping everyone informed across multiple remote imaging sites during implementation.” Communication became even more important when an algorithm had to be taken out of the clinical workflow at UVA, due to the vendor’s decision to conform to an update to the FDA’s regulatory pathway.

Communication can also play a big role in mitigating the “expectation-reality mismatch” described by Jayashree Kalpathy-Cramer, PhD. In one example, the initial excitement around AI at UVA — particularly among the trainees and younger faculty — waned over time as radiologists were “underwhelmed” by the AI algorithm’s performance in clinical workflow. According to Dr. Kalpathy-Cramer, this is most often due to data heterogeneity and poor generalization of AI algorithms introduced to new sites after initial training and validation elsewhere. Though performance issues like these are well known in data science circles, communicating them to radiologists can help manage expectations for AI in the workflow.

Determine the Value Proposition

While academic departments like UVA’s have a mission to support research and the technological advancement of the field, private practices are more motivated to implement AI if there is a business case for it. At the Summit, Nina Kottler, MD, MS, of Radiology Partners addressed this issue; or, as she put it, “Who is going to pay for AI?”

Different actors in the radiology ecosystem have different incentives. For radiologists, particularly in private practice, efficiency is important enough that radiology practices might be willing to pay for productivity gains. Payors and hospitals have different incentives: For payors, decreasing costs is the priority, while hospitals strive to improve patient throughput. Dr. Kottler cautioned that “quality is an expected component of our product as radiologists,” so it may be hard to justify paying for AI that only targets improvements in quality.

Where Should a Practice Start?

Dr. Kottler was asked where to begin if your practice has not yet adopted AI. Harking back to points made by Drs. Wald, Krishnaraj, and Gaskin earlier, Dr. Kottler emphasized that the focus should be on choosing a vendor based on its willingness and availability to work with your organization, rather than on a specific algorithm. Dr. Wald reminded us that this is particularly important for practices without a data scientist onsite.

Following up on that question, a conference participant asked how practices might trial an algorithm from a vendor before committing to a contract. Laura Coombs, PhD, said the ACR AI-LAB™ platform’s Evaluate module will be able to provide this service. This is important because algorithms are notoriously brittle in environments different from those where they were trained, and practices should attempt to “try before they buy” using their own data. The DSI has received interest from vendors in engaging in this service, and the details are being worked out.
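As an illustration of what a local trial might measure, the sketch below computes standard performance metrics from a practice’s own ground-truthed exams and a candidate algorithm’s scores. The function and stand-in data are hypothetical assumptions for illustration; this is not the AI-LAB Evaluate module’s interface.

```python
# Minimal sketch: summarize a candidate algorithm's performance on a
# locally curated test set before purchase. Assumes per-exam ground
# truth (y_true) and the algorithm's probability scores (y_score).
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

def evaluate_local(y_true, y_score, threshold=0.5):
    """Return AUC, sensitivity, specificity, and local prevalence."""
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "auc": roc_auc_score(y_true, y_score),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "prevalence": float(np.mean(y_true)),
    }

# Stand-in data; in practice these come from a reader-adjudicated
# sample of your own exams, not simulation.
rng = np.random.default_rng(42)
y_true = rng.integers(0, 2, size=200)
y_score = np.clip(y_true * 0.6 + rng.normal(0.2, 0.25, size=200), 0, 1)
print(evaluate_local(y_true, y_score))
```

Running the same evaluation on data from each of a practice’s sites, rather than pooled data, is one way to surface the site-to-site brittleness described above.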
A final point to consider: panelists repeatedly said they believe imaging AI tools — in their current form — are not “ready” for permanent storage. They do not routinely store AI results in PACS, but rather store them in a separate archive for QA purposes. However, the results could indirectly become part of the medical record if a radiologist acknowledges them in the report.

Finally, practices must recognize that algorithm performance can degrade over time, so they will need a solution for longitudinal monitoring. The ACR AI-LAB platform’s Assess module links to the ACR AI Registry to collect longitudinal data on algorithm performance, along with associated metadata, enabling practices to detect if — and how — an algorithm’s performance degrades with time.
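The sketch below illustrates the underlying idea of longitudinal monitoring, not the Assess module itself: aggregate a QA signal by time window and flag windows that fall below a go-live baseline. The agreement signal and thresholds are hypothetical.

```python
# Minimal sketch: compute a monthly QA metric (here, hypothetical
# radiologist agreement with AI results) and flag months that fall
# more than an allowed margin below the go-live baseline.
from collections import defaultdict
from statistics import mean

def monthly_agreement(records):
    """records: (month, agreed) pairs, agreed = 1 if the radiologist
    accepted the AI result during QA review."""
    by_month = defaultdict(list)
    for month, agreed in records:
        by_month[month].append(agreed)
    return {m: mean(v) for m, v in sorted(by_month.items())}

def flag_drift(monthly, baseline, margin=0.05):
    """Return months whose agreement rate fell below baseline - margin."""
    return [m for m, rate in monthly.items() if rate < baseline - margin]

# Stand-in QA log entries, one per AI result reviewed.
records = [("2020-09", 1), ("2020-09", 1), ("2020-10", 1), ("2020-10", 0),
           ("2020-11", 0), ("2020-11", 0), ("2020-11", 1)]
print(flag_drift(monthly_agreement(records), baseline=0.90))
# ['2020-10', '2020-11']
```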

Key Takeaways

For those looking to incorporate AI into clinical workflow in the future, the following keys to success emerged:
• Partner with AI and/or platform vendor(s) who are willing and available to work with you to develop the right implementation solution for your practice.
• Maintain open lines of communication with involved parties throughout the process.
• For private practices, focus on AI tools that fit a specific business case and educate your radiologists to keep them engaged with the technology.

 

Walter Wiggins, MD, PhD | Clinical Director, Duke Center for Artificial Intelligence in Radiology | Duke University Health System

