DSI’s AI Survey Shows ACR Tools in Sync with Member Needs for Deploying AI

In April 2021, the Journal of the American College of Radiology published the results of the first ACR Data Science Institute® (ACR DSI) AI survey of ACR members. The survey was designed to help us understand how radiologists are using AI in clinical practice. With over 1,800 responses, it is, to our knowledge, the largest published survey of its kind in the U.S. For some, the results might be a bit surprising.

Despite the tremendous hype around AI over the past five years, our survey found that only about 30% of ACR members are using AI in their clinical workflows. What’s more, even this number could be overstated, since some respondents might consider their current breast CAD tools to be AI. While these results do not seem to justify the hype, a significant number of radiologists are already using some form of AI, and more than 25% expect to purchase AI tools in the not-too-distant future. We therefore believe radiologists need to continue preparing themselves for a future with AI.

Key Survey Takeaways

• Despite the AI hype, there is room for growth: Just over 30% of radiologists are currently using AI as part of their practice.

• Many radiologists plan to purchase AI in the near term: Of practices not currently using AI, up to 25% plan to purchase AI tools in the next one to five years.

• Radiologists are using AI for a variety of tasks: The top uses of AI include image interpretation, worklist management, image enhancement, automated measurements, and departmental operations.

• Inconsistent AI performance is an issue: Inconsistent performance was observed by 94% of the survey respondents. Algorithm bias, whether related to the patient, the scanner, or finding conspicuity, accounted for the majority of the reported inconsistent performance.

• Radiologists want performance measures: Approximately 60% of respondents indicated they want some form of external validation of AI models across representative datasets, and an equal number indicated they would like to be able to assess the performance of an AI model on their own patient data before deploying it into their clinical workflows.

• Radiologists find value in using AI: While 95% of radiologists would not trust AI algorithms to run autonomously, most were satisfied with their overall experience and found AI provided value to them and their patients.

• A range of FDA-cleared algorithms are in use: Algorithms for screening mammography (9%), pulmonary embolus (6.4%), MR brain analytics (5.9%), and brain hemorrhage (5.7%) topped the list among current users.

• Self-developed algorithms are popular: Among those using AI in clinical practice, more (9.8%) were using algorithms they had created themselves than were using any single commercially developed algorithm.

Overcoming barriers to AI implementation
Our survey identified a number of barriers to AI implementation. When participants were asked what they need to adapt to a future with AI, most wanted assurance that AI will work well in their practices before they purchase models. At the ACR DSI, our Evaluate-AI toolkit currently includes a catalog of FDA-cleared algorithms that summarizes the information developers submitted during the FDA clearance process, helping radiologists find and vet commercial products that might be a good fit for their practices.

The ACR is making significant upgrades to its image and data exchange platform (TRIAD®), currently used by almost all member sites for ACR Accreditation programs and/or research. As the new iteration of ACR Connect® is deployed, the Evaluate-AI module in the ACR AI-LAB™ will become fully functional. This will allow sites to use ACR Connect to search their image archives, assemble representative test cases, and evaluate commercial AI models on their own data, either on site or in a secure cloud. Radiologists will also be able to use the ACR Assess-AI registry, part of the ACR's National Radiology Data Registry (NRDR®) program, to monitor longitudinal performance once models are deployed into the clinical workflow. Finally, ACR ASSIST® modules (Assistants) are being developed for each of the ACR DSI's structured AI use cases so that AI output can be more easily integrated into structured reporting systems. This not only enhances clinical integration but also provides a platform for integrated AI performance monitoring. All of these tools have been developed to ensure radiologists are prepared for a future with AI, so we can harness the power of AI to provide safe and effective care for our patients.
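To make the local-evaluation step more concrete, the short Python sketch below shows the kind of check a site might run after assembling its own labeled test cases. It is a hypothetical illustration only; it does not use ACR Connect or the Evaluate-AI module, and the data, operating threshold, and metric choices are placeholder assumptions.

```python
# Hypothetical illustration only; this is not the ACR Connect / Evaluate-AI API.
# It sketches local validation: scoring a model's outputs against a site's own
# labeled test cases before deciding whether to deploy it.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# Placeholder data standing in for a locally assembled test set:
# ground-truth labels (1 = finding present) and the model's probability outputs.
local_labels = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 1])
model_scores = np.array([0.91, 0.12, 0.78, 0.64, 0.33, 0.05, 0.88, 0.41, 0.19, 0.72])

# Discrimination across the local cases.
auc = roc_auc_score(local_labels, model_scores)

# Apply an operating threshold (assumed here to be 0.5) to get binary calls.
predictions = (model_scores >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(local_labels, predictions).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

print(f"Local AUC: {auc:.2f}  Sensitivity: {sensitivity:.2f}  Specificity: {specificity:.2f}")
```

Comparing numbers like these with the performance a developer reported during FDA clearance is one practical way for a practice to judge whether a commercial model fits its own patient population.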

AI developers also need help
While AI developers were not part of the current ACR DSI AI survey, the ACR DSI continually engages with vendors and recently conducted a separate industry survey. We found that developers are interested in programs that will inform potential AI users about their products, such as the newly created ACR DSI catalog of FDA-cleared algorithms.

Developers are also interested in access to datasets that will provide multisite validation. As we continue to enhance ACR Connect over the coming months, we believe the ACR DSI Certify-AI program can provide an opportunity for developers to validate their products across a diverse array of practice locations and types. Integrating real-world performance monitoring into the clinical workflow through the Assess-AI registry allows data to be aggregated from multiple sites, giving developers information they can use to monitor and improve the performance of their products.
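For illustration, the sketch below shows one way per-site results could be pooled over time, which is the general pattern that longitudinal, multi-site monitoring implies. It is a hypothetical Python example using pandas, not the Assess-AI registry's data model or API; the fields and metric are assumptions.

```python
# Hypothetical sketch, not the Assess-AI registry or its API. It only illustrates
# pooling per-site, per-month results to watch an algorithm's real-world performance.
import pandas as pd

# Placeholder case-level records a site might contribute: site, month,
# ground truth (1 = finding present), and the algorithm's binary prediction.
records = pd.DataFrame({
    "site":  ["A", "A", "A", "A", "B", "B", "B", "B"],
    "month": ["2021-05", "2021-05", "2021-06", "2021-06",
              "2021-05", "2021-05", "2021-06", "2021-06"],
    "truth": [1, 0, 1, 0, 1, 1, 0, 1],
    "pred":  [1, 0, 0, 0, 1, 1, 0, 1],
})

# Per-site, per-month agreement with ground truth as a simple stand-in metric.
records["correct"] = (records["truth"] == records["pred"]).astype(int)
per_site_trend = records.groupby(["site", "month"])["correct"].mean()
print(per_site_trend)

# Pooling across sites gives developers the aggregate, multi-site signal described above.
pooled_trend = records.groupby("month")["correct"].mean()
print(pooled_trend)
```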

Potential pathways for reimbursement for AI
When asked what the ACR DSI can do to improve the potential of AI in medical imaging, over half of respondents indicated that they would like to see pathways to fair reimbursement for implementing AI. Reimbursement pathways for clinical AI will be important for advancing AI into routine clinical use.

The ACR DSI is working with the ACR Commission on Economics to evaluate the best approaches to AI reimbursement. Currently the Medicare program has two potential pathways for AI reimbursement:
• The New Technology Add-On Payment (NTAP) can reimburse hospitals using certain AI models on a case-by-case basis.
• The Medicare Coverage of Innovative Technology (MCIT) could provide payment for AI models as soon as they are cleared by the FDA.

Both of these programs could jumpstart discussion of AI reimbursement through the traditional fee-for-service Medicare programs. Additionally, alternative payment models that include AI could eventually provide another avenue for radiologists to demonstrate value to the healthcare system.

In conclusion
While the survey results indicate modest penetration of AI in clinical practice at present, more than 25% of radiologists are looking to purchase AI in the future, and that number is likely to grow. Based on the survey results, the ACR is continuing its efforts to provide AI resources that members can use to demonstrate their ongoing value to our patients and health systems.

 

Bibb Allen, Jr., MD, FACR | ACR DSI Chief Medical Officer | Diagnostic Radiologist, Grandview Medical Center, Birmingham, AL

