AI Use Cases 101: What Radiologists Need to Know

While computer science experts understand how to train computers, it falls to radiologists to help AI developers understand which problems need to be solved to improve patient care.

Working to solve problems that don't exist is a waste of everyone's time. Preventing that fruitless pursuit is a key goal of the ACR Data Science Institute (DSI) Data Science Subspecialty Panels.

Use cases are the mechanism by which DSI communicates to the AI developer community the tasks AI could perform that would be useful to radiologists and could improve patient care. More specifically, a use case is a narrative description and flow chart defining the goal of an AI algorithm — including required clinical inputs — and describing how the algorithm would best integrate into radiologists’ workflows alongside other radiology tools.

Why Develop Use Cases?

Why should radiologists care about AI use cases? Because, while computer science experts understand how to train computers to process images, they often do not understand which parts of a radiologic study are most relevant to patient care. As radiologists, we take this knowledge for granted. For someone with no medical background, however, it is next to impossible to determine what is clinically relevant without some guidance.

Clinical relevancy may involve identification of specific imaging findings, accurate quantification of imaging biomarkers, classification of pathological conditions, or a change in appearance from an earlier study to a later study. Even when algorithms are capable of identifying a clinically relevant finding or pathological condition, it is important to ask: Is this a problem worth solving?

While a computer could be trained to count the fingers on a hand radiograph, that ability would offer no benefit to the radiologist reading the study. A better alternative would be a narrow AI system that quietly evaluates every visible scaphoid bone on a hand or wrist radiograph, then alerts the radiologist when it detects a fracture.
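As a rough illustration of how such a "silent second reader" might plug into the reporting workflow, the sketch below scores each visible bone and interrupts the radiologist only when a score crosses an alert threshold. All names and the threshold value here are hypothetical assumptions for illustration, not part of any DSI use case.

```python
# Hypothetical sketch of a narrow-AI triage step: the model has already
# produced a fracture probability per visible bone, and the workflow layer
# decides whether to stay silent or raise an alert. The function name and
# threshold are illustrative assumptions only.

ALERT_THRESHOLD = 0.85  # assumed cutoff, tuned to favor sensitivity

def triage_study(bone_scores: dict[str, float]) -> list[str]:
    """Return alert messages for bones whose fracture score meets or
    exceeds the threshold; an empty list means the AI stays quiet."""
    return [
        f"Possible {bone} fracture (score {score:.2f}) - please review"
        for bone, score in bone_scores.items()
        if score >= ALERT_THRESHOLD
    ]

# Example: a wrist radiograph where only the scaphoid score is suspicious.
alerts = triage_study({"scaphoid": 0.91, "lunate": 0.12})
print(alerts)  # one alert for the scaphoid; the lunate stays below threshold
```

The design point this toy example captures is the one the use case would spell out: the algorithm runs on every study but surfaces itself only when there is something worth the radiologist's attention.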

An algorithm that supports detection and classification of certain fractures would also be a welcome addition to most radiologists' armamentarium, provided it integrates with existing systems and doesn't hinder the workflow.

To be accepted by the market and provide value-added patient care, an algorithm will need to be clinically useful and readily integrated into radiologists’ standard workflow. And that’s where the ACR DSI Data Science Subspecialty Panels come in.

How Are DSI Use Cases Novel?

ACR formed the DSI in 2017 with the goal of creating a framework for implementation of machine learning in the radiological professions. From the start, DSI set out to define clinically relevant use cases for the development of AI algorithms in medical imaging, interventional radiology, and radiation oncology. Data Science Subspecialty Panels were formed early on to begin considering a broad range of possible use cases for AI and make recommendations on their potential impact.

ACR DSI use cases are detailed specifications that allow developers to create AI algorithms to assist radiology professionals in disease detection, characterization, and treatment. They are built around a framework that brings radiology professionals, the developer community, and other stakeholders together to develop, validate, deploy, and monitor AI algorithms in medical imaging and the radiological sciences.

The panels are composed of radiologists with diverse backgrounds and include both academic and private practice radiologists. Panel members collaborate to identify relevant AI use cases and prioritize them for use by developers to build relevant AI algorithms. Some panels, including the musculoskeletal panel, include a radiologist who spends a significant portion of time in industry. This panel member understands the perspective of a radiology software/services vendor as to which projects are feasible and which are unrealistic at this point in AI development.

A year ago, when I heard that the ACR was entering this space, I was eager to get involved. I contacted the College and asked to be considered for a panel and was subsequently chosen for a musculoskeletal panel.

How Do the Use Case Panels Work?

The dozens of volunteer experts recruited by DSI last year to begin developing use cases were tasked with creating a usable framework for AI algorithm developers, then building on that framework to create clinically relevant use cases. Panel volunteers included physicians, medical physicists, data scientists, and software engineers, among others.

Once the panels were assembled, each radiologist was asked to submit two or three problems from their practices that might be amenable to an AI solution. Panelists then made “elevator pitches” for their concepts during panel meetings and other panel members provided comments.

At this early stage, ideas for potential use cases were discarded only if the panel felt they fell outside the panel's scope (for example, musculoskeletal radiology) or did not require specific input from radiologists. The vast majority of proposed use cases moved forward to the next round of review. Details of each concept were then entered into templates provided by DSI.

During panel conference calls over the next few months, individuals presented their draft concepts and received feedback on various aspects of each proposed use case.

The musculoskeletal panel considered many factors in drafting our first set of use cases. For example: Should children be excluded from the training sets? Should developers be told to collect the patient's weight, when available, since it would likely affect the appearance of weight-bearing images?

After our calls, DSI staff incorporated changes to the drafts and sent them on to the presenting panel member for editing. Our panel also maintained a dynamic list of "elevator pitches" from which panel members can draw future draft use cases. Working together remotely in this way, our panel was able to complete over a dozen use cases. These are included among the 50 use cases DSI released to AI algorithm developers in October.

What’s on the Horizon for Use Cases?

To achieve success, our panels will rely on a broad array of stakeholders — including individual ACR members, academic departments, other radiology societies, and the developer community itself — to submit fresh ideas for use cases. This will help keep the AI engine running and create a best-in-class directory of hundreds of AI use cases.

While AI garners much hype in the press — and we are all familiar with forecasts of the near-term automation of radiology — predictions about AI are usually made by people with very little understanding of the complexity of medical imaging interpretation. I share the belief of the DSI that AI will instead be a great tool for radiologists, so long as radiologists and DSI effectively communicate our specific needs to AI vendors.

Flexibility will also be key. It is likely that the process and work products of panels will need to evolve as the use cases become available to members of industry. It’s quite possible that the AI frameworks we are using now will seem basic in five years. That evolution will happen once use cases grow to include methods for labeling images, a labeled image marketplace, and a pathway for the ACR to work with the developer community to obtain regulatory clearance or reimbursement from payers.

History can be a great teacher when facing change. The first autopilot systems for airplanes were created in 1912 by the Sperry Gyroscope Company. If pilots of that time had collectively disengaged out of fear of their profession being replaced, I am sure the airline industry would look remarkably different today. Instead, over the next 100 years, advancements in automating menial tasks and continually developing safety protocols have led to an industry with an impeccable safety record — one to which other industries aspire.

As radiologists, we can hope for a similar result and work to make it happen by embracing AI and playing an active role in its development. If you would like to get involved, consider commenting on one of the use cases released in October by DSI. Or suggest a new problem that may be amenable to an AI solution. Volunteering for a use case panel in your specialty is another great option. We welcome your support in creating more use cases for DSI over the next few years. Now is the time to get involved in developing AI applications that will help radiology professionals provide improved medical care.


Jay W. Patti, MD, DSI Data Science Subspecialty Panel Chair, Chief Radiology Informatics Officer, Mecklenburg Radiology Associates, Charlotte, NC