The Buzz AI Is Beating Doctors Is Trendy But False

In January, the journal Nature published an article about an AI model that can match or outperform radiologists at detecting breast cancer. The article has been widely covered in the press, adding to the many studies demonstrating the potential benefits of AI in medical imaging. Developed by Google’s health research unit, the model was trained on thousands of images from both the U.K. and U.S. In both cohorts, AI found cancers initially missed by radiologists, and it reduced false positives for patients with no cancer. At the same time, there were some cancers found by radiologists that were not identified by AI. Of note, the study did not include all available imaging modalities, such as digital breast tomosynthesis.

The authors acknowledge that real-world testing will be needed to ensure the model can be generalized to routine practice. In an editorial about the paper, Etta Pisano, chief research officer at the ACR, wrote, "The real world is more complicated and potentially more diverse than the type of controlled research environment reported in this study."

Impressive studies like this one seem to add one more brick to the house some are building around the idea that AI is outperforming doctors — and that radiologists will be among the first to be replaced. For example, in JACR this August, Maciej Mazurowski argued that the need for radiologists could be greatly diminished in the not-too-distant future. Rebutting several arguments that support the conclusion that AI will not fundamentally change the radiologist’s role, Mazurowski explains why he believes a significant disruption of the radiology workforce is a real possibility.

So what should we believe? In an article published in JACR earlier this month, we took a closer look at some of the claims we hear repeatedly suggesting that AI disruption of the field of radiology is imminent, and we made some more realistic predictions about the future as AI in healthcare improves.

Claim #1: Algorithms are highly accurate, making autonomous radiological care by machines mere moments away

These claims are most frequently made by those less familiar with imaging care than radiologists and other physicians. Mazurowski is not unique in arguing that AI will be superior to radiologists in imaging interpretation. His article further suggests that radiologists’ claims that AI cannot replace them amount to human-centered reasoning, attributable to our fundamental appreciation of “the brilliance of the human perceptual and deductive apparatus.”

We may in fact overestimate our own brilliance, but that’s not what is behind radiologists asserting AI shouldn’t provide autonomous care. Most radiologists are enthusiastic about algorithms that are able to detect subtle findings radiologists might miss. These narrow tasks are ideal uses of AI and can potentially benefit patient care. And we are not expecting perfection. Radiologists themselves do not perform identically on every task, and we would not expect it of AI. The combination of AI recognizing abnormalities not caught by radiologists and radiologists catching abnormalities not detected by AI is a powerful one with great potential for patient care.

The fundamental flaw in asserting that AI will replace radiologists is the belief that AI algorithms can be rapidly developed for the thousands of tasks radiologists perform. While AI has been shown to augment the care radiologists provide, AI does not yet have the capacity to do what radiologists accomplish every day. We read about the things AI can do; we don’t read about the things it cannot do — because that’s not news. How long will it take for developers to create all of the algorithms necessary for tasks like fracture classification and the detection of every post-traumatic abnormality in every bone and joint? No one knows, but don’t look for it any time soon.

We must also recognize that there are no established economic models for implementing AI in health care — a topic that has been covered in the media and is a subject of earlier DSI blogs.

Add the fact that relatively few AI models in healthcare are available for clinical use today, the question of who will pay for AI, and the unlikelihood that humans will trust their medical care to a machine any time soon, and the claim that radiologists will soon be replaced begins to show its flaws. Yes, some narrow algorithms are highly accurate, but that doesn’t mean they can or will substitute for a radiologist.

Claim #2: Getting AI results will be like ordering lab tests

Look closely at the examples offered by Mazurowski and others who explain how AI can substitute for radiologists, and you’ll see that radiological interpretation is portrayed as a binary activity with a yes/no output. That’s true of lab work, but image interpretation and characterization of radiological findings are rarely binary.

Let’s look at pneumonia as an example. The question for a radiologist interpreting an image is not simply, “Is there pneumonia or not?” For an algorithm to be useful, it would need to go beyond “pneumonia or no pneumonia” and tell us whether there was expansion or atelectasis, whether there was a pneumothorax, whether there was a pleural effusion, or whether there was an empyema, lung abscess, or neither. Nothing we have seen to date points to an algorithm with the ability to look at all of these factors and reach a conclusion.
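To make that contrast concrete, here is a minimal, purely illustrative sketch in Python. None of the names or fields below comes from any real product or from the studies discussed above; they are invented simply to show the gap between a yes/no output and the structured, multi-finding answer an interpretation requires.

```python
from dataclasses import dataclass, field
from typing import Optional

# The "lab test" style of output critics have in mind: a single yes/no answer.
def binary_pneumonia_classifier(image) -> bool:
    """Toy placeholder: True if the model suspects pneumonia."""
    ...

# The kind of structured, multi-finding answer an interpretation actually requires.
@dataclass
class ChestFindings:
    pneumonia: bool
    expansion: bool                              # expansion vs. volume loss
    atelectasis: bool
    pneumothorax: bool
    pleural_effusion: bool
    empyema_or_abscess: Optional[str] = None     # "empyema", "lung abscess", or None
    pertinent_negatives: list[str] = field(default_factory=list)

def interpret_chest_radiograph(image) -> ChestFindings:
    """Toy placeholder: a clinically useful algorithm would have to address
    every one of these questions (and many more), not just one of them."""
    ...
```

In effect, each additional field is another algorithm that must be developed, validated, and regulated before the combined output approaches what a radiologist’s interpretation already provides.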

Just think of all the subtle conditions where ruling out disease is what leads us to the ultimate conclusion. A normal imaging study is the sum of negative inferences for all diagnostic possibilities, not just a “no” for one specific question or use case, such as fracture detection. Try to imagine the documentation and potential outcomes that would be required to provide an interpretation in complex cancer follow-up. That’s not to say it will ultimately prove impossible, but the path to achieving that goal is difficult and will require tremendous effort and assessment before moving into clinical workflow — making a short-term decrease in demand for radiologists improbable. When you see examples of AI in an article and are led to believe they could substitute for a physician or radiologist, take a hard look at what task the AI is performing and ask yourself whether interpretation is required.

Claim #3: The FDA has already begun clearing algorithms, so AI that provides unsupervised automated analysis will soon be brought to market

To date, none of the 40 or so AI algorithms the FDA has cleared for radiology provides any level of autonomous care. The most common form these algorithms take is triaging work lists based on potential critical findings detected by the AI.
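For readers who have not seen one of these tools in operation, the sketch below shows the general idea in Python; the field names, scores, and ordering logic are invented for illustration and are not drawn from any cleared product. The point is that the algorithm renders no diagnosis and delivers nothing to the patient; it simply reorders the reading queue so a radiologist sees the flagged studies first.

```python
from dataclasses import dataclass

@dataclass
class Study:
    accession: str
    ai_flag: bool         # hypothetical: AI suspects a critical finding
    ai_score: float       # hypothetical: model confidence, 0.0 to 1.0
    minutes_waiting: int

def triage_worklist(worklist: list[Study]) -> list[Study]:
    """Reorder the reading queue: AI-flagged studies first (highest score first),
    then everything else by how long it has been waiting. Every study is still
    interpreted by a radiologist; the algorithm only changes the reading order."""
    flagged = sorted((s for s in worklist if s.ai_flag),
                     key=lambda s: s.ai_score, reverse=True)
    routine = sorted((s for s in worklist if not s.ai_flag),
                     key=lambda s: s.minutes_waiting, reverse=True)
    return flagged + routine
```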

Mazurowski’s example of an algorithm that assists primary care physicians in the detection of diabetic retinopathy implies a trend within the FDA toward approving algorithms for autonomous care. Yet when that algorithm gives a positive result, the primary care physician does not initiate treatment but rather refers the patient to an ophthalmologist for further care.

Unless there are significant changes in the regulatory process, autonomous care by software as a medical device would almost certainly require Class III classification with full FDA approval. This would require multicenter trials, the cost of which would be astronomical when amplified by the thousands of algorithms that would require approval. Not only would this be cost-prohibitive for small AI developers, but it would also put tremendous pressure on FDA resources.

If instead the FDA continues in its current general direction of classifying software as a medical device (including AI algorithms) under Class II rather than Class III, it seems the agency views physicians as remaining between AI inferences and patients. Thus, the path to moving AI into clinical workflow is much shorter, and keeping physicians in the process accelerates the deployment of AI tools into our practices. This could change, of course, but to suggest an imminent move toward autonomous AI care is overstated. Although AI developers might ultimately be willing to assume all liability for their software tools, requiring them to maintain additional liability coverage would further increase barriers and significantly decelerate AI development and implementation.

Where does that leave us?


While predicting future workforce needs is always challenging, dire predictions for the specialty shouldn’t go unchallenged. As a technologically savvy group, we should see an incredibly bright future for our specialty. Those who believe there will be dramatic workforce reductions seem to suggest it will happen all at once, and those claims can be misleading. Changes to the radiology profession from AI are inevitable, but they will be gradual.



Bibb Allen, Jr., MD, FACR | ACR DSI Chief Medical Officer | Diagnostic Radiologist, Grandview Medical Center, Birmingham, AL
