
What will it take to allow AI to replace a physician's diagnostic decision?

Artificial intelligence (AI) has the potential to replace human experts in diagnosing disease from various types of medical imaging. This has been demonstrated in several imaging domains, including radiology (e.g., X-ray, MRI, and CT scans), photographs of skin lesions, retinal imaging, and more. One study from the University of Birmingham suggests that AI can diagnose disease with accuracy similar to that of healthcare professionals.


In radiology, AI is mainly used as decision support for human experts, automatically identifying and classifying areas of interest such as tumors or blood vessels. This can improve the speed and accuracy of diagnoses and help radiologists identify subtle features that might be missed by the human eye. AI models can also be used to assess the urgency of cases and assist radiologists in prioritizing them based on the likelihood of disease or malignancy. Where AI still falls short, however, is in fully automated image interpretation that replaces the human expert.
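To make the triage idea concrete, here is a minimal sketch of how a reading worklist might be re-ordered by an AI risk score. The Study fields, the scores, and the prioritize_worklist helper are illustrative assumptions, not any particular vendor's API.

    # Hypothetical worklist triage: studies are sorted so the cases with the highest
    # AI-estimated risk of a critical finding are read first. All values are made up.
    from dataclasses import dataclass

    @dataclass
    class Study:
        study_id: str
        modality: str          # e.g., "CT", "MRI", "X-ray"
        ai_risk_score: float   # model-estimated probability of a critical finding, 0..1

    def prioritize_worklist(studies: list[Study]) -> list[Study]:
        """Return the worklist with the highest-risk studies first."""
        return sorted(studies, key=lambda s: s.ai_risk_score, reverse=True)

    worklist = [
        Study("CT-001", "CT", ai_risk_score=0.12),
        Study("CT-002", "CT", ai_risk_score=0.87),   # likely urgent finding: read first
        Study("XR-003", "X-ray", ai_risk_score=0.35),
    ]
    for study in prioritize_worklist(worklist):
        print(study.study_id, study.modality, study.ai_risk_score)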


AI also has the potential to be a powerful tool in diagnosing skin lesions, which can be challenging for human dermatologists due to their high variability and subtle differences in appearance. AI models can be trained to segment skin lesions from the surrounding tissue, which may be useful for measuring lesion size and tracking changes over time. AI algorithms can also be trained to classify skin lesions as either benign or malignant based on their visual characteristics, which can improve diagnostic accuracy, especially for difficult-to-identify lesions. Such models can also provide a second opinion in diagnostic decisions, helping to reduce diagnostic errors.
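As an illustration of the size-tracking use case, the sketch below measures the area of a binary segmentation mask at two visits and reports the change. The masks and the 0.5 mm pixel spacing are invented for demonstration.

    # A minimal sketch, assuming a binary segmentation mask per visit (1 = lesion pixel),
    # of how lesion size can be measured and tracked over time. Values are illustrative.
    import numpy as np

    def lesion_area_mm2(mask: np.ndarray, pixel_spacing_mm: float) -> float:
        """Area of a binary lesion mask in square millimetres."""
        return float(mask.sum()) * pixel_spacing_mm ** 2

    # Two follow-up visits: the segmented lesion grows from 40 to 55 pixels.
    visit_1 = np.zeros((64, 64), dtype=np.uint8); visit_1[10:15, 10:18] = 1  # 5 x 8  = 40 px
    visit_2 = np.zeros((64, 64), dtype=np.uint8); visit_2[10:15, 10:21] = 1  # 5 x 11 = 55 px

    a1 = lesion_area_mm2(visit_1, pixel_spacing_mm=0.5)
    a2 = lesion_area_mm2(visit_2, pixel_spacing_mm=0.5)
    print(f"visit 1: {a1:.2f} mm^2, visit 2: {a2:.2f} mm^2, change: {a2 - a1:+.2f} mm^2")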


But as with AI in radiology, AI in dermatology is still used in a decision-support capacity, and until its accuracy is validated through large-scale prospective clinical trials, it is not expected to replace the human expert.

Earlier this year, a company developing AI for diagnosing disease from retinal imaging published a study in the Ophthalmology Times which found that its artificial intelligence system could detect diabetic retinopathy with greater sensitivity than traditional dilated eye exams. The study involved over 1,000 patients with diabetes and found that the AI system had a sensitivity of 89.5% and a specificity of 91.4% for detecting diabetic retinopathy, compared to a sensitivity of 86.6% and a specificity of 85.6% for traditional dilated eye exams performed by human experts.
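For readers less familiar with these metrics: sensitivity is the fraction of patients with the disease that the test correctly flags, and specificity is the fraction of disease-free patients it correctly clears. The sketch below computes both from a confusion matrix; the patient counts are invented for illustration and are not taken from the study.

    # Illustrative sensitivity/specificity calculation. The counts below describe a
    # hypothetical screening of 1,000 patients and are NOT taken from the cited study.
    def sensitivity(tp: int, fn: int) -> float:
        """Of the patients who truly have the disease, what fraction was flagged?"""
        return tp / (tp + fn)

    def specificity(tn: int, fp: int) -> float:
        """Of the patients who are truly disease-free, what fraction was cleared?"""
        return tn / (tn + fp)

    tp, fn = 179, 21    # 200 patients with diabetic retinopathy: detected vs. missed
    tn, fp = 731, 69    # 800 patients without it: correctly cleared vs. falsely flagged

    print(f"sensitivity: {sensitivity(tp, fn):.1%}")   # 89.5%
    print(f"specificity: {specificity(tn, fp):.1%}")   # 91.4%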

While this study was performed by an AI company with an obvious position and stake in the matter, it does suggest that AI can do a better job than human experts at some tasks.

So why is AI still on the sidelines? Why are we not seeing AI replace doctors in diagnosing disease from medical imaging?


Firstly, there needs to be a very clear distinction between AI that helps a human expert (i.e., decision support) and AI that makes diagnostic decisions in place of the human expert. For AI that functions as a decision-support tool, there have already been several FDA approvals, many of them in the field of radiology, and while the technology is truly impressive, the ultimate diagnostic decision remains with the human expert.


For AI that functions as a diagnostic tool, i.e., one that replaces the human expert, as of today only AI-based solutions for diagnosing diabetic retinopathy have been cleared by the FDA. These clearances were not easily obtained; each was based on large-scale prospective studies that demonstrated the AI's safety and efficacy.


There are several likely reasons why regulators still seem hesitant to let AI take over diagnostic decisions in other fields:


  1. Psychology: Patients would rather have a human make a mistake than software. Similar considerations are likely holding up the wider deployment of autonomous vehicles. People can accept a human error much more readily than a machine error, even if, on average, the machine makes fewer errors.

  2. Measurement Standards: The lack of established, objective standards for determining ground-truth diagnoses in certain fields makes it difficult to design studies that prove the accuracy of AI as a diagnostic tool. Without such studies, efficacy and safety are hard to prove, and regulatory authorities are understandably reluctant to allow unproven AI solutions to be used on patients.

  3. Risk: There are many healthcare situations in which a wrong decision could cause harm, or even death. For example, in the emergency room, a patient whose intracranial bleeding is missed (by either a human doctor or AI) could die within minutes. But there are other situations where an occasional wrong decision will have only a minor effect. For example, an AI-based screening solution for a slowly progressing disease can occasionally miss a diagnosis, i.e., issue a false negative, but because the disease progresses slowly, it will likely be caught at the next screening (a rough illustration follows this list).

  4. Resistance: When trying to replace experts who have invested many years in developing their expertise, as is the case with radiologists, for instance, there is often a great deal of natural resistance to the concept of an AI replacing the expert.
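To put a rough number on the risk argument in item 3: if each screening misses the disease at some false-negative rate, and misses are assumed independent across visits (a simplifying assumption made purely for illustration), the chance that a slowly progressing disease stays undetected shrinks quickly with repeated screening. The 10.5% miss rate below simply mirrors the 89.5% sensitivity quoted earlier.

    # Back-of-the-envelope sketch for the screening-risk argument. Assumes misses
    # are independent across visits, which is a simplification for illustration.
    def chance_still_missed(false_negative_rate: float, screenings: int) -> float:
        return false_negative_rate ** screenings

    fn_rate = 0.105  # per-visit miss rate corresponding to 89.5% sensitivity
    for n in (1, 2, 3):
        p = chance_still_missed(fn_rate, n)
        print(f"after {n} screening(s): {p:.2%} chance the disease is still undetected")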


Regulators seem to understand where the technology is headed and are cautiously exploring where AI can be used as a diagnostic tool that replaces a human expert. In recent years, the FDA has cleared a handful of AI solutions to autonomously screen patients with diabetes for diabetic retinopathy in place of a human expert. This is likely because (i) there are established standards for determining the ground-truth diagnosis of diabetic retinopathy, which provides an effective way to measure the accuracy of the AI, (ii) the risks associated with the diagnostic decision are relatively small, because disease progression is usually slow, and (iii) there is relatively little resistance from eye care specialists to allowing AI to screen patients for early signs of the disease, because these specialists are in short supply.

