Artificial Intelligence (AI) has emerged as a transformative force in healthcare, reshaping diagnostics and revolutionizing patient care. Its applications span from refining alerts to assisting in intricate areas like image processing and disease diagnostics. In an effort to standardize terminology related to AI in healthcare, the American Medical Association (AMA) defined three categories for AI application: assistive, augmentative, and autonomous.
Assistive vs Augmentative vs Autonomous
Assistive AI focuses on the machine's capability to alert or highlight clinically relevant data without generating conclusions; conclusions are drawn by the clinician. This approach does not diminish the physician's role; instead, the machine serves as a vital aid, spotlighting pertinent data for the healthcare professional's attention. The physician's responsibilities encompass analyzing the identified data, determining its clinical significance, and reporting their interpretation of the findings. In this category, the AI acts as a decision support tool, emphasizing the collaborative synergy between advanced technology and clinicians.
An example of assistive AI is Viz LVO, developed by VIZ.ai, an AI-powered notification tool for hospital networks and clinicians that analyzes CT angiogram images of the brain. Viz identifies potential large vessel occlusions (LVO) and alerts neurovascular specialists for parallel review alongside standard care. Notifications, including compressed images, are sent through a mobile app for informational purposes only. The primary role of Viz LVO is to assist clinicians by highlighting cases that may require their attention, facilitating early detection and intervention for suspected LVO. The final diagnostic decisions rest with the clinicians.
Augmentative AI provides clinically meaningful analyses and quantification of data without immediate interpretation. The machine's role is to augment the data and present analyses that may have clinical significance, which clinicians can use to better interpret the data. As in the assistive category, the clinician is the one making the diagnostic decision.
Aidoc's BriefCase serves as an example, helping doctors quickly identify potential cases of three or more acute rib fractures in chest CT scans. The software uses an artificial intelligence algorithm to analyze images and provides notifications for prioritization. While BriefCase aids in managing cases efficiently, it complements, rather than replaces, examination by medical professionals. Its computer-aided image analysis supports triage by notifying clinicians in parallel with standard care, leaving the final interpretation to healthcare professionals.
Another example of augmentative AI is Sight Diagnostics' Sight OLO, an automated hematology analyzer that uses computer imaging and vision algorithms to quantify Complete Blood Count parameters. It produces clinically relevant outputs, contributing to the diagnostic process by offering specific measurements of blood parameters. In line with augmentative AI characteristics, healthcare professionals use these outputs to make more informed decisions.
In autonomous AI, the machine automatically interprets data and independently generates clinically meaningful conclusions without the involvement of the clinician. Clinically meaningful conclusions can range from a determination of the likelihood of pathophysiology to a diagnostic decision. There are three levels of autonomous AI medical services and procedures:
Level I: The autonomous AI draws conclusions and offers diagnosis and/or management options, which are contestable and require clinician action to implement.
Level II: The autonomous AI draws conclusions and initiates diagnosis and/or management options with an alert and opportunity for override, which may require the clinician to implement.
Level III: The autonomous AI draws conclusions and initiates management, requiring clinician initiative to contest.
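The key distinction among the three levels is who must act for the AI's conclusion to take effect. As a rough illustration only, the levels above could be encoded as a small data structure; the enum and helper names here are hypothetical, not part of the AMA taxonomy itself:

```python
from enum import Enum

class AutonomyLevel(Enum):
    """Hypothetical encoding of the three AMA autonomous-AI service levels."""
    LEVEL_I = 1    # AI offers contestable options; clinician action needed to implement
    LEVEL_II = 2   # AI initiates, with an alert/opportunity for clinician override
    LEVEL_III = 3  # AI initiates; clinician must take the initiative to contest

def clinician_action_required_to_implement(level: AutonomyLevel) -> bool:
    # Only at Level I does implementation wait on affirmative clinician action.
    return level is AutonomyLevel.LEVEL_I

def ai_acts_by_default(level: AutonomyLevel) -> bool:
    # At Levels II and III the AI's decision proceeds unless the
    # clinician intervenes (override at II, active contest at III).
    return level in (AutonomyLevel.LEVEL_II, AutonomyLevel.LEVEL_III)
```

This framing makes the progression explicit: as the level rises, the default shifts from the clinician acting on the AI's suggestion to the AI acting unless the clinician objects.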
AEYE Health's AI technology serves as an example of Level III autonomous AI. The technology autonomously diagnoses diabetic retinopathy from retinal imaging, replacing specialist screening and generating instant diagnostic reports. The AI offers accurate and consistent results, allowing primary care providers to screen for diabetic retinopathy even though they are not trained retina specialists; instead, they rely on the AI to perform the diagnosis. AI-based screening for diabetic retinopathy is one of the first and few FDA-cleared AI technologies to replace human readers, marking a significant milestone in the evolution of diagnostic capabilities.
Summary and Future Outlook
The integration of Artificial Intelligence (AI) into healthcare has ushered in transformative changes, reshaping diagnostics and advancing patient care. Defining the role of AI is crucial, and categorizing it as assistive, augmentative, or autonomous provides guidance on optimal utilization. While assistive and augmentative AI have been prevalent, the emergence of autonomous AI marks a new frontier, with the potential to alleviate healthcare burdens and enhance patient outcomes. Looking ahead, the future of AI diagnostics is promising: continued evolution and acceptance of these technologies offer the prospect of substantial improvements in efficiency, accuracy, and overall patient care. The collaborative synergy between advanced technology and healthcare professionals remains fundamental, ensuring that AI serves as a valuable tool in the hands of skilled medical experts, and ongoing advancements promise further improvements in medical practice and patient outcomes.