Affiliations 

  • 1 School of Electrical & Electronic Engineering, Engineering Campus, Universiti Sains Malaysia (USM), Malaysia
  • 2 School of Electrical & Electronic Engineering, Engineering Campus, Universiti Sains Malaysia (USM), Malaysia. Electronic address: [email protected]
  • 3 Department of Radiology, School of Medical Sciences, Health Campus, Universiti Sains Malaysia (USM), Kelantan, Malaysia
Int J Med Inform, 2024 Nov 04;193:105689.
PMID: 39522406 DOI: 10.1016/j.ijmedinf.2024.105689

Abstract

OBJECTIVE: Explainable Artificial Intelligence (XAI) is increasingly recognized as a crucial tool in cancer care, with significant potential to enhance diagnosis, prognosis, and treatment planning. However, the holistic integration of XAI across all stages of cancer care remains underexplored. This review addresses this gap by systematically evaluating the role of XAI in these critical areas, identifying key challenges and emerging trends.

MATERIALS AND METHODS: Following the PRISMA guidelines, a comprehensive literature search was conducted across Scopus and Web of Science, focusing on publications from January 2020 to May 2024. After rigorous screening and quality assessment, 69 studies were selected for in-depth analysis.

RESULTS: The review identified critical gaps in the application of XAI within cancer care, notably the exclusion of clinicians from 83% of studies, which raises concerns about real-world applicability and may yield explanations that are technically sound but clinically irrelevant. Additionally, 87% of studies lacked rigorous evaluation of XAI explanations, compromising their reliability in clinical practice. The dominance of post-hoc visual methods such as SHAP, LIME, and Grad-CAM reflects a trend toward explanations that can be fragile, since they depend on specific input perturbations and simplifying assumptions. The lack of formal evaluation metrics and standardization constrains broader XAI adoption in clinical settings, creating a disconnect between AI development and clinical integration. Moreover, translating XAI insights into actionable clinical decisions remains challenging in the absence of clear guidelines for integrating these tools into clinical workflows.
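To make the perturbation-based reasoning behind methods such as LIME concrete, the following minimal sketch (not from the reviewed studies; all names and the toy model are illustrative) approximates local feature attributions by perturbing an input and fitting a weighted linear surrogate to a black-box predictor. The perturbation scale and proximity kernel are exactly the kind of simplifying assumptions the review flags: changing them changes the explanation.

```python
import numpy as np

def lime_style_attribution(predict_fn, x, n_samples=1000, scale=0.5, seed=0):
    """LIME-style sketch: fit a weighted local linear surrogate around x.

    The attributions depend on the perturbation scale and kernel width --
    the simplifying assumptions that can make such explanations fragile.
    """
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    # Perturb the input with Gaussian noise around x
    Z = x + rng.normal(0.0, scale, size=(n_samples, d))
    y = predict_fn(Z)
    # Weight perturbed samples by proximity to x (exponential kernel)
    dist = np.linalg.norm(Z - x, axis=1)
    w = np.exp(-(dist ** 2) / (2 * scale ** 2))
    # Weighted least squares: solve (A^T W A) beta = A^T W y
    A = np.hstack([np.ones((n_samples, 1)), Z])  # intercept + features
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[1:]  # per-feature local attributions

# Toy "risk model" dominated by feature 0 (hypothetical, for illustration)
predict = lambda Z: 3.0 * Z[:, 0] + 0.5 * Z[:, 1]
attr = lime_style_attribution(predict, np.array([1.0, 2.0]))
```

For this linear toy model the surrogate recovers the true coefficients, but for a nonlinear clinical model the recovered attributions vary with `scale` and the kernel, which is one concrete reason the review calls for formal evaluation of explanation quality rather than visual inspection alone.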

CONCLUSION: This review highlights the need for greater clinician involvement, standardized XAI evaluation metrics, clinician-centric interfaces, context-aware XAI systems, and frameworks for integrating XAI into clinical workflows for informed clinical decision-making and improved outcomes in cancer care.

* Title and MeSH Headings from MEDLINE®/PubMed®, a database of the U.S. National Library of Medicine.