International Journal on Science and Technology

E-ISSN: 2229-7677     Impact Factor: 9.88

A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal


Explainable AI for Clinical Decision Support: A Study on Interpretable Models for Disease Diagnosis

Author(s) Mr. Ronak Goyal
Country India
Abstract This study explores the impact of explainable artificial intelligence (XAI) components on enhancing Diagnostic Accuracy (DA) within clinical decision support systems (CDSS). Focusing on three key predictors, Model Interpretability (MI), Clinician Trust (CT), and Case Complexity (CC), the study utilizes primary data collected from 250 healthcare professionals in New York. A structured questionnaire measured all constructs on a 5-point Likert scale. Data analysis was conducted in RStudio, applying multiple regression to assess the relationships among the variables. Results indicate that MI, CT, and CC each have a significant positive effect on DA, with the model explaining approximately 65% of the variance. Visual diagnostics support the model's validity and confirm compliance with key statistical assumptions. This research highlights the importance of integrating interpretable AI in healthcare to improve clinical outcomes and practitioner confidence. The study offers practical implications for healthcare AI design and for future research on medical decision-making systems.
Field Computer Applications
Published In Volume 17, Issue 2, April-June 2026
Published On 2026-04-09
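The regression described in the abstract (Diagnostic Accuracy regressed on three Likert-scale predictors) can be sketched as follows. This is a minimal illustration on simulated data, not the study's dataset or code; the coefficient values, noise level, and sample generation are assumptions chosen only to mirror the stated design (n = 250, 5-point Likert items).

```python
import numpy as np

# Hypothetical sketch of the abstract's model DA ~ MI + CT + CM,
# fitted by ordinary least squares. All data below are simulated.
rng = np.random.default_rng(0)
n = 250  # matches the study's 250 respondents

# Simulated 5-point Likert responses for the three predictors
MI = rng.integers(1, 6, n).astype(float)  # Model Interpretability
CT = rng.integers(1, 6, n).astype(float)  # Clinician Trust
CM = rng.integers(1, 6, n).astype(float)  # Case Complexity

# Assumed "true" coefficients, used only to generate the outcome
DA = 0.5 + 0.4 * MI + 0.3 * CT + 0.2 * CM + rng.normal(0, 0.5, n)

# OLS fit: design matrix with an intercept column
X = np.column_stack([np.ones(n), MI, CT, CM])
beta, *_ = np.linalg.lstsq(X, DA, rcond=None)

# R-squared: share of outcome variance explained by the model
resid = DA - X @ beta
r2 = 1.0 - resid.var() / DA.var()
print("coefficients:", beta)
print("R^2:", r2)
```

With simulated data of this size, the recovered coefficients land close to the assumed values and the R-squared falls in the same general range as the roughly 65% reported in the abstract.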
