Study explores how explainable artificial intelligence affects clinician trust in medical imaging

Onur Asan, associate professor at Stevens Institute of Technology

In recent years, artificial intelligence (AI) has become an important tool for analyzing medical images. Advances in computing and access to large datasets have enabled AI systems to help doctors interpret X-rays, MRIs, and CT scans more efficiently. These tools can assist clinicians in diagnosing and treating serious diseases such as cancer.

“AI systems can process thousands of images quickly and provide predictions much faster than human reviewers,” said Onur Asan, associate professor at Stevens Institute of Technology. “Unlike humans, AI does not get tired or lose focus over time.”

Despite these advantages, many clinicians remain cautious about using AI due to the so-called “black box” problem—uncertainty about how AI arrives at its conclusions. “When clinicians don’t know how AI generates its predictions, they are less likely to trust it,” Asan explained. He and his colleagues set out to examine whether providing additional explanations would improve clinicians’ trust in AI and affect diagnostic accuracy.

Asan collaborated with PhD student Olya Rezaeian and Assistant Professor Alparslan Emrah Bayrak from Lehigh University on a study involving 28 oncologists and radiologists who used an AI system to analyze breast cancer images. The participants received varying levels of explanation for the AI’s assessments before answering questions about their confidence in the results and task difficulty.

The research found that while AI improved diagnostic accuracy compared to a control group, offering more detailed explanations did not necessarily increase trust among clinicians. “We found that more explainability doesn’t equal more trust,” said Asan. Providing extra or complex information increased cognitive workload for users, sometimes slowing decision-making and reducing performance.

“Processing more information adds more cognitive workload to clinicians. It also makes them more likely to make mistakes and possibly harm the patient,” Asan stated. He emphasized that explanation features should not create unnecessary burdens for users.

Another finding was that some clinicians placed too much confidence in the AI system, which could lead them to overlook important details on medical images—a potential risk for patient safety. “If an AI system is not designed well and makes some errors while users have high confidence in it, some clinicians may develop a blind trust believing that whatever the AI is suggesting is true, and not scrutinize the results enough,” Asan noted.

The team published their findings in two studies: one appeared in Applied Ergonomics on November 1 under the title “The impact of AI explanations on clinicians’ trust and diagnostic accuracy in breast cancer,” while another was published August 7 in the International Journal of Human–Computer Interaction as “Explainability and AI Confidence in Clinical Decision Support Systems: Effects on Trust, Diagnostic Performance, and Cognitive Load in Breast Cancer Care.”

Asan believes that thoughtful design is essential when integrating explanations into clinical AI tools. “Our findings suggest that designers should exercise caution when building explanations into the AI systems,” he said. Proper training will also be necessary so that human oversight remains part of clinical practice: “Clinicians who use AI should receive training that emphasizes interpreting the AI outputs and not just trusting it.”

He concluded by highlighting two key factors influencing technology adoption among doctors: perceived usefulness and ease of use. “Research finds that there are two main parameters for a person to use any form of technology — perceived usefulness and perceived ease of use,” he said. “So if doctors think that this tool is useful for doing their job, and it’s easy to use, they are going to use it.”


