During the COVID-19 pandemic, several clinical decision support systems (CDSS) were developed to assist in patient triage. However, there has been little research on the interaction between such decision support systems and human experts.
For the study, conducted at the Department of Anesthesia, Intensive Care Medicine and Pain Medicine at MedUni Vienna, 32 physicians were recruited to assess the survival probability of 59 critically ill patients on the basis of their medical records. One of two artificial intelligence systems then communicated a calculated survival probability to the physicians; only one of these systems explained the reasoning behind its estimate. In a third step, the physicians reviewed the medical record again and gave their final assessment of the probability of survival.
In most cases, the physicians either rejected the AI recommendation or settled on a "compromise," but in 30 percent of cases they shifted their assessment in the direction of the AI's suggestion.
The researchers had also hypothesized in advance that an explainable system would have a greater influence on the physicians' second assessment, on the assumption that explanatory artificial intelligence (i.e. artificial intelligence that "explains" its decision) would be trusted more than a so-called "black box" (i.e. artificial intelligence that cannot "justify" its decision). However, the study showed that the physicians' decision-making behavior was not influenced by the additional explanation provided by the artificial intelligence. Nor were their decisions influenced by fatigue or level of training.
"In fact, artificial intelligence caused a significant influence on the physicians in the simulated Covid triage situation, independent of any explanation that may have existed," Oliver Kimberger explains. "More research is needed at the interface between physicians and artificial intelligence. This field is rapidly gaining importance in the healthcare sector due to rapid advances in machine learning. Newer technologies such as clinical reasoning systems could complement the decision-making process rather than simply presenting unexplained biases."
Oliver Kimberger's research group focuses on data science in anesthesia and intensive care medicine as well as the applicability of artificial intelligence and machine learning methods to the healthcare sector. The group frequently collaborates in networks with international partners as well as with the Ludwig Boltzmann Institute Digital Health and Patient Safety. Its members are medical specialists and residents in anesthesiology and intensive care medicine, together with doctoral students and students of medical informatics and human medicine.
Publication: Daniel Laxar, Magdalena Eitenberger, Mathias Maleczek, Alexandra Kaider, Fabian Peter Hammerle & Oliver Kimberger. "The influence of explainable vs non-explainable clinical decision support systems on rapid triage decisions: a mixed methods study." BMC Medicine, volume 21, article number 359 (2023).