A radiologist's gaze pattern during examination of a mammogram

Eye movement could cause errors in mammogram interpretation, study finds

June 25, 2018
by John R. Fischer, Senior Reporter
Interpretation is in the eye of the radiologist, and with it comes contextual bias, which can make all the difference in whether a scan is read accurately or not.

Researchers at the Department of Energy’s Oak Ridge National Laboratory in Tennessee hope to reduce errors stemming from this influence by developing an AI-powered solution based on their study of the roles that eye movement and cognitive processes play in mammogram interpretation.

“There's a dearth of research on the effect of the subjective nature of visual perception and cognitive judgment, both of which are essential in medical image interpretation. Contextual bias examines whether a radiologist's visual search pattern and diagnostic decision for a specific case may be influenced by the radiologist's judgments for prior cases,” Gina Tourassi, team lead and Director of ORNL's Health Data Sciences Institute, told HCB News. “Our study confirmed that such influence indeed exists, although its magnitude differs across radiologists. This allows us to detect any systemic patterns of visual behavior which correlate with diagnostic error.”

Because breast cancer is the second leading cause of cancer death among women, early detection is essential, placing pressure on clinicians to diagnose its presence accurately. Misinterpretations allow malignancies to grow and spread, creating greater risks to patient health, including death.

Researchers equipped three board-certified radiologists and seven residents with head-mounted eye-tracking devices and recorded raw gaze data as each participant read 400 images from 100 studies drawn from the University of South Florida’s Digital Database for Screening Mammography.

They also recorded each clinician's diagnostic decisions, including the location and characteristics of suspicious findings, using the BI-RADS lexicon as a basis.

Using a series of statistical calculations and the fractal dimension of each participant’s scan path to differentiate eye movement from one exam to the next, the researchers found that contextual bias from previous diagnostic experience played a significant role in interpretation, with trainees the most susceptible and even experienced readers displaying it to some degree.
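The article does not include the team's code, but the scan-path measure it mentions is standard enough to sketch. The following Python snippet, a minimal illustration and not the study's actual implementation, estimates the fractal dimension of a 2-D gaze path by box counting; the function name, grid sizes, and synthetic data are assumptions for demonstration.

    import numpy as np

    def box_counting_dimension(points, box_sizes):
        """Estimate the fractal (box-counting) dimension of a 2-D scan path.

        points: (N, 2) array of gaze coordinates, normalized to [0, 1).
        box_sizes: box edge lengths, e.g. 1/2, 1/4, ..., 1/256.
        """
        counts = []
        for eps in box_sizes:
            # Assign each gaze sample to a grid cell of edge length eps
            cells = np.floor(points / eps).astype(int)
            # Count the distinct cells the scan path visits at this scale
            counts.append(len({tuple(c) for c in cells}))
        # The slope of log(count) vs. log(1/eps) estimates the dimension
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
        return slope

    # Demo on a synthetic scan path of 5,000 gaze samples
    rng = np.random.default_rng(0)
    path = rng.random((5000, 2))
    print(box_counting_dimension(path, [2.0**-k for k in range(1, 9)]))

A dimension near 1 indicates a gaze path that sweeps along a line, while a value approaching 2 indicates a path that covers the image densely, which is what makes the measure useful for distinguishing one reader's search pattern from another's.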

The researchers then analyzed the findings with deep learning models trained on ORNL’s Titan supercomputer to handle large datasets; the models processed the full gaze-data sequences to reveal differences in participants' eye paths.
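The article does not specify the model architecture, so the following Python/PyTorch snippet is only a hedged sketch of the general approach described: a recurrent network that consumes an entire gaze recording and summarizes it into one prediction. The feature layout (x, y, fixation duration), layer sizes, and error label are illustrative assumptions.

    import torch
    import torch.nn as nn

    class GazeSequenceClassifier(nn.Module):
        """Toy model: reads a full gaze recording and emits the probability
        that the reading ends in a diagnostic error (assumed label)."""
        def __init__(self, n_features=3, hidden=64):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, gaze_seq):
            # gaze_seq: (batch, time_steps, n_features)
            _, (h_n, _) = self.lstm(gaze_seq)
            # The final hidden state summarizes the entire sequence
            return torch.sigmoid(self.head(h_n[-1]))

    model = GazeSequenceClassifier()
    batch = torch.randn(8, 500, 3)  # 8 recordings, 500 gaze samples each
    print(model(batch).shape)       # torch.Size([8, 1])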

Tourassi cautions that the AI tools are not meant to replace radiologists but to provide a second point of reference free of contextual bias, thereby reducing the errors associated with it. She adds that, though promising, the study's results require testing on larger datasets.

“Our initial findings were based on a limited imaging dataset. The next step is to expand our experiment to other clinical sites with a larger imaging dataset and more radiologist participation. This larger experiment will confirm the robustness of our initial study findings. Further, we will proceed with developing a dynamically adaptive decision support system which integrates imaging and radiologists' gaze data to reduce the risk of human error in medical image diagnosis.”

The imagery reflected a range of cases typically found in clinical settings, including positive and negative diagnoses as well as cases that mimicked the signs of cancer but were benign. Participants were given no prior knowledge of the findings for any case, and gaze deviation across the different image categories was also calculated.

Though tested on mammographic interpretation, the same experimental design and algorithms could be applied to diagnostic interpretation involving other imaging modalities and diseases.

The research was supported by ORNL’s Laboratory Directed Research and Development Program, and the work relied on the resources of the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility.

The findings were published in the Journal of Medical Imaging.