Study finds that AI tool can help rule out abnormal pathology on chest X-ray

by Lauren Dubinsky, Senior Reporter | August 22, 2024
When a research team in Denmark used a commercial AI tool off-label, they found that it could help rule out abnormal pathology on chest X-rays, with equal or lower rates of critical misses than radiologists. A study outlining these findings was published yesterday in Radiology.

Due to the global radiologist shortage, increasing demand for imaging studies, and the potential for burnout in the field, many are turning to AI to help ease the burden. Since radiology practices handle a large number of chest X-rays with no clinically significant findings, AI could be used to generate automated reports for them.

But Dr. Louis Lind Plesner, lead author from Herlev and Gentofte Hospital in Copenhagen, cautioned that AI has to be more sensitive than a radiologist when issuing an automated normal report so as not to compromise the standard of care.

"We believe this is because the radiologist incorporates clinical context and knowledge into the reports," he told HCB News. "This is where there is still a lot to be learned for AI."

He added that a radiologist will look with "intensified carefulness" for a lung tumor if the X-ray was ordered because of highly suspicious cancer symptoms, such as coughing up blood in a patient with a smoking history. AI does not exhibit this shift in focus, and therefore the radiologist can often be more pragmatic when calling images "normal."

The study included radiology reports and data from 1,961 patients who visited four hospitals in Denmark. Plesner and his team set out to assess the quality of mistakes made by both the AI and radiologists, and whether the AI mistakes were worse than the human mistakes.

The AI tool was adapted to generate a "remarkableness" probability for each chest X-ray. That probability was then used to calculate how often the tool correctly identified unremarkable X-rays, its specificity, at different sensitivity thresholds.
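
For readers curious about what "specificity at different sensitivities" means in practice, the minimal sketch below (not the study's code) shows one common way to compute it: sweep the decision threshold on a "remarkableness" probability, keep the highest threshold that still catches at least 98% of remarkable X-rays, and report how many unremarkable X-rays are then correctly ruled out. The function name, toy data, and 98% target here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def specificity_at_sensitivity(scores, labels, min_sensitivity=0.98):
    """Find the highest probability threshold that keeps sensitivity for
    'remarkable' X-rays at or above min_sensitivity, and return the
    resulting specificity (share of unremarkable X-rays correctly ruled out)."""
    scores = np.asarray(scores, dtype=float)   # AI "remarkableness" probabilities
    labels = np.asarray(labels, dtype=int)     # 1 = remarkable, 0 = unremarkable
    best = None
    for t in np.unique(scores):                # thresholds in ascending order
        pred_remarkable = scores >= t
        sens = (pred_remarkable & (labels == 1)).sum() / max((labels == 1).sum(), 1)
        if sens >= min_sensitivity:
            spec = (~pred_remarkable & (labels == 0)).sum() / max((labels == 0).sum(), 1)
            best = (t, sens, spec)             # last qualifying threshold is the highest
    return best

# Hypothetical toy data, not the study's: 1 = remarkable, 0 = unremarkable.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)
scores = np.clip(labels * 0.6 + rng.normal(0.3, 0.25, size=200), 0, 1)
print(specificity_at_sensitivity(scores, labels, 0.98))
```

Under this kind of analysis, a higher threshold rules out more unremarkable X-rays but risks missing remarkable ones, which is why the study fixed sensitivity at a high level before reading off specificity.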

Two thoracic radiologists, blinded to the AI results, were instructed to label the chest X-rays as "remarkable" or "unremarkable" based on a predefined list of unremarkable findings. A separate radiologist, who was not told whether the mistakes were made by the AI or a radiologist, graded the chest X-rays with missed findings as critical, clinically significant, or clinically insignificant.

The reference standard labeled 1,231 of the chest X-rays as remarkable and 730 as unremarkable. The AI tool accurately ruled out pathology in 24.5% to 52.7% of unremarkable chest X-rays at sensitivities of 98% or greater.