However, this was not the case. The researchers say: “On this occasion, the artificial intelligence candidate was unable to pass any of the 10 mock examinations when marked against similarly strict criteria to its human counterparts, but it could pass two of the mock examinations if special dispensation was made by the RCR to exclude images that it had not been trained on.”
These are observational findings, and the researchers acknowledge that they evaluated only one AI tool and that the mock exams were neither timed nor supervised, so the radiologists may not have felt the same pressure to perform as they would in a real examination.
Nevertheless, this study is one of the more comprehensive cross-comparisons between radiologists and artificial intelligence, providing a broad range of scores and results for analysis.
Further training and revision are strongly recommended, they add, particularly for cases that the artificial intelligence deemed “non-interpretable,” such as abdominal radiographs and those of the axial skeleton.
AI may facilitate workflows, but human input is still crucial, argue researchers in a linked editorial.
They acknowledge that using artificial intelligence “has untapped potential to further facilitate efficiency and diagnostic accuracy to meet an array of healthcare demands” but say doing so appropriately “implies educating physicians and the public better about the limitations of artificial intelligence and making these more transparent.”
Research in this area is buzzing, they add, and this study highlights that one foundational aspect of radiology practice, passing the FRCR examination required for a licence to practise, still benefits from the human touch.