ACR's Data Science Institute to develop structured AI framework for rad practices
November 26, 2017
by John R. Fischer, Senior Reporter
The American College of Radiology (ACR) Data Science Institute (DSI) is set to introduce a framework, strategy and focus for turning artificial intelligence from a concept into everyday practice in radiology.
ACR DSI will develop AI use cases for radiology using an open source framework that sets standards for training, testing, validating, integrating and monitoring AI algorithms in clinical practice. The result will be a standardized platform for building such use cases, aimed at optimizing radiology practices and improving patient care.
“A lot of developers are creating algorithms right now, and I think some of those will find utility in clinical practice and some may not,” Bibb Allen, chief medical officer of the ACR DSI, told HCB News. “I think what developers are missing is structured use cases for AI algorithms in regard to asking doctors ‘what do you need, what do you need in your practices?’”
Development will take place in stages, with the first step being construction of the open source standard framework, which will be available to medical organizations, institutions and developers for building radiology use cases around AI. Specific AI use cases will be based on standards necessary for meeting the needs of the specialty.
A standardized pathway for validating and certifying algorithms will be created to ensure their effectiveness and patient safety, and to assist in expediting FDA regulatory review processes. Plans also include the development of radiology workflow interoperability standards and pathways for incorporating AI algorithms into clinical workflow, as well as ongoing postmarket assessment of algorithm performance and effectiveness through an AI registry.
Allen says that the development of AI use cases will provide reference standards and resources that will assist more than just radiologists.
“ACR DSI use cases will be more than just the good idea for an AI algorithm, but also a framework for data elements for annotation, AI authoring, algorithm testing, training and validation, data elements for clinical integration into transcription software or elsewhere, and data elements for registry input to monitor algorithm performance in clinical practice,” he said. “We believe these tools will be useful to developers and FDA and government regulators, and we have received positive feedback from both groups.”
The organization hopes to have proof-of-concept use cases available in the first half of next year.