by John R. Fischer, Senior Reporter | October 20, 2020
They reason that researchers are more focused on publishing their findings than on spending the time and resources needed to ensure those findings can be replicated. They also say that the researchers who develop AI solutions dictate the terms of data sharing and write the informed-consent language given to patients, raising questions about how well informed patients really are about how their data will be shared.
Privacy attacks against the learned parameters of a deep learning model could not reveal more than what went into the model, Waldron argued, which in this case is a mammogram and whether radiologists identified that image as containing cancer cells. “It takes a lot of twisting of the imagination to come up with any scenario where that could affect any patient volunteer,” he said.

He and his colleagues add that third-party validation is essential to ensuring AI solutions are assessed in an unbiased manner. This can be done through containerization (bundling an application together with all of its related configuration files); platforms specifically for sharing AI systems; and cloud platforms with authentication, they say.
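To make the containerization idea concrete, here is a minimal sketch in Python of bundling an application with its configuration files and pinned dependencies into a shareable image. The file names, versions, and image tag are illustrative assumptions, not details from the critique, and the sketch assumes a local Docker installation.

```python
import subprocess
from pathlib import Path

# Assemble a build context holding everything a third-party validator
# needs to rebuild the exact environment. All names here are
# illustrative placeholders, not details from the published critique.
context = Path("ai_bundle")
context.mkdir(exist_ok=True)

# Pin exact dependency versions so validators recreate the same setup.
(context / "requirements.txt").write_text("numpy==1.19.2\n")

# Stand-in for the real inference script and model artifacts.
(context / "predict.py").write_text('print("model inference placeholder")\n')

# Bundle the application together with its configuration in one image.
(context / "Dockerfile").write_text(
    "FROM python:3.8-slim\n"
    "WORKDIR /app\n"
    "COPY requirements.txt /app/\n"
    "RUN pip install --no-cache-dir -r requirements.txt\n"
    "COPY predict.py /app/\n"
    'ENTRYPOINT ["python", "predict.py"]\n'
)

# Requires a running Docker daemon; the resulting image can be handed
# to validators without exposing the original training data.
subprocess.run(
    ["docker", "build", "-t", "ai-validation-bundle:0.1", str(context)],
    check=True,
)
```

Packaged this way, a validator runs the exact code and environment the developers used, which is what makes unbiased third-party assessment feasible.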
The critique was published as an opinion piece, also in Nature.