
UK and US researchers develop AI models for evaluating emotional burden of cancer

by John R. Fischer, Senior Reporter | January 03, 2019
Researchers from the Centre for Vision, Speech and Signal Processing (CVSSP) at the University of Surrey have collaborated with the University of California, San Francisco (UCSF) to develop two new AI models designed to predict cancer symptoms and their severity over the course of a patient's treatment.

In what is considered a first-of-its-kind study, the researchers found that both models can accurately determine the severity of depression, anxiety and sleep disturbance – all common symptoms associated with cancer – enabling clinicians to more clearly assess reductions in patients' quality of life.

"The work focuses on helping doctors to predict and manage side effects of cancer treatments. This finding is exceptionally significant for those cancer patients that experience multiple co-occurring symptoms which are severe and extremely distressing," Nikolaos Papachristou, one of the designers of the algorithms, told HCB News. "Identifying these high-risk patients prior to the start of their treatment can help clinicians to provide aggressive symptom management interventions so that the deleterious outcomes can be avoided."

Cancer patients experience a wide variety of symptoms during the course of treatment, with some incurring few and others bearing a heavy load. Depression occurs in up to 60 percent of patients, while anxiety affects between 35 and 53 percent; 45 percent experience both. Sleep disturbance, which is linked to both conditions, affects 30 to 50 percent.

Along with reduced quality of life, all three symptoms are linked to a decreased ability to function on a daily basis, though evidence suggests that treating any one of them can prevent the development of the other two.

Applying the algorithms to patients over the course of chemotherapy, the authors analyzed existing symptom data from different time points to determine whether the algorithms could accurately predict when and whether symptoms arose. They found that the symptoms patients actually reported correlated closely with the predictions made by both models.
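The article does not disclose the study's actual features or model architecture, so as a purely illustrative sketch, the validation idea above – predict later symptom severity from earlier data, then check agreement with what patients actually reported – might look something like this, with a deliberately naive carry-forward predictor standing in for the real models:

```python
# Hypothetical sketch only: the patient data and the carry-forward predictor
# below are illustrative assumptions, not the study's actual data or models.
from statistics import mean

# Toy cohort: (baseline severity, reported mid-treatment severity), 0-10 scale.
patients = [(2, 3), (8, 9), (5, 5), (1, 1), (9, 8), (4, 6), (7, 7), (3, 2)]

def predict_mid_treatment(baseline):
    """Naive stand-in predictor: assume severity persists from baseline."""
    return baseline

predictions = [predict_mid_treatment(baseline) for baseline, _ in patients]
reported = [actual for _, actual in patients]

# Mean absolute error as a crude measure of agreement between predicted
# and patient-reported severity scores.
mae = mean(abs(p - a) for p, a in zip(predictions, reported))
print(round(mae, 2))  # → 0.75
```

A real evaluation would replace the toy predictor with a trained model and use richer agreement measures, but the comparison loop – predicted severity against patient-reported severity at each time point – is the core of the validation described above.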

The next step, according to Papachristou, is to collect enough data to integrate within one solution for assessing the severity of these symptoms, and possibly more, as well as those of other chronic medical conditions.

"We are planning to integrate all our work in a tool to make it accessible and reusable by other researchers. However, this will also require more data for training and tuning the algorithms," he said.

Another recent AI system, developed at the University of Massachusetts, relies on relatively small data sets to classify different forms of intracranial hemorrhage, making it a possible tool for identifying patients with symptoms of life-threatening strokes. In addition, ContextVision recently launched, at the 2018 RSNA annual meeting, an AI-powered image enhancement solution for digital radiography that handles different exposure levels and produces a more consistent look across patients and different types of anatomy.

The findings of the Surrey and UCSF study were published in the journal PLOS ONE.
