by John R. Fischer, Senior Reporter | April 25, 2019
Baylor College of Medicine’s Translational Research Institute for Space Health (TRISH) has awarded VisualDx a grant to develop an integrated clinical decision support tool for ultrasound imaging during deep space flights.
The aim is to give astronauts an intuitive, easy-to-use system that guides them through ultrasound scans for self-directed medical care when Earth-based telemedical physicians are delayed or unavailable.
"A radio signal from Mars could take as long as 20 minutes to reach Earth. An acute symptom might need immediate evaluation," Dr. Art Papier, CEO of VisualDx, told HCB News. "The idea is to bring help to people when there is no bandwidth."
VisualDx is designed around symptoms, building contextual questionnaires that yield diagnostic possibilities and next steps. The platform can be customized so that its answers fit a specific patient demographic.
The company will create a stand-alone version of the VisualDx platform that works without an internet connection. It will be adapted for non-physician users, equipped with user-guidance capabilities, and designed to shorten training time by providing basic instruction in ultrasound image interpretation and differential diagnosis building for non-specialists.
VisualDx is used in more than 2,300 hospitals and large clinics worldwide, including all VA hospitals and a number of military hospitals. The company expanded its agreement with the Department of Veterans Affairs in November, offering clinical staff at more than 1,200 VA hospitals access to its add-on feature, DermExpert, to improve diagnostic accuracy for skin-related conditions.
"AI, machine learning and sensors are improving rapidly," said Papier. "We will see enhanced imaging of ultrasound images, eye, oral, and skin images. We, as a company, are investing heavily in visible-light machine learning of the skin and mucosa."
The company plans to release an early prototype in six months and to continue iterating on and improving the clinical decision support system over the next year or two.
The amount of the TRISH grant was not disclosed.