How augmented reality gives one doctor surgical 'superpowers'

by Lisa Chamoff, Contributing Reporter | April 09, 2018
From the April 2018 issue of HealthCare Business News magazine

One of the patients has a particularly large pituitary adenoma. As they view the patient’s MRI and CT scans on a screen in the dark conference room, Dr. Kalmon Post, former chairman of the neurosurgery department at Mount Sinai and current program director of its neurosurgery residency program, recalls a similar case presented at a recent conference, in which two surgeons operated on the patient simultaneously from different angles. Unfortunately, the unusual method ended up offering no advantage.

Bederson does have an edge in the large tumor resection scheduled for the afternoon — the newest mixed reality software from Brainlab, coupled with a heads-up display linked to the Zeiss microscope, along with the Surgical Theater augmented reality software.

Before making his way to the OR suite, Bederson stops in the neuro intensive care unit (ICU) to visit three patients who underwent surgery earlier in the week. On the way, he motions to a vacant building visible through the window of the hospital skyway, the eventual new home of the neuro ICU.

The current neuro ICU is experiencing an overflow problem, and patients must also be transported to the basement for all their scans, which lengthens exam times. The new suite will have scan rooms conveniently built in.

Avoiding ‘no-fly zones’
Bederson’s first surgery of the day, a transnasal endoscopic tumor resection, goes smoothly, with the bulk of the procedure completed within an hour using the Surgical Theater augmented reality technology and Brainlab navigation.

[Photo caption: A procedure at Mount Sinai using the Surgical Theater augmented reality technology and Brainlab navigation.]
For the second, more complex surgery, Bederson comes in after the patient is sedated to consult on her positioning, guided by the heads-up display projection of the tumor into the microscope.

After part of the tumor is removed via a transnasal endoscopic procedure, Bederson arrives about two hours later to begin his part of the operation, a sublabial microscopic and endoscopic approach in which he accesses the tumor through the patient’s upper lip.

As Bederson begins the procedure, he notes the power of integrating Brainlab software into the Zeiss operating microscope: it brings GPS-like navigation that tracks the microscope’s focus point relative to the patient’s anatomy and then projects an augmented reality overlay into the heads-up display, superimposed on his view of the surgical field.
