
RIT Faculty Earns NIH Grant to Use Virtual Reality to Help Stroke Patients Regain Lost Vision

Press release from the issuing company

Eye-tracking technology will help patients exercise the lost portions of their field of vision

Associate professor Gabriel Diaz, right, received a grant from the National Institutes of Health to use virtual reality to help patients with cortically induced blindness restore portions of their vision.

Scientists from Rochester Institute of Technology and the University of Rochester aim to use virtual reality to help restore vision for people with stroke-induced blindness. The team of researchers led by Gabriel Diaz, associate professor at RIT’s Chester F. Carlson Center for Imaging Science, received a grant from the National Institutes of Health to develop a method they believe could revolutionize rehabilitation for patients with cortically induced blindness. The condition afflicts about 1% of the population over age 50.

While there are well-established therapies to help stroke patients regain their motor functions, there are no standardized rehabilitation strategies for restoring lost vision. Over the past 10 years, Krystel Huxlin, the James V. Aquavella, M.D. Professor in Ophthalmology at UR’s Flaum Eye Institute, has found that these patients can recover parts of their vision through targeted exercises that force them to use the blind portions of their visual field. Huxlin and Diaz believe virtual reality could help this form of treatment take the next step.

“The goal of this work is to build upon Dr. Huxlin’s methodologies in a few ways,” said Diaz, the principal investigator of the grant. “One major limitation of the current methodology is that people cannot train as effectively at home if they’re not under the supervision of a researcher who’s using an eye tracker. Eye trackers are an emerging technology in the field of virtual reality, and we’re going to develop a training apparatus that could be used at home.”

Built-in eye trackers in virtual reality headsets will let patients confirm they are fixating on the correct spot and doing the exercises properly. Huxlin said that the eye-tracking technology, combined with the cost savings of a consumer headset, will make the exercises much more effective for patients to do at home.
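The release does not describe the software itself, but the fixation check works roughly like this: the trainer presents a stimulus in the blind field only while the tracker confirms the patient’s gaze is held on a central fixation point. Below is a minimal sketch in Python; the gaze samples, fixation point, and tolerance value are illustrative assumptions, not details from the project.

```python
import math

# Illustrative sketch only -- not the project's actual software. A real
# system would read gaze from the headset's eye-tracker SDK each frame.

FIXATION_POINT = (0.0, 0.0)  # center of the display, in degrees of visual angle
TOLERANCE_DEG = 1.5          # how far gaze may drift before the trial is voided

def gaze_on_target(gaze, target=FIXATION_POINT, tol=TOLERANCE_DEG):
    """Return True if gaze lies within `tol` degrees of the fixation target."""
    return math.hypot(gaze[0] - target[0], gaze[1] - target[1]) <= tol

def run_trial(gaze_samples):
    """Present the blind-field stimulus only while fixation is maintained.

    `gaze_samples` is an iterable of (x, y) gaze points, one per frame.
    Aborting the moment gaze wanders guarantees the stimulus actually
    fell in the intended (blind) part of the visual field.
    """
    for gaze in gaze_samples:
        if not gaze_on_target(gaze):
            return "aborted: fixation broken"
        # ...render the training stimulus at the blind-field location...
    return "completed"

# Simulated data: steady fixation, then a run where gaze drifts away.
steady = [(0.1, -0.2)] * 5
print(run_trial(steady))                 # completed
print(run_trial(steady + [(3.0, 0.5)]))  # aborted: fixation broken
```

Voiding a trial the moment fixation breaks is what ensures the exercise engages the blind portion of the visual field rather than the patient’s intact vision.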

“We can do it in the lab right now with a desktop and a desktop-mounted eye tracker, but that’s a $40,000 device on top of the computer itself,” said Huxlin, a consultant on the project and close collaborator of Diaz. “So, it’s completely impractical to deploy into a patient’s home. It’s too expensive, and even if you were able to afford it, it’s almost impossible to self-calibrate and use the system on yourself. You have to have another person do the calibration on you.”

Virtual reality also makes the exercises multisensory, pairing audio cues with the visual stimuli. Research suggests that the auditory and visual systems are interconnected, so the team will test whether sound cues can accelerate visual rehabilitation. Ross Maddox, assistant professor in UR’s Departments of Biomedical Engineering and Neuroscience, is providing expert consultation in this area.
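The release gives no trial design, but one plausible shape for such an experiment is a comparison in which the audiovisual condition precedes each visual stimulus with a tone spatialized to the same blind-field location. The sketch below is purely illustrative; `play_tone_at` and `show_stimulus_at` are hypothetical stand-ins for whatever headset SDK calls the team actually uses.

```python
import random

# Purely illustrative sketch: the press release does not describe the
# actual trial design or software interfaces used by the team.

def play_tone_at(location):
    print(f"  audio cue at {location}")

def show_stimulus_at(location):
    print(f"  visual stimulus at {location}")

def run_session(n_trials=3, multisensory=True, seed=0):
    """Run a toy training session.

    In the multisensory condition, each visual stimulus is preceded by a
    tone spatialized to the same blind-field location, so the auditory
    and visual cues are co-localized in space and time.
    """
    rng = random.Random(seed)
    for trial in range(n_trials):
        location = (rng.uniform(-10, 10), rng.uniform(-10, 10))  # degrees
        print(f"trial {trial}:")
        if multisensory:
            play_tone_at(location)
        show_stimulus_at(location)

run_session(multisensory=True)   # audiovisual training condition
run_session(multisensory=False)  # visual-only control
```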

Preliminary testing is already underway. Catherine Fromm, an RIT imaging science Ph.D. student, has been conducting research on the virtual reality components in Diaz’s Perception for Movement (PerForM) Lab. Diaz will also seek out undergraduate students to help modify eye-tracking equipment for the project.

For more information, visit the NIH website.