A new study offers hope for people who are blind or have low vision (pBLV) through an innovative navigation system that was tested using virtual reality. The system, which combines vibrational and sound feedback, aims to help users navigate complex real-world environments more safely and effectively.
The research from NYU Tandon School of Engineering, published in JMIR Rehabilitation and Assistive Technologies, advances work by John-Ross Rizzo, Maurizio Porfiri, and colleagues toward developing a first-of-its-kind wearable system to help pBLV navigate their surroundings independently.
“Traditional mobility aids have key limitations that we want to overcome,” said Fabiana Sofia Ricci, the paper’s lead author and a Ph.D. candidate in NYU Tandon’s Department of Biomedical Engineering (BME) and its Center for Urban Science + Progress (CUSP). “White canes only detect objects through contact and miss obstacles outside their range, while guide dogs require extensive training and are costly. As a result, only 2 to 8 percent of visually impaired Americans use either aid.”
In this study, the research team miniaturized the haptic feedback hardware of its earlier backpack-based system into a discreet belt equipped with 10 precision vibration motors. The belt's electronic components, including a custom circuit board and microcontroller, fit into a simple waist bag, a crucial step toward making the technology practical for real-world use.
The system provides two types of sensory feedback: vibrations through the belt indicate obstacle location and proximity, while audio beeps through a headset become more frequent as users approach obstacles in their path.
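The paper's exact control logic is not detailed here, but the idea behind that mapping can be sketched roughly as follows. In this minimal illustration the function name, motor layout, sensing range, and beep rates are all assumptions made for clarity, not the team's implementation.

```python
NUM_MOTORS = 10        # motors assumed evenly spaced around the waist belt
MAX_RANGE_M = 3.0      # hypothetical sensing range beyond which no feedback is given

def belt_feedback(obstacle_angle_deg, obstacle_distance_m):
    """Map an obstacle's direction and distance to a motor index,
    a vibration intensity, and an audio beep rate.

    obstacle_angle_deg: bearing relative to straight ahead, -180..180 (negative = left).
    obstacle_distance_m: distance to the obstacle in meters.
    """
    if obstacle_distance_m > MAX_RANGE_M:
        return None  # obstacle out of range: no vibration, no beeps

    # Direction -> which motor vibrates (location cue).
    sector = (obstacle_angle_deg + 180.0) / 360.0            # 0..1 around the waist
    motor_index = min(int(sector * NUM_MOTORS), NUM_MOTORS - 1)

    # Proximity -> stronger vibration as the obstacle gets closer.
    intensity = 1.0 - obstacle_distance_m / MAX_RANGE_M      # 0 (far) .. 1 (touching)

    # Proximity -> faster headset beeps as the obstacle gets closer.
    beep_rate_hz = 0.5 + 4.5 * intensity                     # ~0.5 Hz far, ~5 Hz close

    return motor_index, intensity, beep_rate_hz

# Example: an obstacle 1 m away, slightly to the right of the walking path.
print(belt_feedback(obstacle_angle_deg=20.0, obstacle_distance_m=1.0))
```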
"We want to reach a point where the technology we’re building is light, largely unseen and has all the necessary performance required for efficient and safe navigation," said Rizzo, who is an associate professor in NYU Tandon’s BME department, associate director of NYU WIRELESS, affiliated faculty at CUSP and associate professor in the Department of Rehabilitation Medicine at NYU Grossman School of Medicine.
"The goal is something you can wear with any type of clothing, so people are not bothered in any way by the technology."
The researchers tested the technology by recruiting 72 participants with normal vision, who wore Meta Quest 2 VR headsets and haptic feedback belts while walking through NYU's Media Commons at 370 Jay Street in Downtown Brooklyn, an open room empty except for side curtains.
Through their headsets, the participants experienced a virtual subway station as someone with advanced glaucoma would see it: with reduced peripheral vision, blurred details, and altered color perception. The environment, built in the Unity game engine to match the room's exact dimensions, allowed the team to measure how well participants could navigate using the belt's vibrations and audio feedback while their vision was impaired.
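The study built its simulation in Unity, and its rendering details are not described here. Purely to illustrate the kinds of image transformations such a simulation involves (tunnel vision, loss of detail, desaturated color), the sketch below degrades a single frame in Python; the function name and every parameter value are arbitrary placeholders, not clinical settings from the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_advanced_glaucoma(rgb, tunnel_radius_frac=0.35, blur_sigma=3.0, desaturation=0.5):
    """Apply three illustrative effects to an RGB image (H x W x 3, floats in 0..1):
    a Gaussian blur (loss of detail), partial desaturation (altered color perception),
    and a dark peripheral mask (reduced peripheral vision)."""
    h, w, _ = rgb.shape

    # Loss of detail: blur each color channel.
    blurred = np.stack(
        [gaussian_filter(rgb[..., c], sigma=blur_sigma) for c in range(3)], axis=-1
    )

    # Altered color perception: blend toward a grayscale version.
    gray = blurred.mean(axis=-1, keepdims=True)
    recolored = (1 - desaturation) * blurred + desaturation * gray

    # Reduced peripheral vision: darken pixels far from the image center.
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    mask = np.clip(1.0 - (dist - tunnel_radius_frac) / (1.0 - tunnel_radius_frac), 0.0, 1.0)
    return recolored * mask[..., None]

# Example: apply the effect to a random "frame".
frame = np.random.rand(240, 320, 3)
degraded = simulate_advanced_glaucoma(frame)
```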
"We worked with mobility specialists and NYU Langone ophthalmologists to design the VR simulation to accurately recreate advanced glaucoma symptoms," says Porfiri, the paper’s senior author, CUSP Director and an Institute Professor in NYU Tandon’s Departments of BME and Mechanical and Aerospace Engineering. "Within this environment, we included common transit challenges that visually impaired people face daily - broken elevators, construction zones, pedestrian traffic, and unexpected obstacles."
Results showed that haptic feedback significantly reduced collisions with obstacles, while audio cues helped users move more smoothly through space. Future studies will involve individuals with actual vision loss.
The technology complements the functionality of Commute Booster, a mobile app being developed by a Rizzo-led team to provide pBLV navigation guidance inside subway stations. Commute Booster “reads” station signage and tells users where to go, while the haptic belt could help those users avoid obstacles along the way.
In December 2023, the National Science Foundation (NSF) awarded Rizzo, Porfiri, and a team of NYU colleagues a $5 million grant via its Convergence Accelerator, a program whose mission includes supporting the development of assistive and rehabilitative technologies. That grant, along with others from NSF, funded this research and also supports Commute Booster’s development. In addition to Ricci, Rizzo, and Porfiri, the paper’s authors include Lorenzo Liguori and Eduardo Palermo, both from the Department of Mechanical and Aerospace Engineering at Sapienza University of Rome in Italy.
Journal
JMIR Rehabilitation and Assistive Technologies
DOI
Method of Research
Experimental study
Subject of Research
People
Article Title
Navigation Training for Persons With Visual Disability Through Multisensory Assistive Technology: Mixed Methods Experimental Study
Article Publication Date
18-Nov-2024