When people who are blind or visually impaired try to move around a city like Paris, they can easily get lost the minute they exit a subway station.
A pair of young French engineering students is working to tackle this problem by developing an augmented reality (AR) navigation app that identifies the most practical route and uses “spatial sound” – also known as 3D sound – to guide users in the right direction.
“We're trying to make something very simple where you get the 3D sound from the correct direction. You turn in the direction that the sound is coming from, and then you're good to go,” SonarVision Co-Founder and CEO Nathan Daix told Euronews Next.
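The core idea of directional audio cueing can be sketched in a few lines: compute the angle between the user's heading and the bearing of the next waypoint, then pan the guidance sound accordingly. This is a minimal illustration of the principle, not SonarVision's actual code; the function names, constant-power panning and stereo simplification (real spatial audio engines use full 3D HRTF rendering) are all assumptions.

```python
import math

def relative_bearing(heading_deg: float, target_deg: float) -> float:
    """Signed angle from the user's heading to the target, in (-180, 180]."""
    diff = (target_deg - heading_deg + 180.0) % 360.0 - 180.0
    return 180.0 if diff == -180.0 else diff

def stereo_gains(heading_deg: float, target_deg: float) -> tuple[float, float]:
    """Constant-power pan: the cue plays fully in the right ear when the
    target is 90 degrees to the user's right, and vice versa."""
    angle = max(-90.0, min(90.0, relative_bearing(heading_deg, target_deg)))
    pan = (angle + 90.0) / 180.0          # 0 = hard left, 1 = hard right
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right

# Facing north (0 deg) with the waypoint due east (90 deg):
# the guidance tone comes entirely from the right ear.
left, right = stereo_gains(0.0, 90.0)
```

Once the user turns until the sound is centred (equal left and right gain), they are facing the waypoint – which is exactly the “turn toward the sound, then you're good to go” interaction Daix describes.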
The app is currently under development and testing, but the young start-up aims to make it available in 2023. The prototype works in Paris but can “pretty easily” be deployed to other big European capitals, he said.
There are already apps out there that alert the user to surrounding points of interest – such as Blindsquare and Soundscape – but SonarVision’s added value, he said, is to guide users from point A to B like a “super-high-precision GPS” that’s also highly intuitive.
Mainstream wayfinding apps such as Google Maps and Apple Maps have not been designed to accommodate the needs of people who are visually impaired and are hard to use with screen readers, he said.
“One of the very frustrating problems with a lot of these products is precision,” Daix explained.
“GPS in cities can, at the best of times, be around 4 to 5-metre precision. But in the worst moments, which is at least 30 per cent of the time, you get 10-plus-metre imprecision.
“That means that the GPS will tell you you've arrived at your bus stop but you're actually on the other side of the street, and you still need to figure out a way to get to your actual bus stop and you have no clue where it is”.
Scanning buildings and city streets
To address this, SonarVision uses the phone’s camera to scan buildings using AR technology and compares them to Apple’s database of scanned buildings for a given city.
“This allows us to precisely geo-track our users with anywhere from 20 centimetres to one-metre precision,” Daix said, adding that this allowed the app to keep the user on crossings and pavements while avoiding, where possible, stairs and construction areas.
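This camera-based approach resembles visual positioning systems such as ARKit's geo-tracking, which relocalise the device against previously scanned street imagery. A toy sketch of how an app might fuse the two sources – preferring the sub-metre visual fix when relocalisation succeeds and falling back to raw GPS otherwise – is below; the class, thresholds and accuracy figures are illustrative assumptions, not SonarVision's implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fix:
    lat: float
    lon: float
    accuracy_m: float  # estimated position error in metres

def best_position(gps: Fix, visual: Optional[Fix]) -> Fix:
    """Prefer the AR/visual fix (20 cm to 1 m) when the camera has
    relocalised; otherwise fall back to GPS (typically 4-10 m in cities)."""
    if visual is not None and visual.accuracy_m < gps.accuracy_m:
        return visual
    return gps

coarse = Fix(48.8584, 2.2945, accuracy_m=8.0)     # raw urban GPS
precise = Fix(48.85843, 2.29451, accuracy_m=0.5)  # AR relocalisation
position = best_position(coarse, precise)         # the visual fix wins
```

With a sub-metre position, the routing layer can reliably keep the user on a specific pavement or pedestrian crossing – something an 8-metre GPS fix cannot distinguish.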
For the tech to work, all the user needs is a headset and an iPhone with its camera pointed at the road – though going forward, the camera could be part of AR glasses for more convenience.
“Your phone would be in your pocket and you'd have the glasses on your face doing the seeing part and the 3D sound coming out of the branches [the arms of the glasses],” Daix said.
No replacement for a white cane
However, the app does not do any real-time obstacle detection and is only designed to be a “complement” to a white cane, a guide dog or other devices that people with a visual impairment use to get around.
The tech to do that does exist, however: LiDAR technology, or light detection and ranging, has the potential to “really help people who are visually impaired,” he said.
“What it allows [you] to do is scan depth in the environment. We’ve actually started working with LiDAR on an iPhone 12 Pro and have been able to develop a prototype that basically replaces the white cane.
“It allows us to detect obstacles, but not just obstacles on the floor – obstacles that are at head level, at body level... It's really powerful stuff”.
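The obstacle detection Daix describes boils down to slicing a LiDAR depth image into vertical bands and flagging any band with a return closer than a safety threshold. The sketch below shows that idea under stated assumptions: the band boundaries, the 1.5 m threshold and the mapping of image rows to head/body/ground height are all illustrative, not the prototype's actual parameters.

```python
import numpy as np

def detect_obstacles(depth_m: np.ndarray, threshold_m: float = 1.5) -> dict:
    """Given a depth image in metres (top rows ~ head height, bottom rows
    ~ ground), report which vertical bands contain something closer than
    `threshold_m`. Band boundaries are illustrative thirds of the frame."""
    rows = depth_m.shape[0]
    bands = {
        "head": depth_m[: rows // 3],
        "body": depth_m[rows // 3 : 2 * rows // 3],
        "ground": depth_m[2 * rows // 3 :],
    }
    return {name: bool((band < threshold_m).any()) for name, band in bands.items()}

# A 6x4 depth map: everything 3 m away except one 0.8 m return at head
# level, e.g. a low-hanging sign a white cane would miss.
depth = np.full((6, 4), 3.0)
depth[0, 2] = 0.8
hazards = detect_obstacles(depth)  # flags only the "head" band
```

Detecting head-level hazards is precisely what makes this richer than a cane, which only probes the ground ahead of the user.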
The main reason SonarVision is not focusing on using LiDAR just yet, Daix said, is that the smartphones that feature it (like the iPhone 12 Pro) are expensive and SonarVision aims to make its tech as accessible as possible.
“Today the most important feature that we can work on is wayfinding – accurate and affordable wayfinding is really one of the biggest problems that has not yet been resolved for people with visual impairment,” he said.