New technology out of the University of Waterloo is looking to help visually impaired people better interact with touchscreen devices.
Developed by a team of Waterloo engineering students, WatVision is a system that uses text-to-speech technology to enable individuals to access touchscreen functions on everyday devices and gadgets.
“As [a touchscreen] changes screens, the functions will change and if you can’t see the screen, it’s just completely impossible to use,” Lior Lustgarten, a member of the WatVision team told CBC.
Users take a photo of the touchscreen with the WatVision app, which pinpoints where text appears in the image. The app then reads that text aloud as the user moves a finger around the screen.
The technology’s hardware includes either a tracking glove or ring, which tells the system where a person’s finger is on the touchscreen.
“As you explore the screen, the app will read out to you what text you’re hovering over,” Lustgarten said.
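To make the flow concrete, here is a minimal sketch of the general idea described above: detect where text sits in a photo of the screen, then speak whichever word the finger position (as reported by the glove or ring) falls over. The library choices (pytesseract for OCR, pyttsx3 for speech) and all names below are illustrative assumptions, not details of the WatVision system itself.

```python
# Minimal sketch: OCR a photo of a touchscreen, then speak the word
# under the user's finger. Not the WatVision implementation.
import pytesseract
import pyttsx3
from PIL import Image


def ocr_word_boxes(image_path):
    """Return a list of (text, left, top, width, height) for each detected word."""
    data = pytesseract.image_to_data(
        Image.open(image_path), output_type=pytesseract.Output.DICT
    )
    boxes = []
    for i, word in enumerate(data["text"]):
        if word.strip():  # skip empty OCR results
            boxes.append(
                (word, data["left"][i], data["top"][i],
                 data["width"][i], data["height"][i])
            )
    return boxes


def word_under_finger(boxes, x, y):
    """Return the word whose bounding box contains the finger position (x, y)."""
    for word, left, top, width, height in boxes:
        if left <= x <= left + width and top <= y <= top + height:
            return word
    return None


def speak(text):
    """Read a string aloud using the local text-to-speech engine."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    boxes = ocr_word_boxes("touchscreen_photo.jpg")  # hypothetical photo of a screen
    word = word_under_finger(boxes, x=120, y=340)    # (x, y) would come from the tracking ring
    if word:
        speak(word)
```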
Visually impaired individuals already use text-to-speech technology, but it only works on devices that have the necessary code built into them, which Lustgarten says is often not the case.
“It’s clear that most manufacturers just aren’t considering accessibility when they make things,” he said.
Beyond being able to complete day-to-day tasks, independence and autonomy are major concerns for many of the people the team consulted while developing the app.
While much of the assistive technology that already exists for the visually impaired relies on virtual assistants, WatVision is intended to be used independently.
“It’s more of a tool than a device. It assists someone rather than doing it for them,” Lustgarten said.
After the team won this year’s James Dyson Award, Lustgarten says they plan to use at least part of the prize money to keep developing WatVision, and they hope to collaborate with other developers to further the technology.