Project Guideline uses AI to help partially sighted runners navigate

Google, in collaboration with the nonprofit organization Guiding Eyes for the Blind, has piloted an AI system called Project Guideline, designed to help blind and partially sighted people run races independently.

According to the U.S. Centers for Disease Control and Prevention, in 2015 a total of 1.02 million people in the U.S. were blind and approximately 3.22 million were visually impaired. Technologies exist to help blind and low-vision people navigate challenging everyday environments, but those who wish to run must rely on either a guide animal or a human guide tethered to them.

How does Project Guideline work?
The system uses an app on an Android smartphone that tracks the virtual race via GPS, paired with a Google-designed harness that delivers audio prompts to indicate the location of a painted line.

Users wear an Android phone around the waist using the aforementioned harness; the Guideline app runs a machine learning model that looks for the painted line and identifies it. (The model, which emerged from a Google hackathon, accounts for variables in weather and lighting conditions.) The app then approximates the user's position and delivers audio feedback via bone-conducting headphones to help keep them on the guideline. If the user drifts to the left of the line, the audio in their left ear increases in volume and dissonance; if they drift to the right of the line, the same happens in their right ear.
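Conceptually, that feedback loop is a two-step computation: estimate the line's horizontal offset in the camera frame, then raise the audio level in the ear on the side the runner has drifted toward. The Python sketch below is purely illustrative; the function names, the brightness-threshold line detector (a crude stand-in for Google's on-device machine learning model), and the volume mapping are all assumptions, not Project Guideline's actual code.

```python
# Illustrative sketch (not Google's implementation): estimate the runner's
# lateral offset from a painted guideline in a camera frame and map it to
# stereo audio feedback, as the article describes.

import numpy as np


def estimate_line_offset(frame: np.ndarray, line_threshold: int = 200) -> float:
    """Return the guideline's horizontal offset from frame center, in [-1, 1].

    `frame` is a grayscale image (H x W). The real system uses an on-device
    ML model robust to weather and lighting; simple intensity thresholding
    stands in here for illustration only.
    """
    height, width = frame.shape
    # Look at the lower part of the frame, where the line is nearest the runner.
    roi = frame[int(height * 0.6):, :]
    mask = roi > line_threshold                # bright pixels assumed to be the painted line
    cols = np.where(mask.any(axis=0))[0]
    if cols.size == 0:
        return 0.0                             # no line detected; no correction
    line_center = cols.mean()
    return (line_center - width / 2) / (width / 2)


def stereo_feedback(offset: float) -> tuple[float, float]:
    """Map the offset to (left_volume, right_volume) in [0, 1].

    A positive offset means the line appears to the right of frame center,
    i.e. the runner has drifted left, so the left ear gets louder; a negative
    offset means the runner has drifted right, so the right ear gets louder.
    """
    magnitude = min(abs(offset), 1.0)
    if offset > 0:          # runner is left of the line
        return magnitude, 0.0
    elif offset < 0:        # runner is right of the line
        return 0.0, magnitude
    return 0.0, 0.0         # centered on the line: stay quiet


if __name__ == "__main__":
    # Synthetic 90x160 frame with a bright vertical "line" right of center.
    frame = np.zeros((90, 160), dtype=np.uint8)
    frame[:, 110:114] = 255
    offset = estimate_line_offset(frame)
    print(offset, stereo_feedback(offset))     # runner left of line -> left ear louder
```

In the real system the feedback also changes in dissonance, not just volume, and the line detection runs on the phone without any network connection.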

Does the app use the internet to navigate?
Google’s Guideline app works without an internet connection and requires only a guideline painted on a pedestrian path.

Are there plans for Guideline beyond the pilot?
Beyond the pilot with Guiding Eyes for the Blind CEO Thomas Panek, Google plans to partner with organizations to help paint guidelines in different communities and provide additional feedback.

The launch of Guideline comes after Google debuted more in-depth spoken directions for Maps, which inform users when to turn and tell them when they’re approaching an intersection so they can exercise caution when crossing. The company also continues to develop Lookout, an accessibility-focused app that can identify packaged foods using computer vision, scan documents to make it easier to review letters and mail, and more.

