A team at Schepens Eye Research Institute of Mass Eye and Ear, led by Associate Professor Gang Luo, has been focusing on vision assistive technology for over a decade, running research studies on technology development, intervention, evaluation, and human factors in mobility for people who are blind or have low vision. While transit agencies have a mandate to improve accessibility to public transportation under the Americans with Disabilities Act, opportunities remain to improve existing technologies and further remove barriers. Developing a cost-effective tool was paramount for the team in their aim to make bus stops more accessible and easily identifiable to all.
As part of this effort, they have developed and released a free app called All Aboard, which currently supports 10 bus transit services across the US, Canada, the UK, and Germany.
How does the app work?
To use the app, the user holds their mobile phone upright while in the vicinity of the stop. The app plays a sonar-like sound to indicate it is searching for the bus stop sign, followed by a beeping sound once the bus stop has been identified. The beeping varies in pitch to roughly represent the distance to the stop, as demonstrated in this video tutorial.
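As a rough illustration of this audio feedback loop, the sketch below maps an estimated distance to the detected sign onto either a searching ping or a beep of varying pitch. The thresholds, frequencies, and function names are hypothetical choices for illustration, not values taken from the All Aboard implementation.

```python
# A minimal sketch of the distance-to-sound mapping described above.
# All names, thresholds, and frequencies here are illustrative assumptions,
# not values from the All Aboard app.

def feedback_cue(estimated_distance_m):
    """Choose the audio cue to play for the current detection state."""
    if estimated_distance_m is None:
        # No sign detected yet: keep playing the sonar-like "searching" ping.
        return {"sound": "sonar_ping"}
    if estimated_distance_m > 20:
        return {"sound": "beep", "pitch_hz": 440}   # far away: low pitch
    if estimated_distance_m > 5:
        return {"sound": "beep", "pitch_hz": 660}   # getting closer: medium pitch
    return {"sound": "beep", "pitch_hz": 880}       # very close: high pitch

print(feedback_cue(None))   # {'sound': 'sonar_ping'}
print(feedback_cue(3.0))    # {'sound': 'beep', 'pitch_hz': 880}
```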
How does the app recognize bus stop signs?
The All Aboard app uses deep neural networks to recognize bus stop signs, under the assumption that the user knows which bus route they wish to take and is in the vicinity of the bus stop. By using object recognition, it can correctly identify bus stop signs that share the same design within a particular transit system, while ignoring the exact route number on the signs. For each bus transit system, around 5,000 to 10,000 bus stop sign images were collected, labelled, and used to train the neural network to automatically learn the features of the signage patterns. Consequently, the neural network can distinguish bus stop signs from other objects and other types of road signs in images. To run in real time on the limited computational power of a mobile phone, the recognition model was built as a lightweight neural network that can do all of its processing on the device.
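For readers curious what such a pipeline might look like, here is a minimal sketch of training a lightweight sign-versus-background classifier by transfer learning on a MobileNet-style backbone and exporting it for on-device inference. The dataset layout, image size, backbone choice, and training settings are assumptions made for illustration; they are not the team's actual architecture or training pipeline.

```python
# Illustrative sketch of a lightweight bus-stop-sign recognizer (not the All Aboard code).
import tensorflow as tf

IMG_SIZE = (224, 224)

# Hypothetical layout: labeled images in train_data/<bus_stop_sign|background>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "train_data", image_size=IMG_SIZE, batch_size=32)

# Lightweight backbone suited to real-time, on-device inference.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
backbone.trainable = False  # reuse generic image features; train only the small head

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),   # MobileNetV2 expects [-1, 1]
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # bus stop sign vs. everything else
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)

# Convert to TensorFlow Lite so the model can run directly on the phone.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
open("bus_stop_sign.tflite", "wb").write(tflite_model)
```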
What has been the feedback from users?
Since its release in December 2021, the app has been used by more than 130 users in over 1,500 instances across the US, Canada, Germany, and the UK. The preliminary results are encouraging: based on the team's research, mainstream navigation apps led users to bus stops successfully about 60% of the time on average, while All Aboard detected bus stop signs 95% of the time. The mainstream apps failed mostly in urban areas with many high-rises. Even in instances where mainstream apps navigated successfully, All Aboard was able to lead users to the bus stops more precisely.
What is next up for the app?
Next up for the app is a plan to expand coverage to bus transit services in more US cities, make the app available on Android, and build new navigation features for subways and other popular destinations.