Augmented reality seems cool, unless you are blind. How can blind people function in an environment that is not really there and exists only as visual perception? Enter Ultrahaptics, a technology that uses ultrasound to create perception based on touch, not vision. The French company Immersion is combining Ultrahaptics with Microsoft's HoloLens to create an augmented reality with objects that can be touched. It is expected to be used initially in medical settings, to reduce unnecessary touching of equipment; in kitchens, to control appliances without spreading foodborne bacteria; and in automobiles, to let drivers access controls without reaching for them and taking their eyes off the road. Anyone with any sort of imagination can think of countless ways to apply this technology, but adoption will most likely be limited by cost.
The app provides users with a simple way to discover the benefits of accessible Android apps.
It’s called Osmo – a program that blends the physical and digital worlds.
Voice Access is an unreleased application, created by the Google accessibility team, that lets people with mobility issues use their voice to control their Android device. Because it is beta software, it might not be entirely stable at this point. You can download the app here. Below is a description from the developer.
The FCC is requiring all manufacturers and wireless providers to make Real Time Text available without the purchase of any additional software or hardware. Real Time Text allows the recipient of a text to see the message as it is being typed; normally a person would not see the message until the send button is pressed. This makes for a more natural form of communication between the hearing and deaf communities. It will also allow 911 operators to see a message even if the caller is cut off before pressing send. Real Time Text has been around for a long time, but only now is it a mandatory feature for all wireless devices.
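The difference between ordinary store-and-forward texting and Real Time Text can be sketched in a few lines. This is a conceptual illustration only, not an actual RTT implementation, and all the function names here are invented for the example:

```python
# Conceptual sketch: store-and-forward texting vs. Real Time Text (RTT).
# Illustrative only; real RTT runs over a transport such as RTP (RFC 4103).

def sms_style(keystrokes):
    """Receiver sees nothing until the sender presses send."""
    received = []
    buffer = ""
    for key in keystrokes:
        if key == "<SEND>":
            received.append(buffer)  # message delivered only now
            buffer = ""
        else:
            buffer += key  # held locally, invisible to the receiver
    return received

def rtt_style(keystrokes):
    """Receiver sees every character the moment it is typed."""
    received = []
    for key in keystrokes:
        if key != "<SEND>":
            received.append(key)  # each keystroke delivered immediately
    return received

keys = list("help") + ["<SEND>"]
print(sms_style(keys))   # ['help']
print(rtt_style(keys))   # ['h', 'e', 'l', 'p']
```

Note that if the keystroke stream stops before `<SEND>`, `rtt_style` has already delivered the partial message while `sms_style` delivers nothing, which is exactly why RTT matters for 911 calls that get cut off.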
You can read the entire FCC decision here.
Dr. Toby Rush, a music theory instructor at the University of Dayton in Ohio, created Braille Music Notator, which is a free online tool for creating music scores. It is designed for both sighted and visually impaired composers. You can use traditional visual symbols that are then translated into braille symbols, or use braille symbols that are translated into visual symbols. This makes it possible for sighted and blind musicians to work together and create music that can be read by both. If the music is created in braille, it can be read with a braille display, or printed on a braille embosser.
Click here to access the tools and find out more about creating braille music.
285 million people in the world are visually impaired, and 55 million of them live in India. Microsoft's focus in this effort is machine learning: it will deploy the Cortana Intelligence Suite for advanced analytics and to build artificial intelligence models for eye care.
To develop machine learning models that predict vision impairment and eye disease, Microsoft has collaborated with the L V Prasad Eye Institute; Bascom Palmer, University of Miami; the Flaum Eye Institute, University of Rochester; the Federal University of Sao Paulo; and the Brien Holden Vision Institute in Australia.
The data gathered will be used to identify and predict potential eye conditions, and to recommend treatments.
To learn more about Microsoft’s initiative, watch this video about MINE.
Google Maps has introduced a cool new feature in its latest update, making navigation easier for those with accessibility needs.
The maps application is now wheelchair friendly, thanks to a team of Google employees who worked on the update in their spare time in a bid to make the world more accessible.
The CBT Team are back with their Cool Picks of 2016, featuring James, Jessica, Joel, Leo, Alex and Nelson, with an introduction to our newest team member, Jessica Silva from the Cisco Academy for the Vision Impaired.