Voice Access is an unreleased application from the Google accessibility team that helps people with mobility impairments use their voice to control their Android device. Because it is beta software, it might not be entirely stable at this point. You can download the app here. Below is a description from the developer.
The FCC is requiring all manufacturers and wireless providers to make Real Time Text available without the purchase of any additional software or hardware. Real Time Text allows the receiver of a text to see the message as it is being typed; normally, a person would not see the message until the send button is pressed. This type of messaging makes for a more natural form of communication between the hearing and deaf communities. It will also allow 911 operators to see a message even if it gets cut off before the send button is pressed. Real Time Text has been around for a long time, but only now is it a mandatory feature for all wireless devices.
You can read the entire FCC decision here.
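The difference described above can be sketched in a few lines. This is only an illustration of the idea, not the actual RTT wire protocol (which is defined elsewhere, in RFC 4103); the function names and the sample message are invented for this example.

```python
# Illustrative contrast between conventional "send on enter" messaging
# and real-time text, where each character is transmitted as typed.

def buffered_send(keystrokes):
    """Conventional SMS-style messaging: nothing is transmitted
    until the whole message is complete and send is pressed."""
    return ["".join(keystrokes)]  # one transmission, at the end

def real_time_send(keystrokes):
    """Real Time Text: every keystroke is transmitted immediately,
    so the receiver sees the message as it is typed, even if the
    sender never presses send."""
    return list(keystrokes)  # one transmission per character

typed = list("help at 5th and Main")
print(buffered_send(typed))   # receiver sees nothing until the end
print(real_time_send(typed))  # receiver sees every character live
```

If the sender is cut off mid-message, `buffered_send` delivers nothing, while `real_time_send` has already delivered everything typed so far, which is exactly why the FCC cites the 911 use case.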
Dr. Toby Rush, a music theory instructor at the University of Dayton in Ohio, created Braille Music Notator, a free online tool for creating music scores. It is designed for both sighted and visually impaired composers: you can enter traditional visual symbols that are then translated into braille, or enter braille symbols that are translated into visual notation. This makes it possible for sighted and blind musicians to work together and create music that both can read. Music created in braille can be read with a braille display or printed on a braille embosser.
Click here to access the tools and find out more about creating braille music.
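Under the hood, any tool like this ultimately represents each braille cell as a combination of dots. Unicode makes that mapping mechanical: in the Braille Patterns block (starting at U+2800), dot *n* of a cell corresponds to bit *n−1* of the code point. The sketch below shows that dot-to-character conversion; it is not Braille Music Notator's actual code, and no claim is made here about which dot patterns mean which musical symbols.

```python
# Convert braille dot numbers to Unicode braille characters.
# Unicode's Braille Patterns block (base U+2800) assigns bit n-1
# to dot n, so any dot combination maps to exactly one character.

def dots_to_char(*dots):
    """Return the Unicode braille cell for a set of dot numbers (1-8)."""
    code = 0x2800
    for d in dots:
        code |= 1 << (d - 1)
    return chr(code)

print(dots_to_char(1, 4, 5))  # ⠙ (U+2819)
print(dots_to_char(1, 5))     # ⠑ (U+2811)
```

Because the mapping is a pure bit pattern, the same cell can be rendered on screen as a visual glyph, sent to a braille display, or embossed, which is what lets one score serve both sighted and blind musicians.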
An estimated 285 million people in the world are visually impaired, and 55 million of them live in India. Microsoft's focus in this effort is machine learning: it will deploy the Cortana Intelligence Suite for advanced analytics and to build artificial intelligence models for eye care.
To develop machine learning models that predict vision impairment and eye disease, Microsoft has collaborated with the L V Prasad Eye Institute; the Bascom Palmer Eye Institute at the University of Miami; the Flaum Eye Institute at the University of Rochester; the Federal University of São Paulo; and the Brien Holden Vision Institute in Australia.
The data gathered will be used to identify and predict potential eye conditions, and to recommend treatments.
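To make the idea of a predictive model concrete, here is a toy sketch of the kind of prediction described above, using a nearest-neighbour classifier on invented patient records. Microsoft's actual models and data are not public; every feature, value, and record below is an illustrative assumption.

```python
# Toy "predict potential eye conditions" model: label a new patient
# with the outcome of the most similar historical patient.
import math

# Invented records: (age, intraocular pressure in mmHg, years since
# last eye exam), paired with outcome (1 = condition developed).
patients = [
    ((35, 14, 1), 0), ((62, 24, 6), 1), ((48, 16, 2), 0),
    ((71, 27, 8), 1), ((29, 12, 1), 0), ((66, 25, 5), 1),
]

def predict_risk(features):
    """Return the outcome label of the nearest historical patient."""
    _, label = min(patients, key=lambda p: math.dist(p[0], features))
    return label

print(predict_risk((68, 26, 7)))  # 1: resembles the high-risk patients
print(predict_risk((30, 13, 1)))  # 0: resembles the low-risk patients
```

Real clinical models are of course far more sophisticated, trained on large imaging and examination datasets rather than a handful of hand-picked numbers, but the shape of the task, learning outcomes from past patients to flag risk in new ones, is the same.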
To learn more about Microsoft’s initiative, watch this video about MINE.
Google Maps recently introduced a new feature in its latest update, making it easier for people with accessibility needs to get around.
The maps application is now wheelchair friendly, thanks to a team of Google employees who worked on the update in their spare time in a bid to make the world more accessible.
The Google accessibility team has released TalkBack 5.1 for devices running Lollipop, Marshmallow, and Nougat. TalkBack 5.0.7 has been released for devices running Jelly Bean and KitKat.
There are no new features in TalkBack 5.0.7. Here is a list of the changes in TalkBack 5.1, from the eyes-free Google group.
Uber is launching a new feature starting on New Year’s Eve. Drivers in four cities will be given glowing beacons that attach to the windshield. You can select the color that you wish for the beacon to project in the Uber app. This will make it easier to pick out which vehicle is your ride. The program is intended to help the visually impaired and people who can’t distinguish their ride in a crowd of cars. Uber is also adding more detail to the vehicle description in the app. You will now be provided with the color of the car that is coming for you in addition to the license number, model, and the name and picture of your Uber driver.
The beacon program will start in Miami, Florida; Denver, Colorado; Nashville, Tennessee; and Newcastle, England. More cities will be added in 2017.
I have some exciting news, but first, "Let me take a selfie!"
Google accessibility has launched a major improvement to the Google camera app for people using TalkBack on a Google Pixel or Google Nexus device running Nougat 7.1 or higher. TalkBack will announce the number of faces and their position within the frame. It will also tell you what percentage of the frame is occupied by the face or faces, so you can tell when the shot is centered and how close the camera is before snapping that all-important picture. It works on both the front and rear-facing cameras. I tested it on my Google Pixel, running Nougat 7.1.1, and it worked extremely well. If you have access to one of these devices, I highly suggest that you try it. It is a lot of fun!
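The "percentage of the frame" announcement boils down to simple geometry: given face bounding boxes from any face detector, divide the face area by the frame area. The sketch below illustrates that calculation; the `(x, y, width, height)` box format is an assumption made for this example, and this is not the Google camera app's actual code.

```python
# Compute what share of a camera frame is covered by detected faces,
# assuming each face is reported as an (x, y, width, height) box and
# the boxes do not overlap.

def face_coverage(frame_w, frame_h, boxes):
    """Return the percentage of the frame occupied by face boxes."""
    frame_area = frame_w * frame_h
    face_area = sum(w * h for (_, _, w, h) in boxes)
    return 100.0 * face_area / frame_area

# One 400x500 face box in a 1080x1920 portrait frame.
print(round(face_coverage(1080, 1920, [(300, 600, 400, 500)]), 1))  # 9.6
```

A screen reader can then turn that number into speech ("face occupies about 10 percent of the frame"), which is enough for the photographer to judge distance without seeing the preview.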
Yesterday, Apple released tvOS 10.1 for the fourth-generation Apple TV, and with it came the much-anticipated TV app. Unfortunately, the TV app doesn't feel ready for release: the number of satellite and cable subscription services integrated with it is almost non-existent, and although Hulu is integrated with the app, Netflix is not.
Even if you have no interest in the TV app, you will notice that when you press the home button on your remote, it will launch the TV app at the point where you were last watching; pressing it again brings up the home screen. If you want the home button to bring up the home screen without first launching the TV app, go to Settings, then Remotes and Devices, and then select Home Button to change its function back to what it was before the update.
I am enjoying that I can use Siri to play my favorite shows on Hulu, both on the Apple TV and on my iPhone, but my cable provider and Netflix are not yet integrated, so it isn't a one-stop place for video content yet.
With the addition of Actions, which lets third parties integrate their applications with the Google Assistant, be prepared to see a host of apps integrated with Google Home and the Google Pixel. The Nest thermostat is now integrated with the Google Pixel, and Netflix and Google Photos are integrated with Google Home. Through a Chromecast, you can control Netflix and Photos on your television just by telling Google Home what to watch or which picture to display. Open the Google Home app on your smartphone, navigate to your Google Assistant settings, and select video and photos. If you don't see that section, it will roll out to you later.