Google Releases New Android Accessibility Features

Google rolled out a slate of Android upgrades designed to make the operating system easier for people with disabilities to use. Google widely released Action Blocks, which combine preset Google Assistant commands and shortcuts into a single button or voice command, and added new features to the Voice Access, Live Transcribe, and Sound Amplifier apps.

Action Blocks

Android Action Blocks take any tasks that Google Assistant can carry out, chosen from a list or entered by hand, and bring them together into a single button on the device’s home screen. Users can put a custom image on the button to help those with cognitive disabilities remember what it does. Google started testing Action Blocks in October, but they are now an option on any Android device.
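Under the hood, a one-tap home-screen button with a custom picture corresponds roughly to Android’s standard pinned-shortcut mechanism. The Kotlin sketch below only illustrates that general idea and is not the Action Blocks code; the function name, phone number, and icon resource are placeholders made up for the example.

```kotlin
// Illustrative sketch only, not the Action Blocks implementation: pin a
// one-tap home-screen shortcut with a custom image using Android's
// standard pinned-shortcut APIs. The phone number, icon resource, and
// function name are placeholders invented for this example.
import android.content.Context
import android.content.Intent
import android.net.Uri
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat
import androidx.core.graphics.drawable.IconCompat

fun pinCallFamilyButton(context: Context) {
    // The action behind the button: open the dialer with a saved number.
    val callIntent = Intent(Intent.ACTION_DIAL, Uri.parse("tel:5551234567"))

    val shortcut = ShortcutInfoCompat.Builder(context, "call_family")
        .setShortLabel("Call family")                                       // label shown under the icon
        .setIcon(IconCompat.createWithResource(context, R.drawable.family)) // custom picture on the button (placeholder resource)
        .setIntent(callIntent)
        .build()

    // Ask the launcher to place the button on the home screen.
    if (ShortcutManagerCompat.isRequestPinShortcutSupported(context)) {
        ShortcutManagerCompat.requestPinShortcut(context, shortcut, null)
    }
}
```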

Action Blocks shares many traits with Siri Shortcuts, though Siri Shortcuts has been out longer and offers extras such as Parameters, which prompt the voice assistant to ask follow-up questions, and the ability to automate actions depending on circumstances. The feature also resembles the new simplified interface for Nest Hub Max smart displays that Google has started testing in a pilot program. Google gave 1,000 Nest Hub Max devices to Washington State retirement communities to try out the new format. The smart displays offer faster access to a contact list for video calls with Google Duo, as well as digital notecards that both explain how to use the smart display and act as shortcut buttons.

Live Transcribe

Live Transcribe, which turns speech into text, has some new elements as well. You can now manually add words that Google might not know, like names and technical jargon, and the transcripts Live Transcribe saves on the device are now searchable by keyword. Most notably, users can set the app to respond to their name by vibrating, so people with hearing difficulty know when someone is calling out to them even if they can’t hear it. Relatedly, those who use Sound Amplifier on Android to enhance the audio around them can now use the feature with Bluetooth headphones, as opposed to only wired ones.
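For a rough sense of how a “vibrate on your name” alert could be wired up on Android, the sketch below listens with the platform’s SpeechRecognizer and buzzes when the chosen name appears in the partial transcript. This is an illustration, not Live Transcribe’s implementation; the class name, the 500 ms buzz, and the assumption that the RECORD_AUDIO permission is already granted are all choices made for the example.

```kotlin
// Illustrative sketch only: vibrate when a chosen name shows up in a live
// speech transcript. Not Live Transcribe's code; it uses the standard
// Android SpeechRecognizer and Vibrator APIs and assumes the RECORD_AUDIO
// permission has already been granted.
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.os.VibrationEffect
import android.os.Vibrator
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

class NameAlert(private val context: Context, private val name: String) {

    private val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    fun start() {
        recognizer.setRecognitionListener(object : RecognitionListener {
            override fun onPartialResults(partialResults: Bundle) {
                val heard = partialResults
                    .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.joinToString(" ")
                    .orEmpty()
                // Buzz when the transcript contains the user's name.
                if (heard.contains(name, ignoreCase = true)) {
                    vibrator.vibrate(
                        VibrationEffect.createOneShot(500, VibrationEffect.DEFAULT_AMPLITUDE)
                    )
                }
            }
            // The remaining callbacks are not needed for this sketch.
            override fun onResults(results: Bundle) {}
            override fun onReadyForSpeech(params: Bundle) {}
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onEvent(eventType: Int, params: Bundle) {}
        })

        // Request partial results so the check runs as words are recognized.
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
        }
        recognizer.startListening(intent)
    }
}
```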

Google launched the DIVA initiative at I/O last year to improve accessibility to its technology, and the new features and updates grew out of that effort. It’s also the motivation behind projects like voice cues in Google Maps directions to help the visually impaired and Project Euphonia, which works to train voice assistants to understand those with speech impairments.

Google isn’t alone in trying to make voice and AI tech more accessible, of course. Amazon added a barcode-reading feature to Echo Show devices to help those with vision impairment, which evolved from Alexa’s Show and Tell feature. Voice technology’s potential to improve the lives of people with disabilities is still in its early stages, but there’s reason to hope it will remain part of the conversation about how technology is used.
