Ask Chloe to Read the Label on Your Prescription Bottles

A recent collaboration between the AT&T Foundry for Connected Health (located in the Texas Medical Center Innovation Institute) and Aira, maker of smart glasses for people who are blind or have low vision, has produced a solution for reading the tiny print on prescription bottles.

How it works
Aira’s remote assistive technology connects smart-glasses users, called “explorers,” with a network of certified agents via an augmented reality dashboard. The agents serve as visual interpreters, helping users accomplish a wide range of activities.

While developing an artificial intelligence (AI) and machine learning system to read the labels on medication bottles, the AT&T Foundry team partnered with Aira about a year ago to provide network connectivity to Aira’s smart glasses. The result is “Hey Chloe,” a built-in, voice-activated recognition solution that debuted in March 2018. Aira’s new AI platform identifies prescriptions and over-the-counter medications.

Nadia Morris, former director of the AT&T Foundry, explained the process: “First, [the computer] has to determine if it is a medication bottle or not,” she said. “It’s similar to the TxTag, where a photo is taken. Their systems are trained to know what a license plate looks like.”

The TxTag system, which allows drivers to pre-pay tolls, works off an AI system that recognizes license plates. Morris’s team applied the same process to medication bottles; team members brought in their own bottles and trained the computer to read them. The team even set up a secure system for other AT&T colleagues to donate images of their bottles to help train the computer.
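
The Foundry has not released its training code, but the approach Morris describes, teaching a generic image model to answer “is this a medication bottle?”, can be sketched with standard tools. Everything below, from the folder layout to the model choice, is an illustrative assumption rather than the team’s actual setup.

# Illustrative sketch only: AT&T has not published its training pipeline.
# Assumes crowdsourced photos sorted into data/train/bottle and
# data/train/not_bottle (hypothetical layout).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("data/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a generic pretrained backbone and retrain only the final
# layer to answer one question: medication bottle or not?
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

Fine-tuning only the final layer keeps the data requirement small, which matters when the training set is a few hundred employee-donated photos rather than millions of web images.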

“We crowdsourced it,” Morris said. “A lot of employees run the spectrum of age, gender and ethnic background, so it was a good cross section.”

Users can activate the AI assistant by asking, “Hey Chloe, what medication is this?” The AI assistant scans the field around the user and finds the bottle of prescription medication. The glasses then read the label and convert that information into audio spoken into the user’s ear. The system also works for over-the-counter medication.
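
Aira has not published how “Hey Chloe” is wired together, but the three steps the article describes (find the bottle, read the label, speak the text) map onto off-the-shelf components. The following sketch assumes the Tesseract OCR engine plus the pytesseract and pyttsx3 libraries; read_label_aloud and the frame path are hypothetical names, not Aira’s API.

# Hypothetical end-to-end sketch of the described pipeline, not Aira's code.
# Requires the Tesseract engine plus: pip install pytesseract pillow pyttsx3
from PIL import Image
import pytesseract
import pyttsx3

def read_label_aloud(image_path: str) -> str:
    """OCR the label in a camera frame and speak it to the user."""
    frame = Image.open(image_path)

    # Step 1 in the article: decide whether a medication bottle is in view.
    # A real system would call a trained classifier here (see the training
    # sketch above); this illustration assumes one exists and reads directly.

    # Step 2: extract the printed text from the label.
    text = pytesseract.image_to_string(frame)

    # Step 3: convert the text to speech played into the user's ear.
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()
    return text

print(read_label_aloud("frame.jpg"))  # hypothetical captured frame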

The AT&T Foundry team learned a few things during this project. One challenge with machine learning is providing a varied data set from which the computer can learn. In addition, because most pill bottles are cylinders, the user often must rotate the bottles for the glasses to read the prescriptions correctly.
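
One plausible way to handle the rotation problem is to OCR several frames as the bottle turns and merge whatever text each view reveals. This is the author’s illustration of that idea, not the Foundry’s method; read_rotating_label and the frame names are made up.

# Illustrative only: merge OCR output from frames captured while the user
# rotates a cylindrical bottle, since no single view shows the whole label.
from PIL import Image
import pytesseract

def read_rotating_label(frame_paths: list[str]) -> str:
    seen: list[str] = []
    for path in frame_paths:
        text = pytesseract.image_to_string(Image.open(path))
        for line in text.splitlines():
            line = line.strip()
            # Keep each distinct line once, preserving first-seen order.
            if line and line not in seen:
                seen.append(line)
    return "\n".join(seen)

# Hypothetical frames grabbed as the bottle turns.
print(read_rotating_label(["turn_0.jpg", "turn_90.jpg", "turn_180.jpg"]))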

Team members also discovered that a large number of prescriptions come from independent pharmacies, so they had to train the computer to recognize different types of labels.

During the project, Morris was often asked why the team didn’t use photos from the Internet, where such images are readily available.

She said it was because those pictures are typically perfect, with the label always facing the right way.

“People who are visually impaired might not always pick up the bottle with the label facing them,” she said.

Aira’s Horizon smart glasses come with “Hey Chloe” and became available in May 2018. The glasses are already paired with an Aira-dedicated smartphone, powered by AT&T, for those who don’t own a smartphone.

“There is a lot of synergy between the work Aira is doing to connect blind users and human agents, and what AT&T is accomplishing to power that connectivity,” explained Greg Stilson, director of product management at Aira.

“AT&T has been a huge partner with us,” said Stilson, who is blind. “It stemmed from the need to have a partner who provided data. Imagine a constant video feed with a blind user connected to agents managing all of that data, with the ‘explorer’ using up that data on their own smartphone plan.”

“More than 35 percent of the interactions between agents and users involve some level of reading, which is why ‘Hey Chloe’ provides such an advantage,” he added.

The artificial intelligence platform also helps users locate pill bottles that have been misplaced. Users scan an area with the glasses and ask the AI agent to locate the bottle of medication. In turn, the glasses recognize the medicine label among other items and direct the user to the bottle.
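
The article doesn’t detail how the glasses direct the user, but a simple realization is to run an object detector over the camera frame, find a “bottle” box, and turn its position into a spoken hint. The sketch below leans on torchvision’s COCO-pretrained detector, whose label set happens to include “bottle”; it is an assumed stand-in, not Aira’s implementation.

# Hypothetical "find my pill bottle" sketch using a generic COCO detector.
import torch
from torchvision import transforms
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)
from PIL import Image

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
BOTTLE = weights.meta["categories"].index("bottle")  # COCO class list

def locate_bottle(image_path: str) -> str:
    frame = Image.open(image_path)
    tensor = transforms.functional.to_tensor(frame)
    with torch.no_grad():
        detections = model([tensor])[0]
    for label, box, score in zip(detections["labels"],
                                 detections["boxes"],
                                 detections["scores"]):
        if label.item() == BOTTLE and score.item() > 0.5:
            # Turn the box position into a spoken direction for the user.
            center_x = (box[0] + box[2]).item() / 2
            third = frame.width / 3
            if center_x < third:
                return "The bottle is to your left."
            if center_x > 2 * third:
                return "The bottle is to your right."
            return "The bottle is straight ahead."
    return "No bottle found; try scanning another area."

print(locate_bottle("room_scan.jpg"))  # hypothetical frame from the glasses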

“AT&T is helping Aira add new interactive abilities to ‘Hey Chloe’ so the AI program will be able to recognize other items and even tell the user to move closer to an object if it is blurry,” Stilson said.

“The overall goal is for people to have freedom and interaction,” he said. “It’s like having a sighted person in your pocket.”

“We have this beautiful AI and human interaction,” Stilson said. “The pill bottle is one thing, but we are moving toward being able to read any text out there. Imagine going through an airport, one of the most challenging environments, where you have to go from Point A to Point B, passing restaurants and restrooms. Soon, all you will have to say is ‘Chloe, read this.’ Text reading is opening up the world of print, and we are very excited about it.”
