COOL BLIND TECH

Google’s New Gestures Control Our Screens Without Us Touching Them


With its next phone, Google wants to introduce the next big thing: a way to control our screens without touching them.

When Google’s next flagship smartphone, the Pixel 4, arrives this fall, it will respond to a series of gesture interactions—a pinch of the fingers, or a wave of the hand—without the user ever needing to touch the screen. Taken together, Google calls these controls “Motion Sense.” A teaser video shows a woman unlocking her new Pixel with a glance, then waving her hand to cycle through a series of songs playing on her phone.

When the Pixel 4 comes out, it will only have a few gesture controls: snoozing alarms, skipping songs, silencing phone calls. But by the time Pixel owners get used to pinching their fingers together and rotating their thumb on invisible dials, a seismic shift will already be underway. Gesture technology will further turn our devices into extensions of ourselves; we move our fingers, and the feedback shows up on a screen. That type of interaction won’t end with phones. One day, we might control every screen with a flick of the wrist.

Google’s gesture technology is merely a glimpse of a touchless future. Just as the iPhone taught millions of people to interact with their world by tapping and swiping, the Pixel may train us in a new kind of interaction, changing how we expect to engage with all of our devices going forward.

Ivan Poupyrev, the technical projects lead at Google’s Advanced Technology and Projects division, has been working toward that future for years. Five years ago, Poupyrev founded Project Soli, a skunkworks lab at Google to invent better gesture controls using miniaturized radar technology. The team developed a chip smaller than a nickel and studded with sensors. Those sensors emit electromagnetic waves that pick up motion, then translate it into data. Google says the technology works better than 3D cameras at capturing fine motions, like pinching two fingers together to “press” a button on the screen.

In January, the Federal Communications Commission gave the technology its nod of approval, noting that the Soli chip “will serve the public interest by providing for innovative device control features using touchless hand gesture technology.” The Pixel 4, expected to land in October, will be its commercial debut.

But Google’s vision goes beyond just consumer convenience. As Brandon Barbello, a product manager for Pixel, put it in a blog post, the initial capabilities of Motion Sense “are just the start.” Poupyrev has already worked on efforts to create connected textiles, like a denim jacket that lets you answer a phone call with a swipe of the sleeve. And the Soli chip could just as easily slide into the crown of a smartwatch, or the visor of a VR headset, or the dashboard of a smart car.

If Google gets it right, the Pixel 4 won’t just be a popular smartphone. It will be the start of a new kind of interaction with our devices.
