Google appears to be looking to add a slew of new gestures to Wear OS, following the sighting of a new patent awarded to the company by the USPTO. Driven by an optical sensor and what appear to be underlying AI vision algorithms, the features would rely on gestures made by the user’s hands. Specifically, that’s the hand attached to the wrist currently wearing the smartwatch.
There are three gestures in total described by the patent via associated images. Those are a hold, release, and tap gesture. The implication is that users would be able to interact with just about every watch functionality without ever touching the screen.
Both of the first two gestures appear to be related. With the newly patented technology, users can clench their fist to trigger a “hold” gesture. Releasing that grip then triggers a “release” gesture. The tap gesture, if Google ever gets around to using the new technology, starts with an open palm. Users then tap their thumb against their fingers before opening their hand again, which registers as a “tap” gesture.
Google is no stranger to gestures, even outside of Wear OS
This new patent follows a long history of gesture additions and improvements from Google, including some in Wear OS. The latest features most closely resemble the Google Pixel’s Project Soli, which relies on sensors to detect minute hand and finger motions. Only found on the Pixel 4 and 4 XL handsets in some regions, that feature allows for everything from music controls to device waking and more.
The key difference is that Project Soli uses a radar-based chipset, while the new patent describes an optical sensor. But Google has been doing gestures on Wear OS for a long time too. In fact, the company added gesture support based on built-in hardware fairly early on, most recently improving that functionality in Wear OS 2.1.
Those early features relied mostly on sensing the motion of the wrist and the watch itself. The movements were tracked to allow simple navigation and selections but didn’t allow for deeper navigation into sub-menus once an option was selected.
With the new gestures, Google could break Wear OS away from that drawback. In short, it could allow for deep navigation and control across apps, well beyond the main system menu. These new gestures would arguably be most powerful when combined with the previous wrist-turning gestures, which are chiefly used for scrolling.
This is meant to be user-friendly for every wearable user
One of the key factors in this new patent is that these gestures will work regardless of how the watch is worn. That’s going to prove useful for a number of reasons. Not least of all, Google has still not released its own wearable. And not every user wears a Wear OS smartwatch the same way.
The gestures are shown to work regardless of whether the watch is worn higher or lower on the wrist. They’ll also work if the watch is positioned on the inside of the wrist rather than the outside. That would make the features universally available, as long as the OEM includes an optical sensor.
Of course, there is a chance Google will go the Pixel route and keep the feature exclusive. The search giant bought smartwatch technology from Fossil in early 2019. That was followed first by rumors and then a confirmation that it would buy Fitbit later in the year. The speculation around those purchases is that Google plans to compete directly in the wearable hardware space.
Google’s latest wearable patent points to a new set of optical-sensor hand gestures
The post Google Patent Showcases AI Vision-Based Wear OS Gestures appeared first on Android Headlines.
