Google Lens resurrects this feature from 2015

Google just pulled the wraps off some new and interesting Search, Lens, and Maps features at an event in Paris. One of the more notable ones comes to the company's AI-powered camera app: Google Lens will eventually be able to search what's on your screen, according to TechRadar.
Google Lens didn't fare too well as a standalone app, so the search giant switched up its strategy and implemented Lens into different corners of its ecosystem of services. It now lives in the Google Search bar and in Google Photos, and it can automatically scan screenshots. The tool has only grown more capable since its unveiling back in 2017.
Google Lens will search what's on your screen
The company showed a quick demonstration of the feature in action. The demo shows a phone with a video playing. The user accesses Google Assistant and taps a Lens icon that pops up. Lens then scans the screen and delivers search results for what it found. Essentially, when you tap that button, the phone takes a screenshot and Lens scans it.

In the coming months, we’re introducing a ✨major update✨ to help you search what’s on your mobile screen.
You’ll soon be able to use Lens through Assistant to search what you see in photos or videos across websites and apps on Android. #googlelivefromparis pic.twitter.com/UePB421wRY
— Google Europe (@googleeurope) February 8, 2023

In the video, the user held down the power button to access the Assistant. We’re not sure what voice command, if any, this feature will use when it launches. It might be something like “Hey Google, what’s on my screen?” or “Hey Google, search my screen.”
We saw something like this back in 2015
This seems like a nifty feature, but folks who followed tech back in 2015 might remember something similar. Before the days of Google Assistant, we had Google Now on Tap, a feature introduced with Android 6.0 Marshmallow.
It worked in much the same way as the new Lens feature. You’d hold the home button to access Google Now, and the software would analyze what was on your screen and surface contextual information about it. If you were in a text message conversation, it would scan the text and look for any way it could help out. Say you were scheduling dinner: it would give you search results for nearby restaurants. The list goes on.
The Lens implementation is a bit different, as you can imagine, and it’s much more sophisticated; the company has seven more years of AI and software development under its belt. Hopefully, this implementation won’t suffer the same fate as Google Now on Tap.

Source: androidheadlines.com