Have you ever come across an object and wondered about its origins? Perhaps while browsing an antique shop, walking around your neighborhood, or sitting in your grandmother’s living room. If so, maybe there was someone you could ask to satisfy your curiosity. Someone who knew the object’s history. Where it came from. How old it was. What purpose it served. But maybe there wasn’t. Maybe your inquiry went unanswered. Leaving you to wonder about the object of your affection.
Well, in the near future you may not have to leave those questions to chance. Instead, thanks to Google, you’ll be able to point your phone’s camera at the object, right in the native camera app, and learn all about it. All courtesy of Google Lens.
According to Wired:
“When Google first announced Google Lens last year, it was described as a kind of search in reverse. Rather than type a text query to find image results, you could point your phone’s camera at an object, like a dog or a plant, to find text-based information. Lens was not only a statement about your camera as an input device but also a most Google-y expression of technology: It combined search, computer vision, AI, and AR, and put it all in apps that weren’t limited to one ecosystem.”
The article adds: “The new features, which roll out at the end of May, represent Google’s next steps to make your smartphone camera ‘like a visual browser for the world around you,’ says Aparna Chennapragada, vice president of product for AR, VR, and vision-based products at Google. ‘By now people have the muscle memory for taking pictures of all sorts of things—not just sunsets and selfies but the parking lot where you parked, business cards, books to read,’ Chennapragada says. ‘That’s a massive behavior shift.’”
A behavior shift that Google is now hoping to capitalize on with Lens as it further re-imagines what mobile search can and should be.
Is Google Lens the Greatest Idea Ever?