Why Google Lens beats its Apple rival and is ‘definitely’ coming to smart glasses

Google Lens is that rare beast – an exciting Google innovation that, rather than exploding onto the scene and then quietly fizzling out (see Killed by Google), has steadily grown into a genuinely useful tool that should definitely be part of your phone ninja skillset.

Not familiar with its powers? The simplest definition of Google Lens is that it’s a search engine for the real world – rather than typing your query into a box, you point your phone’s camera at an object, building or scene, and Lens uses image recognition tech to tell you more about it.

But it also does a lot more than that – and as we discovered in a fascinating chat with Google Lens guru Lou Wang (official job title: Director of Product Management), it’s only just getting started. The timing is apt, too, because a familiar rival, Apple, has just pitched up right next to it.

At WWDC 2021 this week, Apple announced two new iOS 15 features – ‘Live Text’ and ‘Visual Look Up’ – that are effectively its version of Google Lens. A common refrain with Apple, true or not, is that it arrives fashionably late, bringing refined versions of ideas that have been test-driven by someone else.

But is that the case with Google Lens, and what does Google think of Apple’s strangely familiar take on visual search? More importantly, when are we going to see Lens make the leap to smart glasses? Here’s what Google’s Lou Wang told us in a chat that wove through next-gen walking tours, the privacy concerns of visual search, and unusual uses for Google Lens in bars.

Fast learner

Google Lens was launched back in 2017, but has its roots in an older (and now retired) app called Google Goggles. Four years is a long time in tech, and Google Lens’ powers have grown quietly but steadily since Lou Wang started working on the project at its inception.

“When we first started, we were very simple. For example, we could read text from the physical world. But we’ve come a really long way in the time between then and now,” he told us. What’s fueled that growth? “It’s based on a few things. One is just machine learning and AI, which is something that Sundar [Pichai, Google CEO] talks about a lot. Even our ability to have hardware that can actually process this information has continued to grow in leaps and bounds,” he added.

“When we launched, we said ‘we can understand millions of objects’. And after a year and a half, we were at ‘oh, now we can understand a billion objects’. And then two years from there we were at 15 billion,” he said. “The usage that we’ve seen from Lens has grown from essentially zero to now about 3 billion times a month, and it’s continuing to grow.”

That’s a lot of people, considering that holding up a phone camera to search the real world still isn’t something that comes naturally to most of us. The lack of any real rivals to Google Lens has helped, of course, so what does Google think of Apple’s new take on visual search?
