Google Is Working On New Upgrades For Its Core Search App Where People Can Annotate The Real World

While Meta is busy developing and promoting its metaverse as the next big thing in the digital world, search engine giant Google is hardly taking a back seat.

Yes, the metaverse could well become the next big discovery engine, but Google has plenty of major plans of its own.

We’ve got news that Google is currently working on a number of new features for its core Search app that will enable users to annotate the real world, at any place and time.

The company’s technical staff recently unveiled the news during a keynote presentation at the Cannes Lions festival.

Image: Orkhan Kerimov / Twitter
For nearly 24 years, Google has been at the head of the game when it comes to organizing the world’s information. Over time, both the volume and the variety of information on offer have grown.

Think along the lines of nearly 25 million indexed pages in the early days, all the way up to billions today. And the company is still going strong. Let’s not forget that every day, Google receives search queries it has never seen before, which means people keep turning to it with questions that have never been answered.

Google’s Pandu Nayak revealed that the company is working on a new, soon-to-launch way of searching called ‘Multisearch’. The hope is that it will advance the company’s vision of letting people search ‘any way’ they like and help users get better results.

To help the world better understand its plans, the search engine giant highlighted its ‘hum to search’ feature at Cannes, with some assistance from Nayak’s daughter.

Nayak admitted he struggles to carry a tune himself, so his daughter hummed one instead, and the feature identified it in real time. All it took was a short hum for Google to recognize the famous hit Ocean Eyes by Billie Eilish.

The song was then played over the Cannes Palais sound system, accompanied by live musicians who were among those in attendance.

To many people’s surprise, Nayak says the feature is already used more than 100 million times a month, and it is set to become a core part of the app’s multisearch as well.

Google has been working long and hard to roll this feature out as soon as possible, with launches planned in several parts of the world and support for more languages and cultures along the way.

Google has long been keen on innovation, and that is exactly what Nayak demonstrated by combining Google Lens visual search with a new multisearch text query. The end result is far better product discovery.

Next, Google demonstrated a new multisearch capability called ‘Search Scene’ that lets users scan a real-life display, such as a library shelf, and see annotations on every visible item, including goods that may be on sale and available for interested buyers to purchase.

In April, Google called this one of the biggest and most important updates to Search in years. After all, what other tool lets you snap an image and ask a question about it at the same time?

Lastly, Google spoke about a new feature called ‘multisearch near me’, which it hopes will help users find what they are looking for from local sellers nearby. The feature is confirmed to launch globally later this year, in English first, with other languages to follow soon after.
