At its I/O 2022 event, Google announced that its recently introduced multisearch feature will soon expand to include local search results, letting people find nearby sellers of the products they're looking for. The move is part of the search giant's ongoing effort to give people more natural ways to express what they need. "In the Google app, you can search with images and text at the same time, similar to how you might point at something and ask a friend about it," wrote Prabhakar Raghavan, a Senior Vice President at Google, announcing the new feature. "Now we're adding a way to find local information with multisearch, so you can uncover what you need from the millions of local businesses on Google."

Center of the Universe

Google announced multisearch in April, calling it one of the most significant changes to Search in several years. As Raghavan illustrated at Google I/O, the feature lets you search for things you can't easily describe with words, like an unfamiliar part of a leaky faucet. Multisearch leverages Google Lens' image identification abilities, enabling people to search with a picture and then refine the results by adding context with text. For instance, a user could snap a photo of a jacket, then add text asking Google to find it in a different color. From there, they could visit the retailer's website and buy the jacket in the color they want.

The expanded multisearch announced at Google I/O 2022 takes that shopping experience offline by letting you search for local businesses: just add the words "near me" to an image query. As Raghavan explained, the next time you see a dish you want to try but don't know its name, you can take a photo with Google Lens and find nearby restaurants that serve it. In its announcement, Google explains that multisearch near me works by scanning "millions of images and reviews posted on web pages" and combining that information with Google Maps data to surface local results. The feature will first be available in English later in 2022 and will eventually roll out to other languages around the world.

Shopping Reimagined

The more interesting addition to multisearch announced at Google I/O 2022 is the ability to search within a scene. Demonstrating the feature, dubbed Scene Exploration, Raghavan said it will let people pan their phone across a broader scene to learn about multiple objects at once. Calling it a Ctrl+F (the popular shortcut for the Find command) for the world around you, Raghavan suggested the feature could be used to scan the shelves at a bookstore and bring up relevant insights, or to hunt down the best nut-free dark chocolate in a fraction of the time it would take to find one by combing through the aisle manually.

From a broader perspective, Google's multisearch features not only enhance but also speed up the online shopping experience. They tie into the broader trend of "contextual shopping," which infuses purchase opportunities into everyday activities and natural environments, enabling people to buy anything, anytime, anywhere.

Yoni Mazor, Chief Growth Officer of GETIDA, a technology solutions company, believes Google's new multisearch features are a step toward delivering that enhanced shopping experience. In an email exchange with Lifewire, Mazor explained that an ideal contextual shopping experience is one in which people can quickly take a snapshot of a product of any kind (food, garment, shoe, etc.) and have the results streamlined toward the best shopping option available, factoring in the best price, nearest location, best reviews, and overall experience. "If the basic infrastructure of the technology is laid out now and will get better and refined in the future, there is definitely a place for contextualized shopping to become a dominant way for consumers to shop online," opined Mazor.