How multisearch will boost the range of what you can do with Google and Lens

It will soon be possible to use the Google Lens application to refine your searches.

At its annual developer conference, Google revealed how multisearch, which consists of combining searches in both text and images, is about to change the habits of mobile users. In concrete terms, it will soon make it possible to broaden the scope of searches on Google and Lens, starting with a simple photo or screenshot.

Currently, the Google app already makes it possible to search with images and text simultaneously. Soon, it will also be possible to find additional, local information about your searches: users will be able to start from any photo and add the query "near me" to display nearby businesses offering the product shown. This could apply to searches for anything from a dress to a decorative item to a restaurant dish. The idea, of course, is to make searching as natural as possible.

To illustrate the scope of these possibilities, Google gives the example of a photo of a dish found on the internet, whose name you don't even know. Multisearch "near me" will find a restaurant in your vicinity that has this particular dish on its menu. To do so, Google's engine analyzes millions of images and reviews published by its community of Maps contributors. This type of multisearch will be available later this year in English; for other languages, users will have to be a little more patient.

But the most impressive aspect of multisearch undoubtedly involves the Lens application. Currently, this app can identify a product or a person. In the near future, it will be possible to obtain other types of detailed information directly on your smartphone's screen, in augmented reality. What Google calls "scene exploration" will allow users, for example, to identify a product on a store shelf according to their own specific criteria. When viewing a shelf of chocolate bars through a smartphone camera, it will be possible to filter out bars containing certain allergens, or those too low in cocoa, depending on your requirements. Again, this feature will roll out gradually.

Other highlights of the Google I/O conference include glasses that enable instantaneous translation in augmented reality, as well as numerous updates to Maps.

David Bénard
