Google’s impressive multisearch tool will be available in a lot more languages

Google’s impressive multisearch tool, which lets you search using both an image and some text, will be expanding to more than 70 languages globally in the next few months, the company announced at its Search On event on Wednesday.

Multisearch uses Google Lens to make it easier to search for things that might be tricky to dig into with just text. Let’s say you see a jacket you like but want to find it in a different color. With multisearch, you can open up Google Lens in Google’s Android or iOS apps, snap a pic of the jacket, type out the color you want, and search. Bringing multisearch to many more languages will put the tool in far more people’s hands; it initially rolled out in April in a US-only beta and is currently available globally in English.

However, there is a new multisearch feature on the way that will be available first in the US. At this year’s Google I/O, the company previewed what it calls “multisearch near me,” which lets you find things locally. That could be useful if you’re looking for a certain food dish that might be available at a restaurant nearby, for example. At Search On, Google announced that this feature will be coming to the US sometime this fall.

Google showed off a few other handy-looking search features as well. Starting Wednesday, Google’s iOS app will show shortcuts under the search bar that surface powerful things search can already do, like translating text with your camera or humming to search for a song. (These shortcuts will be coming to the Android app “soon,” Cathy Edwards, Google’s VP and GM of search, said in a press briefing.)

The company is also introducing some new tools to help you discover more about a particular topic. For example, Google is building a feature that will suggest keywords and topic options you can click on to fill out your query as you start typing in the search box.

That might sound like autocomplete, but it seems like it will be a bit different in practice. You can get an idea of how it works in the GIF below. And Google can also show you information like the weather right under these suggestions. This feature will be launching in English in the US on mobile in the coming months.

And in the actual results, Google plans to show things in a more visual way. In one example shown to press, instead of just a list of links about Oaxaca, Google showed info boxes with things like the weather and a video taken in the city. These changes will also be rolling out on mobile, in English in the US, in the coming months.