- Google Search Live is now available in more than 200 countries and territories, in 98 languages
- Search Live uses the new Gemini 3.1 Flash Live voice and audio model to enable “more natural” conversational search
- Audio responses contain links to the information source
Google has rolled out its AI-powered conversational search tool, Search Live, to more than 200 countries and territories globally, where it is available in 98 languages. Having first launched in the United States in September 2025, Search Live lets you point your phone or tablet's camera at something and ask questions out loud, such as what model of washing machine you have and how to use it.
The AI then replies with a spoken answer that is also captioned, and it keeps listening for clarifications and follow-up questions to mimic a natural conversation.
You can access Search Live through the Google app on Android or iOS by tapping the “Live” button below the search bar, placed between the AI Mode and Nano Banana buttons. It is also accessible via Google Lens and the dedicated Gemini app.
Google said the expansion was made possible by the launch of a new audio and voice model called Gemini 3.1 Flash Live, which it said is "inherently multilingual." The company also claims the model responds to queries faster and aims to deliver "more natural and intuitive conversations."
Analysis: good but not perfect
Search Live uses query splitting – an information retrieval technique that breaks a question into related sub-queries and searches beyond the literal question – to provide a more complete answer and reinforce the conversational back-and-forth.
When we tried Search Live in June last year, we noticed that the tool keeps working in the background, breaking queries down into related searches. My colleague Eric Hal Schwartz said that the answers "did not seem locked into a single answer form, even for relatively simple queries."
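For readers curious about how query splitting works in principle, here is a minimal, hypothetical Python sketch. The function names, expansion rules, and placeholder search call are illustrative assumptions only, not Google's actual implementation; they simply show how one spoken question might fan out into several related searches whose results are merged into a single answer with source links.

```python
from dataclasses import dataclass

@dataclass
class SearchResult:
    query: str
    snippet: str
    source_url: str

def split_query(question: str) -> list[str]:
    """Expand one question into related sub-queries (assumed expansion rules)."""
    base = question.rstrip("?")
    return [
        base,
        f"{base} user manual",
        f"{base} troubleshooting",
    ]

def run_search(query: str) -> SearchResult:
    """Placeholder for a real web search call."""
    return SearchResult(
        query=query,
        snippet=f"Top snippet for '{query}'",
        source_url="https://example.com",
    )

def answer(question: str) -> str:
    # Run every sub-query, then combine the snippets into one response.
    # A real system would synthesize these with a language model and attach
    # source links, as Search Live's audio responses do.
    results = [run_search(q) for q in split_query(question)]
    combined = " ".join(r.snippet for r in results)
    sources = ", ".join(r.source_url for r in results)
    return f"{combined} (Sources: {sources})"

print(answer("How do I use this washing machine?"))
```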
I tried it myself, testing it on my bike. Although Search Live correctly identified the specific model, its year of release, and even the reason behind its particular paint job, it failed to recognize that I had replaced the original wheelset with a third-party set, and it assumed the bike still had the integrated handlebars it originally shipped with. It also failed to correctly identify accessories like my taillight, water bottle, and bottle cages.
In a similar test, it failed to identify the Nothing Phone 4a Pro that was on my desk, instead calling it the Nothing Phone 2a. I asked Gemini Live the same question and received an identical answer.
It's understandable that some of the results were incorrect: the AI assistant relies on existing online sources, and new products won't necessarily have enough information online for the model to learn from. Even so, as it stands, it can handle quite a few general queries.
According to Google, more than 1.5 billion people were using Google Lens to identify objects around them as of June 2025, and there are approximately 750 million Gemini Live users. It will be interesting to see how widely Search Live is adopted globally and whether it becomes the default way to search for information online.































