We introduced AI Overviews at last year’s I/O, and since then, people have changed how they use Google Search. More users are coming to Google with a wider range of questions, including longer, more complex ones that combine different types of input.

AI in Search makes it easier to ask Google anything and get answers with web links. That’s why AI Overviews is one of our most successful Search launches in the past 10 years. People using AI Overviews are more satisfied with their results and search more often. In the US and India, AI Overviews have driven over 10% growth in Google usage for the types of queries that show them. Once people try AI Overviews, they often return for more searches, and this growth continues to rise. We are also making sure AI Overviews responds as fast as people expect, delivering some of the fastest AI answers in the industry.

We are advancing Search with AI. Today at I/O, we shared our latest updates shaping Search’s future. Here is what we announced.

AI Mode In Search: Bringing You The Latest AI Features 

After we introduced AI Overviews, power users asked for a full AI search experience. Earlier this year, we tested AI Mode in Search Labs, and we are now launching it in the US, with no Labs sign-up needed. AI Mode is our most advanced AI search, with stronger reasoning, support for all input types, and the ability to go deeper through follow-up questions and helpful web links. Over the coming weeks, look for a new AI Mode tab in Search and in the Google app’s search bar.

AI Mode uses our query fan-out technique, breaking your question into smaller subtopics and sending multiple searches at once on your behalf. This helps Search go deeper into the web, finding more relevant content for your question.
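The fan-out idea described above can be sketched in a few lines: decompose the question into subtopics, issue one search per subtopic concurrently, and merge the results. This is only an illustrative sketch, not Google’s implementation; the splitting heuristic and the `search` stub are hypothetical stand-ins (a real system would use a language model to decompose the query and a search backend to answer each part).

```python
from concurrent.futures import ThreadPoolExecutor


def split_into_subqueries(query: str) -> list[str]:
    # Hypothetical heuristic: split on "and". A production system would
    # use a language model to break the question into focused subtopics.
    return [part.strip() for part in query.split(" and ")]


def search(subquery: str) -> list[str]:
    # Stand-in for a real search backend call.
    return [f"result for: {subquery}"]


def query_fanout(query: str) -> list[str]:
    """Run one search per subtopic concurrently and merge the results."""
    subqueries = split_into_subqueries(query)
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(search, subqueries)
    # Flatten the per-subtopic result lists into a single ranked pool.
    return [hit for hits in result_lists for hit in hits]
```

Because each subtopic search is independent, running them in parallel means the overall latency is close to that of a single search rather than the sum of all of them.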

AI Mode is where we’ll launch Gemini’s newest features and preview what’s next. As we get feedback, we’ll move many features from AI Mode into the main Search experience. Starting this week, we are bringing a custom version of Gemini 2.5, our smartest model, into Search for both AI Mode and AI Overviews in the US.

We also previewed advanced features coming to AI Mode. These will launch in Labs over the next few months as we gather feedback. Keep reading to learn more.

Deep Search In AI Mode: Designed To Help With Research 

If you need more depth, we are adding Deep Search capabilities to AI Mode. Deep Search takes the query fan-out technique further: it runs hundreds of searches, integrates information across sources, and creates a fully cited, expert-level report in minutes, saving you hours of research.

Live Features In Search: Get Real-Time Help 

We’ve improved visual search with Google Lens, now used by over 1.5 billion people each month. Now we’re bringing Project Astra’s live capabilities to Search. With Search Live, you can chat with Search in real time about what you see through your camera.

If you are stuck on a project, tap the Live icon in AI Mode or Lens, point your camera, and ask your question. Search will talk through what it sees, explain concepts, offer suggestions, and provide links to resources such as websites, videos, and forums.

Agentic Features: Let Search Complete Tasks for You 

Many people use Search to get things done. So we’re bringing Project Mariner’s agentic capabilities to AI Mode for tasks like buying tickets. For example, ask it to find two affordable tickets for this Saturday’s Reds game in the lower level. AI Mode will search across sites, check hundreds of ticket options with real-time prices and availability, and fill out forms for you. It will show you ticket options that meet your criteria, and you can complete the purchase on your chosen site. This saves you time while keeping you in control.

We are starting with event tickets, restaurant reservations, and local appointments. We are also teaming up with companies like Ticketmaster, StubHub, Resy, and Vagaro to make these experiences work smoothly.

Meet Your New AI Shopping Partner: Designed To Help You Find The Perfect Products

The new AI Mode shopping experience blends Gemini model capabilities with our Shopping Graph, helping you browse ideas, compare options, and select products. You can upload a photo of yourself to virtually try on billions of apparel listings. When you find something you like, use our new checkout with Google Pay to buy when the price is right — always with your input and approval. For more information, see our shopping post.

Soon, AI Mode will suggest personalized ideas based on your past searches for an even more tailored experience. You can also connect other Google apps, such as Gmail, to provide more personal context. For example, if you search for things to do in Nashville this weekend with friends who are big foodies and like music, AI Mode can recommend restaurants with outdoor seating based on your previous bookings and searches. You will also get event suggestions near your hotel, drawn from your flight and hotel confirmations. You will always know when AI Mode is using your personal context to help; you are in control and can connect or disconnect this feature whenever you want.

Custom Charts And Graphs To Help You Visualize Data 

If you need help with numbers or visualizing data, AI Mode can analyze complex datasets and create custom graphics to answer your questions. For example, if you want to compare the home-field advantage of two baseball teams, your search will return an analysis and an interactive graph built on Google’s real-time sports data. This feature will be available for sports and finance questions.

Finally, AI Mode is rolling out to everyone in the US today. All the new features we showed at I/O will be available to Labs users in the coming weeks and months. If you want to try them first, turn on the AI Mode experiment in Labs.
Source: https://blog.google/products-and-platforms/products/search/google-search-ai-mode-update/#shopping 
