{"id":421034,"date":"2025-12-03T04:34:11","date_gmt":"2025-12-03T04:34:11","guid":{"rendered":"https:\/\/www.europesays.com\/us\/421034\/"},"modified":"2025-12-03T04:34:11","modified_gmt":"2025-12-03T04:34:11","slug":"google-tests-merging-ai-overviews-with-ai-mode","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/us\/421034\/","title":{"rendered":"Google tests merging AI Overviews with AI Mode"},"content":{"rendered":"<p id=\"speakable-summary\" class=\"wp-block-paragraph\">As OpenAI <a href=\"https:\/\/www.wsj.com\/tech\/ai\/openais-altman-declares-code-red-to-improve-chatgpt-as-google-threatens-ai-lead-7faf5ea6?gaa_at=eafs&amp;gaa_n=AWEtsqfMT3qTSlvhowQFmOjRpLZ7QIOR0qfLHu_GMZYCE1yjzHnWTi5DaW0oV9gRDoc%3D&amp;gaa_ts=692f5dc6&amp;gaa_sig=w-Cu9Vo2SEqidTuhPuzS1r3J0b_PNokOUlZpjXoT-hMig3Tc_T2OlDxE29UqrIaB5WjJu6u5m6ybuNUteJqDTA%3D%3D\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">goes into \u201cCode Red<\/a>\u201d over competitive pressures, Google announced it has begun testing a new feature that merges its <a href=\"https:\/\/search.google\/ways-to-search\/ai-overviews\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">AI Overviews<\/a> with AI Mode in Search. That means users who see the now-familiar AI-generated snapshot summarizing a topic or question above their search results can choose to go deeper by asking follow-up questions in a conversational interface.<\/p>\n<p class=\"wp-block-paragraph\">Google calls this conversational feature <a href=\"https:\/\/search.google\/ways-to-search\/ai-mode\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">AI Mode<\/a>. It <a href=\"https:\/\/techcrunch.com\/2025\/05\/20\/googles-ai-mode-rolls-out-to-us-will-add-support-for-deeper-research-comparison-shopping-and-more\/\" target=\"_blank\" rel=\"noopener\">launched to U.S. 
users this May<\/a>, and <a href=\"https:\/\/techcrunch.com\/2025\/08\/21\/googles-ai-mode-expands-globally-adds-new-agentic-features\/\" target=\"_blank\" rel=\"noopener\">to global users this August<\/a>, allowing for back-and-forth chats with Google\u2019s Gemini AI, in an experience similar to ChatGPT.<\/p>\n<p class=\"wp-block-paragraph\">However, accessing that experience has so far required you to decide in advance what kind of question you were about to ask. If it were a more traditional search query, or one where you could expect a quick answer, you\u2019d likely stick with typing into the search box as usual.<\/p>\n<p class=\"wp-block-paragraph\">But if you expected to ask more questions or explore a topic in more detail, you\u2019d have to click over to the AI Mode tab to start chatting with the AI instead.<\/p>\n<p class=\"wp-block-paragraph\">Google now wants to test whether it makes sense to keep the two experiences separate. After all, the process of information seeking can often lead to a desire to learn more. 
You may have thought you were starting a simple query, only to find yourself delving deeper into the topic.<\/p>\n<blockquote class=\"twitter-tweet\" data-width=\"500\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">(1\/2) Today we\u2019re starting to test a new way to seamlessly go deeper in AI Mode directly from the Search results page on mobile, globally.<\/p>\n<p>This brings us closer to our vision for Search: just ask whatever\u2019s on your mind \u2013 no matter how long or complex \u2013 and find exactly what you\u2026 <a rel=\"nofollow\" href=\"https:\/\/t.co\/mcCS7oT2FI\">pic.twitter.com\/mcCS7oT2FI<\/a><\/p>\n<p>\u2014 Robby Stein (@rmstein) <a rel=\"nofollow noopener\" href=\"https:\/\/twitter.com\/rmstein\/status\/1995572911093289055?ref_src=twsrc%5Etfw\" target=\"_blank\">December 1, 2025<\/a><\/p><\/blockquote>\n<p class=\"wp-block-paragraph\">With the new test, announced on Monday, Google says users will be able to \u201cseamlessly go deeper\u201d in AI Mode directly from the Search results page. While the test is rolling out to users globally, it\u2019s only available on mobile devices for the time being. <\/p>\n<p class=\"wp-block-paragraph\">The rollout comes alongside a push inside Google\u2019s AI rival, OpenAI, which is now delaying other products to focus on improving the chatbot experience. Thanks in part to the release of the Nano Banana image model and other improvements, Gemini has grown <a href=\"https:\/\/blog.google\/products\/gemini\/gemini-3\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">to over 650 million monthly users<\/a> as of November. 
Merging the conversational mode with AI Overviews, which <a href=\"https:\/\/techcrunch.com\/2025\/07\/23\/googles-ai-overviews-have-2b-monthly-users-ai-mode-100m-in-the-us-and-india\/\" target=\"_blank\" rel=\"noopener\">has 2 billion monthly users<\/a>, could give Gemini an edge in consumer adoption.<\/p>\n<p class=\"wp-block-paragraph\">Notes VP of Product for Google Search Robby Stein, <a href=\"https:\/\/x.com\/rmstein\/status\/1995572911093289055\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">in a post on X<\/a>, \u201cYou shouldn\u2019t have to think about where or how to ask your question.\u201d Instead, he explained, users will continue to get an AI Overview as a helpful starting point, but will then be able to ask conversational follow-up questions in AI Mode from the same screen.<\/p>\n<p class=\"wp-block-paragraph\">\u201cThis brings us closer to our vision for Search: just ask whatever\u2019s on your mind \u2013 no matter how long or complex \u2013 and find exactly what you need,\u201d Stein wrote. <\/p>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n","protected":false},"excerpt":{"rendered":"As OpenAI goes into \u201cCode Red\u201d over competitive pressures, Google announced it has begun testing a new feature&hellip;\n","protected":false},"author":3,"featured_media":421035,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[7],"tags":[302,39852,2722,158,67,132,68],"class_list":{"0":"post-421034","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-technology","8":"tag-chatgpt","9":"tag-gemini","10":"tag-google","11":"tag-technology","12":"tag-united-states","13":"tag-unitedstates","14":"tag-us"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@us\/115653775504768102","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/421034","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/comments?post=421034"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/421034\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media\/421035"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media?parent=421034"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/categories?post=421034"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp
-json\/wp\/v2\/tags?post=421034"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}