17 Sources
[1]
Google's AI Mode Can Now Give You More Visual Inspiration
Google's AI Mode in Search will start letting users ask questions and get images as answers, Google said in a blog post on Tuesday. AI Mode is the tab in Google Search that allows people to engage directly with a specially tuned version of Google's AI, without needing to go to ChatGPT or Gemini. Google says this visual AI search feature update allows people to find inspiration. Instead of jumping over to Google Images to find relevant photos, you can do all of that in AI Mode. Google offered the example of someone wanting design inspiration for their bedroom who asks to see maximalist bedroom photos. Images are linked, so users can click out and learn more about that image. Users can also iterate on their searches, asking for bright pink tones or bold fonts. AI Mode's visual search capabilities are rooted in Google Lens, which lets people use phone cameras to search for photos or similar items online. AI Mode will also allow people to upload their own photos and search that way. Along with being able to see more visuals in AI Mode, Google says it's improving shopping features. Rather than having to sort through filters, shoppers can just say, "barrel jeans that aren't too baggy" and add "I want more ankle length." AI Mode will then list retailer sites where those jeans can be had. Image search in AI Mode brings Google another step closer to making its core online search product an AI product. As people become more accustomed to searching based on intent with AI synthesizing information in the background, the idea of typing in keywords and looking through ten blue links is becoming a relic. Google is leaning heavily on AI to stay competitive with ChatGPT maker OpenAI and to meet Wall Street's pressure to remain a top player. Google recently hit the $3 trillion market cap milestone, the fourth company to do so. Already, Google's integrated AI Overviews add AI-generated responses to the top of search on some queries, pushing the classic ten blue links further down the page. Some studies suggest that as searchers get more answers through AI, they are less likely to click on links, which can harm sites that rely on Google Search traffic to survive. Google, however, argues that traffic going to sites via AI Overviews tends to be higher quality, with these types of readers being more engaged. That has put Google in a difficult spot with publishers, on whom the search giant has relied for years to fill its search results. Some publishers believe their content is being captured and used to train Google's AI without their permission. Now, publishers are going so far as to block various AI crawlers and are signing licensing deals for their content with players like OpenAI and Perplexity. Recently, Rolling Stone owner Penske Media sued Google over AI Overviews. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
[2]
Google's AI Mode image search is getting more conversational
Google's latest AI Mode update makes it easier for users to search through images using vague descriptions and references. The update provides a more conversational way to shop for products online, according to Google's announcement, allowing AI Mode users to "describe what you're looking for -- like the way you'd talk to a friend," instead of using filters to refine Search results by color, size, or brand. "Starting today, you can ask a question conversationally and get a range of visual results in AI Mode, with the ability to continuously refine your search in the way that's most natural for you," Google said in its announcement blog. "You'll see rich visuals that match the vibe you're looking for, and can follow up in whatever way is most natural for you, like asking for more options with dark tones and bold prints." For example, AI Mode users can look for "barrel jeans that aren't too baggy" and then refine the results with additional requests like "I want more ankle length," or "show me acid-washed denim." AI Mode will "intelligently provide a relevant set of shoppable options," according to Google, allowing users to easily visit retailers' sites to purchase items in Search results. Users can also start their search by uploading a reference image or snapping a photograph to fetch visually similar results, or use a mix of images and descriptions to refine their search. The conversational search update works for general visual exploration too, such as finding images for interior design inspiration. The AI Mode visual search upgrade is rolling out in English to US users this week, so it might take a few days for the new capabilities to appear. Google says the update builds on Google Search with Lens and Image search and Gemini 2.5's advanced multimodal and language capabilities, allowing AI Mode to recognize subtle details and secondary objects in images to better understand visual context and deliver more nuanced image search results.
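To make that refinement loop concrete, here is a minimal Python sketch of how a context-preserving, conversational image search could be structured. It is an illustration under assumptions, not Google's actual implementation: SearchSession and run_image_search are invented placeholders, and the only point is that each follow-up ("I want more ankle length") is interpreted together with the earlier request rather than as a fresh keyword query.

```python
from dataclasses import dataclass, field


def run_image_search(intent: str) -> list:
    # Placeholder for the real retrieval backend (not an actual Google API).
    print(f"searching images for: {intent}")
    return []


@dataclass
class SearchSession:
    """Toy sketch of a conversational image-search session."""
    turns: list = field(default_factory=list)  # accumulated user requests

    def ask(self, request: str) -> list:
        # Keep every refinement so "I want more ankle length" is understood
        # in the context of "barrel jeans that aren't too baggy".
        self.turns.append(request)
        combined_intent = "; ".join(self.turns)
        return run_image_search(combined_intent)


# Example refinement flow from the article:
session = SearchSession()
session.ask("barrel jeans that aren't too baggy")
session.ask("I want more ankle length")
session.ask("show me acid-washed denim")
```

In a real system the accumulated turns would feed a language model rather than a simple string join, but the shape of the interaction is the same.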
[3]
Google's big AI Mode update adds visual search and shopping - how to try it
This is another attempt to sway users to use Google's AI search engine. Google introduced its AI Mode in the spring, intended to be a spin on the classic Google Search experience that incorporates AI's conversational capabilities into the search engine. Google is now updating the experience to be more visual and even help you shop. On Tuesday, Google announced that AI Mode would now display a range of visual results when users input a conversational search prompt. Once an image is displayed, users can ask follow-up questions about the image or click on the link to learn more. Since the experience is multimodal, users can also start the search experience by uploading an image. Google attributed the "breakthrough in visual search" to its new visual search fan-out technique, which is an approach similar to the query fan-out technique that powers the AI Mode experience. In the same way that AI Mode breaks down a prompt into related queries to best meet your needs, the visual version analyzes an image and runs multiple queries in the background. Google also launched a new Shopping experience in which users can conversationally ask for a specific product, and AI Mode will show users visual shopping results. If the result isn't what the user intended, they can ask follow-up queries. Once the right product is found, the user can click through to the retailer's website to purchase it. This approach differs from OpenAI's AI-fueled shopping experience, called Instant Checkout. With this feature, users can place an order directly within a conversation with the chatbot, using the same conversational experience to find a product and, most importantly, without having to leave the chatbot interface to complete the order. All Google users are welcome to try AI Mode for free. You can access AI Mode by visiting google.com/ai, which brings you straight to the AI Mode page, or by typing something into the Google Search box and then clicking on the AI Mode logo to the right, highlighted by the colorful ring.
[4]
Google's AI Mode adds images as search giant tries to keep pace with rivals
The search giant rolled out AI Mode in the U.S. in May, launching primarily as a text-based tool that could answer questions using natural language. Now, if users are looking for inspiration or shopping help, AI Mode can also generate results in the form of images. Google has been racing to find ways to integrate generative AI into its search engine since OpenAI rocked the tech sector with the launch of its chatbot ChatGPT in late 2022. As its AI rivals have changed how users can seek information online, Google has had to keep up. The company's updated visual results will provide users with a whole new set of use cases for AI Mode, especially since some queries aren't suited for text-based answers, said Robby Stein, vice president of product management at Google Search.
[5]
Google's AI Mode gets better at understanding visual prompts
Since it began rolling out, Google has been slowly adding features to its dedicated search chatbot. Today, the company is releasing an update it hopes will make the tool more useful for visual searches. If you've tried to use AI Mode since Google made it broadly available, you may have noticed it responds to questions about images with a lot of text. Robby Stein, vice president of product for Google Search, admits it can be "silly" to see text in that context, and so the company has been working on applying AI Mode's "query fan-out" technique to images. Now, when you prompt AI Mode to find you images of "moody but maximalist" bedrooms, for instance, it's better equipped to respond to that request, with an algorithm that will run multiple searches in the background to get a better understanding of exactly what it is you're looking to find. Google has built this feature to be multimodal, meaning you can start a conversation with an image or video. And as you can probably guess, Google believes these capabilities will be particularly useful in a shopping context. You could use AI Mode to shop before today, but Google argues the experience benefits greatly from the more visual responses the chatbot is able to generate. What's more, it's better able to make sense of tricky queries like "find me barrel jeans that aren't too baggy." Once AI Mode generates an initial response, you can ask follow-up questions to refine your search. As with any Google update, it may take a few days for the company to roll out the changes it announced today to everyone. So be patient if you don't see the new, more visual experience right away.
[6]
Google Search AI Mode just got way more visual
Today, Search users in the US can ask a question in AI Mode and get back visual results. Similar to clicking on the Images tab in Search, you'll now be able to get a range of images in response to your query. Each of these images will have a link you can follow to learn more. And you'll be able to ask follow-up questions to further refine your search. Google adds that you can also start an image search in AI Mode by uploading a picture or snapping a photo.
[7]
Google's AI Mode makes search more visual, conversational, and shopping-friendly
In the past, searching for something like "cool chair" on Google often led to a mess of unrelated results and endless filters. Now, Google is changing the way people use Search with a major update to AI Mode, making it more visual, conversational, and flexible. You don't have to rely on exact keywords or awkward filters. According to Google's blog post, you can interact with Search in a more natural way. For example, if you want ideas for a new bedroom, you can type or say something like "maximalist room ideas," and AI Mode will quickly show you a stream of visual inspiration. You can keep refining your search by asking for things like darker tones, a cozier feel, or bold prints until you find what you like. This update lets you mix images and text when searching. Snap a photo of something you like, add a few words to describe what you're after, and Search adapts on the spot. That's especially useful for moments when you don't know the name of an item. Google's latest image understanding upgrades mean the system can now pick up on fine details in photos, like textures, patterns, or secondary objects, so results feel much more precise. On the shopping side, Google is tying these visual searches directly into its Shopping Graph, which connects users with product listings, updated pricing, and reviews in real time. It's designed to cut down on the number of clicks between spotting an idea and finding something you can actually buy. A new technique called "visual query fan-out" broadens results beyond perfect matches, surfacing items and inspiration that are contextually related but not always exact duplicates. Underneath all this is Google's Gemini 2.5, which powers AI Mode with stronger multimodal reasoning across text and images. The model has been optimized for faster responses and better efficiency, which not only improves Search but also trickles down into other Google services like AI Studio, Vertex AI, and the Gemini app. For now, the AI-powered visuals are rolling out in English to US users, but expansion is already lined up for Hindi, Japanese, Korean, Indonesian, and Brazilian Portuguese. Beyond Search, Google is also weaving these improvements into Chrome, the Gemini app, and developer tools like AI Studio and Vertex AI, pushing the same brainpower across its ecosystem. For users, search now feels more like exploring ideas with someone who understands both your words and your images. For brands and content creators, visuals, context, and structured data will become even more important for being discovered.
[8]
Google AI Mode adding images to boost visual, shopping results
Google is updating AI Mode to offer a more visual experience when searching for inspiration and shopping. AI Mode responses can now show images alongside text for a richer experience. Getting visual inspiration and shopping are the primary use cases touted today. For example, asking "show me maximalist inspo [inspiration] for my bedroom" will return "rich visuals that match the vibe you're looking for." A key part of this experience is the ability to conversationally refine results, like requesting: "More with dark tones and bold prints." Your follow-ups don't have to be any more specific than that, as chat context is preserved. Each image result is a link that you can visit/expand and "Add to AI conversation." You can also start prompts by uploading an image. On the shopping front, Google describes this new capability as letting you "describe what you're looking for -- like the way you'd talk to a friend -- without having to sort through filters." An example prompt is asking for "barrel jeans that aren't too baggy," with the ability to refine by saying: "I want more ankle length." This takes advantage of the 50+ billion product listings in Google's Shopping Graph. Google is leveraging a technique it calls "visual search fan-out," where AI Mode performs a "comprehensive analysis of an image" to recognize "subtle details and secondary objects in addition to the primary subjects." It then "runs multiple queries in the background" to "understand the full visual context and the nuance of your natural language question to deliver highly relevant visual results." The end result is a grid of images. Google says this "breakthrough visual search experience is rooted in our world-class visual understanding of Google Search with Lens and Image search, combined with Gemini 2.5's advanced multimodal and language capabilities." Google says this visual AI Mode is rolling out to US English users starting this week.
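As a rough sketch of what a "visual search fan-out" step could look like in code (a toy illustration under assumptions, not Google's implementation; describe_image and image_search are invented placeholders), an uploaded photo is broken into descriptive attributes, each attribute is paired with the user's question as its own background query, and the per-query results are merged into the final grid of images:

```python
from concurrent.futures import ThreadPoolExecutor


def describe_image(image_path: str) -> list[str]:
    # Placeholder for a vision model that extracts the primary subject,
    # secondary objects, colors, and textures from the uploaded photo.
    return ["barrel-leg jeans", "light-wash denim", "cropped ankle length"]


def image_search(query: str) -> list[str]:
    # Placeholder retrieval call; returns candidate image URLs.
    return [f"https://example.com/images?q={query.replace(' ', '+')}"]


def visual_fan_out(image_path: str, question: str) -> list[str]:
    attributes = describe_image(image_path)
    # One background query per recognized attribute, run in parallel.
    queries = [f"{question} {attribute}" for attribute in attributes]
    with ThreadPoolExecutor() as pool:
        result_lists = list(pool.map(image_search, queries))
    # Merge and de-duplicate into the grid of images shown to the user.
    seen, grid = set(), []
    for results in result_lists:
        for url in results:
            if url not in seen:
                seen.add(url)
                grid.append(url)
    return grid


print(visual_fan_out("outfit.jpg", "show me this style but less baggy"))
```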
[9]
Your Photos May Now Appear on Google's AI Mode
Google is updating its AI-powered search feature, called AI Mode, so that results now include images and photos as well as text. AI Mode launched in the US in May as a text-only tool that could answer questions in natural language. In a blog post published on Tuesday, Google announced that the company has added images to its AI search tool. With this update, AI Mode users looking for ideas or shopping help will also see visual results, giving them photos or generated images alongside written answers. For example, if a user searches for "maximalist bedroom design," AI Mode won't just give them descriptions. It will now show visuals that match the style. They can then refine what they see by asking for darker tones or bolder prints. According to Google, each image links out to a source. Users can also start a search by uploading or taking a photo instead of just typing text. This mix of text and images is meant to make AI searches more useful. Users can upload a picture, combine it with a written prompt, and get results that are closer to what they have in mind. "Sometimes what you're looking for really just can't be articulated with text," Robby Stein, vice president of product management at Google Search, tells CNBC. "If you ask about shopping for shoes, it'll describe shoes when really people want visual inspiration, they want the ability to see what the model might be seeing." Google says the new visual results use several of its tools together: Gemini 2.5, Search, Lens, and Image Search. This lets AI Mode analyze not only the main subject of an image but also small details, then run multiple background searches to match what the user is asking. On a smartphone, an AI Mode user can even tap into a single image and ask follow-up questions about what they see in the photo. "This is really, we think, a breakthrough in what's possible," Stein adds. The new visual results in AI Mode start rolling out this week in English for US users, and should appear for everyone in the coming days, according to Google.
[10]
Google adds eyes to AI Mode with new visual search features
Like a conversational Pinterest with some online shopping thrown in, Google has added a whole new dimension to its Gemini-powered AI search mode with the addition of images to the text and links provided by the conversational search platform. AI Mode now offers elements of Image Search and Google Lens to go with the Gemini AI engine, letting you ask a question about a photo you upload, or see images related to your queries. For instance, you might see someone on the street with an aesthetic you like, snap a picture, and ask AI Mode to "show me this style in lighter shades." Or you could ask for "retro 50s living room designs" and see what people were sitting on and around 75 years ago. Google pitches this feature as a way to replace filters and awkward keywords with natural conversation. The visual facet of AI Mode uses a "visual search fan-out" approach on top of the existing fan-out way of answering questions used by AI Mode. When you upload or point at an image, AI Mode breaks it down into elements like objects, background, color, and texture, and sends multiple internal queries in parallel. That way it comes back with relevant images that aren't restricted to repeating what you've already shared. Then it recombines the results that best match your intent. Of course, that also means Google's search engine must decide which AI-retrieved results to highlight, and when to suppress noise. It may misread your intent, elevate sponsored products, or favor big brands whose images are better optimized for AI. As search becomes more image-centered, sites that lack clean visuals or visual metadata may vanish from the results, making the experience less useful than ever. On the shopping side, all of this taps into Google's Shopping Graph, which indexes over 50 billion products and refreshes every hour. So a picture of a pair of jeans might net you details on current prices, reviews, and local availability all at once. AI Mode turning your vague prompts and visuals into real options for shopping, learning, or even discovering art is a big deal, at least if it performs well. Google's size allows it to meld search, image processing, and e-commerce into one flow. There will be plenty of concerned rivals watching closely, even if they were ahead of Google with similar products. For instance, Pinterest Lens already lets you find similar looks from pictures, and Microsoft's Copilot and Bing visual search let you start from images in some fashion. But few combine a global product database and live price data with conversational searching through images. Specialized apps or niche areas of focus could get ahead, but Google's size and ubiquity mean it will have a massive head start with broader attempts to search for information with images. For years, we've typed queries and parsed results. Now, the direction of online search is toward sensing, pointing, and describing, allowing AI and search engines to interpret not just our words, but also what we see and feel. More aesthetic and design thinking means systems that natively understand visuals are becoming increasingly critical, and AI Mode's evolution suggests the baseline may soon be that search tools should see as well as read. Should missteps creep in, though, all bets are off. If the AI Mode visual results misinterpret intent, mislead users, or show major bias, users may revert to brute-force filtering or more specialized options. The success of this visual AI leap hinges on whether it feels helpful or unreliable.
[11]
AI Mode can now help you search and explore visually
We've all been there: staring at a screen, searching for something you can't quite put into words. Maybe it's a new "vibe" for your apartment or a specific fall coat -- whatever it is, words just don't cut it. But what if you could just show or tell Google what you're thinking and get a rich range of visual results? With a major new update to AI Mode in Google Search, you can now do just that. Starting today, you can ask a question conversationally and get a range of visual results in AI Mode, with the ability to continuously refine your search in the way that's most natural for you. Let's say you're searching for maximalist design inspiration for your bedroom. Now, AI Mode will help you turn your vague idea into a clear vision. You'll see rich visuals that match the vibe you're looking for, and can follow up in whatever way is most natural for you, like asking for more options with dark tones and bold prints. Each image has a link, so you can click out and learn more when something catches your eye. And because the experience is multimodal, you can also start your search by uploading an image or snapping a photo.
[12]
Google's AI Mode Gets Richer Visual Responses
The AI Mode that Google introduced to Search back in March is still getting new features every few weeks, a sign that Google is seeing big adoption and use during search queries. In this latest update, Google is adding more visual responses that can be continuously refined. Google shared an example of these visual responses by having AI Mode "Show me maximalist inspo for my bedroom," which brought back a short text description followed by a number of images that were links to sites with that kind of bedroom design. As a follow-up, they asked for "More with dark tones and bold prints" to refine the search further. It's pretty simple stuff upfront that should get you more imagery and visuals during a search that would probably benefit from that type of return. Google calls this something of a "breakthrough" in visual search experiences, using Search, Lens, and Image search to get these results. Of course, it's also powered by Gemini 2.5 and a "visual search fan-out" technique that allows for deeper understanding, should you start a search with an image. But really, all you need to know is that AI Mode can be more visual now for those times when you'd rather see sweet pics than text.
[13]
Google's "AI Mode" Search Tool Gets a New Upgrade - Phandroid
Google recently announced that it's bringing a new update to AI Mode in Google Search, which will now give users access to a wider range of visual results simply by speaking or showing the platform what they're looking for. Powered by Gemini 2.5's advanced multimodal and language capabilities, the new update uses visual understanding technologies like Lens, and is now rolling out in English for users in the U.S. starting this week. Google says that the new update is built on a new "visual search fan-out" technique, which greatly enhances Google's ability to understand images. This allows AI Mode to perform a comprehensive analysis of a photo by recognizing subtle details and secondary objects in addition to the main subject. By running multiple queries in the background, the system then gains a full grasp of the visual context and the user's natural language question for more relevant results. With the new feature, users can begin a search with specific details (e.g., clothing styles, interior design inspiration), and AI Mode will return more relevant visuals based on a user's query. The search can be continuously refined in a more conversational approach, with back-and-forth dialogue including follow-up requests. As it is multimodal in nature, users can also start a search by uploading or snapping a photo, and any images provided will come with a link to the source. Furthermore, users on mobile devices can search within a specific image and ask conversational follow-up questions about what they are seeing. For shopping, users can describe what they are seeking in a more casual manner, instead of manually setting filters to fit specific criteria. AI Mode will then provide a set of options available on the market, and users can further give additional details on what they're looking for, such as product colour, pricing, and more.
[14]
Google's AI Mode Will Now Show You AI-Generated Visual Results
The feature also enables visual shopping with refined options. Google announced a new feature in AI Mode on Tuesday, which adds visual exploration capabilities alongside the existing text-based experience. As per the Mountain View-based tech giant, this feature will enable users to search for and find something specific that they cannot express in words, courtesy of visual results in AI Mode. They can then further fine-tune the search results with prompts or ask follow-up questions about what's on their screen. With the new feature, users can ask a question conversationally, and AI Mode will provide a range of visual results. Sharing an example, Google says that users can ask for "maximalist design inspiration" for their bedroom. Visual results matching the vibe of what they're looking for will be generated, alongside the usual text results. Each image will carry a link, enabling them to learn more about the result. They can fine-tune the results with natural language, such as asking for more options with darker tones or bolder prints. A similar capability is also coming to shopping experiences. As per Google, it will eliminate the need to sort through filters and get the desired result directly, with visual shopping results. Instead of going through different sorting filters to find the right style, colour, or brand of the product they're searching for, AI Mode will provide a relevant set of shopping options for visual exploration. For example, users can search for a pair of jeans with a prompt like, "barrel jeans that aren't too baggy". They also have the option to refine the original result, such as asking for more ankle length or a different rise and colour. As for how visual results in AI Mode work, Google said that this experience is multimodal, which means it also accepts image-based prompts. AI Mode's visual capabilities are powered by the same multimodal and language capabilities of the Gemini 2.5 large language model (LLM), combined with Google Search's Lens and Image search features. This enables AI Mode to carry out a comprehensive analysis of the image, such as recognising subtle details and secondary objects. To break down queries, the AI feature leverages a query fan-out approach. It breaks them down into subtopics and runs multiple, simultaneous searches for each of them, as per the company. Google said that this helps AI Mode's new feature understand the full visual context and nuance of the natural language question entered by the user, to deliver highly relevant visual results. The visual exploration capabilities in AI Mode are rolling out in English for users in the US this week. There's no word on its availability in other countries or languages. AI Mode, notably, is available on both the website view and via the Google app. It can also be accessed by taking a photo of an object with Google Lens and then redirecting it via AI Mode for a more comprehensive result.
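For readers curious how the query fan-out approach described above might look in code, here is a short, purely illustrative Python sketch (split_into_subtopics and search are assumed placeholders, not Google APIs): the question is split into subtopics, each subtopic is searched concurrently, and the partial results are pooled before an answer is composed.

```python
import asyncio


def split_into_subtopics(question: str) -> list[str]:
    # Stand-in for the model step that breaks a question into subtopics.
    return [
        "maximalist bedroom color palettes",
        "maximalist bedroom furniture",
        "maximalist bedroom wall decor",
    ]


async def search(subtopic: str) -> list[str]:
    # Stand-in for one background search; a real system would hit an index here.
    await asyncio.sleep(0)  # simulate I/O latency
    return [f"result for: {subtopic}"]


async def query_fan_out(question: str) -> list[str]:
    subtopics = split_into_subtopics(question)
    # Run all subtopic searches simultaneously, then pool the partial results.
    partials = await asyncio.gather(*(search(s) for s in subtopics))
    return [item for partial in partials for item in partial]


print(asyncio.run(query_fan_out("maximalist design inspiration for my bedroom")))
```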
[15]
Google Intros Conversational AI Shopping and New Visual Search | PYMNTS.com
Now, users who ask a question conversationally can get a range of visual results, and refine their search in the manner most natural to them, the tech giant said Tuesday (Sept. 30). "Let's say you're searching for maximalist design inspiration for your bedroom. Now, AI Mode will help you turn your vague idea into a clear vision," Google wrote on its blog. "You'll see rich visuals that match the vibe you're looking for, and can follow up in whatever way is most natural for you, like asking for more options with dark tones and bold prints." Each image comes with a link, the post continues, letting users click out and learn more when something gets their attention. The experience is also multimodal, letting users begin their search by uploading an image or snapping a photo. People who want to shop using this new method can describe what they're searching for -- similar to the way they'd speak with a friend -- without having to sort through filters, and Google will show them visual shopping results. "Perhaps you're looking for the best weekend jeans for fall," the blog post said. "Rather than sifting through filters to find the right style, rise, color, size or brand, just say 'barrel jeans that aren't too baggy,' and AI Mode will intelligently provide a relevant set of shoppable options." Research by PYMNTS Intelligence has shown that AI shopping adoption is already gaining traction among younger and middle-aged consumers, with roughly one-third of all respondents saying they have used or would use generative AI for shopping. "Overall, 32% of people surveyed said they used gen AI for shopping," PYMNTS wrote last month. "The use case reporting the highest percentage was work at 40%, followed by creative endeavors and educational purposes." Aside from Google's new offering, other companies entering into this space include Amazon, whose "Buy For Me" feature allows shoppers to order merchandise from other websites while remaining inside Amazon's platform. PayPal has teamed with Perplexity to serve as the embedded checkout option within the AI chatbot, while Visa and Mastercard are rolling out agentic commerce. More recently, OpenAI said it would begin allowing American ChatGPT users to make purchases from Etsy and some Shopify merchants within the chatbot.
[16]
Google Introduces Conversational Features in AI Image Search
Explore visual search seamlessly for products, landmarks, and memories. Capturing life's precious moments through pictures has become second nature. However, with the massive volume of images flooding our feeds, the novelty of taking and editing photos has worn off. Google's latest innovation, with its AI image search and conversational feature, is completely changing how we interact with and discover visual content. This advanced technology simplifies image search, making it more intuitive and personalized than ever before.
[17]
Google's AI Mode expands with visual search capabilities By Investing.com
Investing.com -- Google has launched a major update to AI Mode in Search that enables users to explore and shop visually through conversational interactions. The new feature, rolling out this week in English across the United States, allows users to ask questions conversationally and receive visual results that can be continuously refined through natural language. Users searching for design inspiration can now receive rich visuals matching their described preferences. For example, someone looking for "maximalist design inspiration" for a bedroom can follow up by requesting options with specific elements like dark tones or bold prints. The update also enhances shopping experiences by eliminating the need to sort through multiple filters. Users can describe products conversationally, such as asking for "barrel jeans that aren't too baggy," and receive relevant shoppable options that can be further refined with additional requests. Each image displayed includes a link for users to learn more about items of interest. The system is powered by Google's Shopping Graph, which contains over 50 billion product listings from retailers worldwide, with approximately 2 billion listings refreshed hourly. The technology combines Google's visual understanding capabilities from Lens and Image search with Gemini 2.5's multimodal and language capabilities. Google has implemented a "visual search fan-out" technique that performs comprehensive image analysis, recognizing subtle details and secondary objects beyond primary subjects. On mobile devices, users can also search within specific images by asking follow-up questions about what they're seeing.
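To illustrate the idea of turning a conversational request into structured shopping filters, here is a hedged, self-contained Python sketch. The Shopping Graph is not publicly queryable like this, and Listing, parse_request, and the tiny catalog below are invented stand-ins; the sketch only shows how a phrase like "barrel jeans that aren't too baggy" could be mapped to attribute filters and matched against product listings.

```python
from dataclasses import dataclass


@dataclass
class Listing:
    title: str
    fit: str
    length: str
    retailer_url: str


# A tiny stand-in catalog; the real Shopping Graph holds 50+ billion listings.
CATALOG = [
    Listing("Barrel jeans, relaxed fit", "relaxed", "full", "https://example.com/a"),
    Listing("Barrel jeans, slim barrel", "slim", "ankle", "https://example.com/b"),
    Listing("Wide-leg baggy jeans", "baggy", "full", "https://example.com/c"),
]


def parse_request(text: str) -> dict:
    # Stand-in for the language-model step that turns a conversational request
    # into structured filters ("aren't too baggy" -> exclude baggy fits).
    filters = {"style": "barrel"}
    if "aren't too baggy" in text or "not too baggy" in text:
        filters["exclude_fit"] = "baggy"
    if "ankle length" in text:
        filters["length"] = "ankle"
    return filters


def shop(text: str) -> list[Listing]:
    f = parse_request(text)
    results = [item for item in CATALOG if f["style"] in item.title.lower()]
    if "exclude_fit" in f:
        results = [item for item in results if item.fit != f["exclude_fit"]]
    if "length" in f:
        results = [item for item in results if item.length == f["length"]]
    return results


print(shop("barrel jeans that aren't too baggy, I want more ankle length"))
```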
Google has updated its AI Mode in Search with new visual capabilities and improved shopping features. This update allows users to receive image-based responses and shop more intuitively using natural language queries.
Google has announced a significant update to its AI Mode in Search, introducing new visual capabilities and improved shopping features. This update marks a step forward in Google's efforts to integrate AI more deeply into its core search product, aiming to keep pace with competitors like OpenAI's ChatGPT [1]. The new update allows users to ask questions and receive image-based responses within AI Mode. This feature is designed to provide visual inspiration and make the search process more intuitive. Users can now start their search with text queries or by uploading images, and then refine their results through conversational follow-up questions [2]. Google's "visual search fan-out" technique powers this new capability, analyzing images and running multiple queries in the background to better understand visual context and deliver more nuanced results [3].
The update also introduces a more conversational approach to online shopping. Users can describe products they're looking for in natural language, without relying on specific filters or keywords. For example, shoppers can ask for "barrel jeans that aren't too baggy" and then refine the results by requesting "more ankle length" options [1]. AI Mode will present a set of shoppable options with rich visuals, allowing users to easily visit retailer sites to make purchases [2].
This update represents Google's ongoing effort to transform its traditional search engine into an AI-powered product. As users become more accustomed to intent-based searches with AI synthesizing information, the classic "ten blue links" model is becoming less relevant [1]. However, this shift has raised concerns among publishers who rely on Google Search traffic. Some studies suggest that as more answers are provided through AI, users are less likely to click on links, potentially harming sites that depend on search traffic. Google counters this by arguing that traffic from AI Overviews tends to be of higher quality, with more engaged readers [1].
The new visual search and shopping features in AI Mode are rolling out to English-speaking users in the United States this week. Users can access AI Mode by visiting google.com/ai or by clicking on the AI Mode logo next to the search box on Google's homepage [3]. This update is part of Google's ongoing efforts to compete with AI rivals like OpenAI and meet Wall Street's expectations. As the fourth company to reach a $3 trillion market cap, Google is under pressure to maintain its position as a top player in the AI and search markets [4]. As AI continues to reshape the search landscape, we can expect further innovations and updates from Google and its competitors in the coming months.
Summarized by Navi