The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Thu, 20 Feb, 8:06 AM UTC
18 Sources
[1]
Android's Circle to Search Is Now on iPhone (Kind Of)
You can now use a feature that's very similar to Android's "circle-to-search" on iPhone. While it's not exactly called that, both the Chrome app and the Google app will now allow you to circle text or images to search whatever's in your selection using Google Lens. The limitation? On Android, you can use the circle-to-search feature across your entire phone, but on iPhones, this feature is limited to the Chrome and Google apps. When you've got a webpage loaded in the iPhone Google app, press the three-dot button in the top-right corner and tap Search this Screen. In Chrome for iOS, open any webpage, tap the three-dot button in the bottom-right corner, and select Search Screen with Google Lens. You'll then see a gradient layer over the webpage, where you can either circle, tap, or highlight the part you wish to search. Google will then use Lens to identify what you're looking at and show search results accordingly. I used it to look up house crows, so I could learn more about a bird that's made a habit of sitting at my window and demanding food, and check what it might like to eat (pretty much everything, it turns out). The new feature is quite fast on both Chrome and the Google app, and it handles many different types of queries pretty well. That's to be expected, since Google Lens has been around for a long time and is generally quite dependable. This is really just a new, more intuitive way of selecting subjects to search. I also tried several gestures instead of circling (everything from a tap to drawing an octagon), and all of them worked quite well. If you want to search for one word, tapping or highlighting works best, but for a larger block of text or a part of an image, you can use other gestures. Google does say that you can use any gesture that feels natural, so feel free to use anything that you like. Google also announced that you'll be seeing more AI Overviews in Lens search results.
You won't be able to block these results here because they're in Google's own apps, but there are many ways to block AI in your favorite browser.
[2]
Google Lens adds a cool search trick to iPhones - how to try it
For the past year, Google has offered Android users a cool feature known as Circle to Search, in which you can select an item on the screen to run a search on it. Now, iPhone users can tap into that same capability. In a blog post published Wednesday, the search giant announced a couple of new powers for Google Lens. First up is the Circle to Search option, though, as Android Police points out, Google isn't officially dubbing it that for its debut on the iPhone. Whatever it's called, here's how it works. Open the Chrome or Google app on your iPhone. Browse to a web page that contains text, an image, or other content that you'd like to learn more about. In Chrome, tap the three-dot icon, swipe to the bottom of the screen, and select "Search screen with Google Lens." In the Google app, tap the three-dot icon to the right of the address bar and select "Search this screen." Tap, draw on, or circle the item that has triggered your curiosity. In response, Google takes a snapshot of the content you selected and runs a search on it, showing you the results at the bottom half of the screen. The goal behind the new capability is to save you time and effort. In the past, you'd have to take a photo or screenshot of an item yourself and then run a separate search on it. Like the Circle to Search option on Android, the new feature for iOS performs the same task more quickly and easily. "Whether you're reading an article, shopping for a product, or watching a video, you can use this feature to quickly perform a visual search while browsing, without having to take a screenshot or open a new tab," Google product managers Jenny Blair and Nick Kim Sexton said in the blog post. The AI Overview feature, already available through regular Google searches, is now accessible through the new Lens search capability, both for iOS and Android.
In this case, Google will display an overview with a description and key aspects of the topic. Run the Google Lens search on your iPhone or use the Circle to Search on Android, and then select an item on the screen. Depending on the content, an AI Overview may then appear above the usual search results. The overview might also pop up if you snap a photo using Google Lens. "For many years, Lens has been able to identify billions of different objects -- like plants, products, or landmarks -- by matching against a database of images indexed from the web," Blair and Sexton said. "But now, with help from our advanced AI models, Lens can go much further and provide information on the contents of more novel or unique images. For those kinds of queries, AI Overviews will begin to appear more often in your Lens results, with no need to add a question to your visual search."
[3]
Google Brings Circle to Search to iPhone in the Google and Chrome Apps
Nelson Aguilar is an LA-based tech how-to writer and graduate of UCLA. With more than a decade of experience, he covers Apple and Google and writes on iPhone and Android features, privacy and security settings and more. If you're an iPhone user, you can now use Google's Circle to Search feature -- now dubbed Google Lens on iOS -- which lets you select and search what's on your screen by drawing, highlighting or tapping an object. The AI-powered feature works in both the Google and Chrome apps on iOS. Whether you're reading an article with photos or watching a video, you can now quickly use a gesture to search for what you've seen on Google and get more information via web search or even AI Overviews. Previously, if you wanted to search for what's on your screen, you only had the option to take a screenshot of what you saw and upload it to Google, but obviously this new approach is much quicker and easier. The Google Lens feature is arriving this week to the Google app on iOS, and will be coming soon -- no date specified -- to Chrome's mobile and desktop apps. It will be available to English-language users in countries where AI Overviews are available. If you want to test out the feature on your iPhone, here's everything you need to know. If you're in the Chrome app or the Google app and see an object that you want to learn more about, tap the three-dot menu and then select "Search Screen with Google Lens" or "Search this Screen," respectively. The screen will sparkle all over, with the words Google Lens at the top of the screen. You can now use whatever gesture you're most comfortable with: Draw around the object, tap on an object or highlight any text. Once you do, a web search window appears at the bottom, giving you more information about what you've just searched for, as well as any visual matches.
If you want to add more context to your search to better refine it, type into the search bar that says "Add to search." For example, if you use Circle to Search to highlight a pizza to find recipes for it online, you can add something like "vegetarian recipe" to refine your search. In addition to this feature, Google will soon let iPhone users use the camera icon in Google Search to snap a photo and get an AI overview so that you can quickly get more information about whatever you're looking at, whether it's a car, building or statue.
[4]
Chrome and Google iPhone Apps Are Getting Circle-To-Search
One of the most recent, and useful, improvements we've seen on Android phones is the addition of Circle to Search, which lets you circle anything on your screen to search it on the web. Now, thanks to Google Lens, iPhones are getting pretty close to that. Google has announced two big updates to its visual search tool, Google Lens, for iPhone users. The most important one is the fact that a new screen search feature is being introduced within both the Chrome and Google apps. This eliminates the need for screenshots or opening new tabs when conducting a visual search. Users can now select any on-screen element -- text, image, or video content -- by drawing, highlighting, or tapping. Within Chrome, the feature is accessed via the three-dot menu, with a dedicated Lens icon soon to be added to the address bar, mirroring the desktop functionality introduced last year. The Google app offers a similar process, initiated through the three-dot menu and a "Search this Screen" option. The feature itself is very reminiscent of what we've seen on Android with Circle to Search, with some caveats. Since it's not Android and the operating system is not controlled by Google, we, of course, don't have systemwide access to this feature. Instead, it's only available within Chrome and the Google app. That's probably good enough if you need to look up something else while browsing the web, but it does mean that you won't get access to this in other apps. This is not up to Google, though -- this is probably the best implementation the company could come up with given that it's iOS. The second major update involves a greater integration of AI Overviews within Lens results. Previously, Lens primarily identified objects by matching them against a database of known images. Now, with the help of advanced AI models, Lens can provide information on more unique or unusual images.
When a user searches for something less common, Lens will display an AI-powered overview, providing a quick summary and links to further resources. The AI does not require the user to add any question to a visual search. This, again, brings this more in line with what Circle to Search provides -- you could already enjoy this functionality by circling something on your Android phone that Google might not have in its database. This feature is rolling out to iPhone users from next week. Source: Google
[5]
Google Chrome is adding a Circle to Search-like feature for iPhones
Jess Weatherbed is a news writer focused on creative industries, computing, and internet culture. Jess started her career at TechRadar, covering news and hardware reviews. Google is rolling out new search gestures that allow iPhone users to highlight anything on their screen to quickly search for it. The Lens screen-searching feature is available on iOS in both the Google app and Chrome browser and provides a similar experience to Android's Circle to Search, which isn't supported on iPhones. The new Lens gestures allow iPhone users to search for anything in the Google app or Chrome by drawing, highlighting, or tapping on it. The feature works across text, images, and videos, without having to take a screenshot or open a new tab. An obvious use case is finding shopping results based on images of products you like, but Lens can also define words and phrases; identify locations, plants, and animals; and perform almost any request that Google Search can. The functionality is essentially the same as Circle to Search, though the Android version can be used across your entire phone instead of just the two Google apps. Not every Android device supports it, however, as it's mostly limited to recent flagships. To use the new Lens gestures, iPhone users need to open the three-dot menu within the Google or Chrome apps and select "Search Screen with Google Lens." You can then use "any gesture that feels natural" to highlight what you want to search. Google says a new Lens icon for quickly accessing the feature will also be added to the address bar "in the coming months." AI Overviews are also expanding to more Lens search results, which means you'll sometimes see AI-regurgitated summaries and URLs when using the image search tool. Google frustratingly doesn't let you disable the AI Overviews feature. There are a few ways around it that help to avoid the wall of text that appears before your actual search results, but it's unclear if these solutions will work with Lens.
[6]
Google brings a 'Circle to Search'-like feature to iPhone users | TechCrunch
Google on Wednesday is rolling out a new update that lets users search what's on their screen via a simple gesture when browsing within the Google Chrome or Google Search app on iOS. The feature is similar to Android's built-in "Circle to Search" feature, which also lets users search what's on their screen using a variety of gestures. At launch, iPhone users will also be able to search what's on their screen by drawing, highlighting, or tapping something, via Google Lens. For instance, if you're reading an article and come across an image of an interesting art piece, you can now use Lens to quickly circle or tap on the image to learn more about it. Or, if you're watching a video and see an item you like, you can use Lens to find something similar by circling it. The idea behind the feature is to allow users to quickly perform a visual search while browsing without having to take a screenshot or open a new tab, but it also gives users another way to kick off a traditional web search -- an area of its business that could be impacted by the adoption of AI technology over the long term. After you highlight or tap something on your mobile device's screen, you'll be shown visual matches and other related results. You can then tap the "Add to your search" option to refine by color, brand, or another detail. You can also ask a follow-up question to learn even more about a topic. To access the new feature in the Chrome or Google app, you need to open the three-dot menu and select "Search Screen with Google Lens." In the coming months, you will be able to access the feature through a new Lens icon in the address bar, Google says. The update will continue to roll out this week and will be available globally on iOS. In addition, Google announced that it's expanding AI Overviews, which display a snapshot of information at the top of the results page, to more of its Google Lens search results.
In the past, Google displayed AI Overviews in Lens searches that included both images and text. With this latest update, users will begin to see AI Overviews without adding additional text or questions to their searches. For instance, if you come across an interesting-looking car, you can take a picture of it and then get an AI Overview to help you quickly learn more about what you're looking at, and get links to helpful resources on the web. This update is rolling out this week for English-language users in countries where AI Overviews are available, starting with the Google app for Android and iOS. The update will soon roll out to Chrome on desktop and mobile devices, the company says.
[7]
Google Chrome on iPhone gets circle to search-like feature: How it works
Google is making it easier to search for things directly from your screen with a new Google Lens update for iPhones. This feature, similar to Android's circle to search, allows users to perform visual searches without taking a screenshot or opening a new tab. Whether you're reading an article, shopping online, or watching a video, you can now search instantly using simple gestures like drawing, highlighting, or tapping on the screen. "If you have an iPhone, you'll find a new Lens option that lets you select and search what's on your screen within Chrome or the Google app, using whatever gesture comes naturally," Google announced in a blog post. In Chrome, this feature can be accessed through the three-dot menu by selecting "Search Screen with Google Lens." In the coming months, Google will also introduce a Lens icon in the address bar, making it even easier to use, just like on Chrome for desktops. The same functionality is available in the Google app for iOS. By tapping the three-dot menu and selecting "Search this Screen," users can highlight or tap on anything they want to search. This update eliminates the need for extra steps, allowing for a seamless search experience without leaving the current app. Google Lens has been identifying objects like plants, products, and landmarks for years. Now, with advanced AI models, it can analyze and provide information on more unique and unfamiliar images. Instead of requiring users to type a question, AI Overviews will now appear more often in Lens results, offering detailed explanations automatically.
For example, if you see a car with an unusual texture on its hood and want to know more about it, you can tap the camera icon in the Search bar, snap a photo with Lens, and receive an AI Overview with insights and relevant web links. This update is rolling out this week for English-language users in regions where AI Overviews are supported. It will first be available on the Google app for Android and iOS, followed by Chrome for desktop and mobile devices.
[8]
Google Lens for iPhones brings Android's Circle to Search in all but name
Summary: iOS users can now search what's on their screen directly within Chrome and the Google app using Lens, eliminating the need for screenshots. The new feature mimics the CtS experience on Android, albeit without the trigger and the branding. Google also announced that users should start seeing more AI Overviews within their Lens search queries in the near future. Google Lens is a standout tool that has taken a serious hit since Circle to Search (CtS) came around. The latter is a tool that offers an undoubtedly more streamlined experience, especially since it is easier to use and can be triggered on any screen on Android devices. However, Lens is still the more dominant tool when it comes to real-world searches, and that is mainly why the tool still averages over 20 billion visual searches per month, as highlighted by Google. Now, in a bid to help the tool retain some of its on-screen search dominance, Lens seems to be focusing on iOS -- where Circle to Search isn't an option. As highlighted by the tech giant in a blog post today, it is introducing a new Search Screen with Google Lens tool for Chrome and the Google app on iOS, essentially mimicking CtS without the branding and the trigger. Starting this week, iOS users globally will be able to access 'Search Screen with Google Lens' via the three-dot menu on both the Google app and Chrome, essentially eliminating the need for screenshots when attempting to analyze on-screen context via Google Lens.
The tool's UI is identical to CtS on Android, and it will allow iOS users to circle, highlight, or tap on-screen elements to run a Google Search on them -- complete with the Add to search option seen on CtS. Further, in the coming months, a Search Screen with Google Lens icon will start appearing within the address bar on Chrome and the Google app, similar to the feature's implementation on Chrome on the web. Elsewhere, get ready to see more AI Overviews in your Lens search results, on both Android and iOS. "Perhaps you come across an interesting-looking car and want to learn more about the strange texture on its hood. Just tap the camera icon in the Search bar to snap a photo with Lens. You'll get an AI Overview to help you quickly make sense of what you're looking at, along with links to helpful resources on the web," wrote the tech giant. This is rolling out now for all Google app users in countries where AI Overviews are available. The functionality will extend to Lens searches via Chrome on desktop and mobile "soon."
[9]
Google Lens just got an upgrade on your iPhone. Here's how it works
If you use the Chrome or Google apps on your iPhone, there's now a new way to quickly find information based on whatever is on your screen. If it works well, it could end up saving you time and making your searches a little bit easier. The update concerns Google Lens, which lets you search using images rather than words. Google says you can now use a gesture to select something on-screen and then search for it. You can draw around an object, for example, or tap it to select it. It works whether you're reading an article, shopping for something new, or watching a video, as Google explains. The best iPhones have had a similar feature for a while, but it's always been an unofficial workaround that required using the Action button and the Shortcuts app. Now, it's a built-in feature in some of the most popular iOS apps available. Both the Chrome and Google apps on iOS already have Google Lens built in, but the past implementation was a little clunkier than today's update. Before, you had to save an image or take a screenshot, and then upload it to Google Lens. That would potentially involve using multiple apps and was much more of a hassle. Now, a quick gesture is all it takes. When you're using the Chrome or Google apps, tap the three-dot menu button, then select Search Screen with Google Lens or Search this Screen, respectively. This will put a colored overlay on top of the web page you're currently viewing. You'll see a box at the bottom of your screen reading, "Circle or tap anywhere to search." You can now use a gesture to select an item on-screen. Doing so will automatically search for the selected object using Google Lens. The new gesture feature will roll out globally this week and will be available in the Chrome and Google apps on iOS. Google also confirmed it will add a new Lens icon in the app's address bar in the future, which will give you another way to use gestures in Google Lens.
Google added that it is also leveraging artificial intelligence (AI) to add new abilities to Lens. This will let it look up more novel or unique subjects, and doing so will mean Google's AI Overviews appear more frequently in your results. This feature will also be rolled out this week and is coming to English-language users in nations where AI Overviews are available. For now, it's set to arrive in the Google app for Android and iOS first, with Chrome desktop and mobile availability arriving later.
[10]
Google Lens for iPhone now lets you draw to do visual searches
Google is introducing two small but meaningful enhancements to its Lens technology. To start, Chrome and Google app users on iPhone can now draw, highlight or tap on text and images to carry out a visual search of what they see in front of them. If this sounds familiar, it's because Google is basically bringing to the iPhone an interface paradigm it debuted last year with Circle to Search on Android. While the implementation is different and more limited due to the constraints of iOS, the idea is the same: Google wants to save you the trouble of opening a new Chrome tab or saving a screenshot when you want to find more information about an image you see. For now, Google says you can access the new feature, whether you're using Chrome or the Google app, by opening the three-dot menu and selecting "Search Screen with Google Lens." In the future, the company will add a dedicated Lens shortcut to the address bar in Chrome. Separately, the next time you use Lens, you'll be more likely to encounter Google's AI Overviews, particularly when you use the software to find information on more unique or novel images. In those instances, you won't need to prompt Lens with a question about the image you just snapped for the software to try to offer a helpful explanation of what you're seeing. Instead, it will do that automatically. Ahead of today's announcement, Harsh Kharbanda, director of product management for Google Lens, gave me a preview of the feature. Kharbanda used Lens to scan a photo of a car with an unusual surface on its hood. An AI Overview automatically popped up explaining that the car had a carbon vinyl wrap, which it further said people use for both protection and to give their rides a more sporty appearance.
According to Kharbanda, Google will roll out this update to all English-language users in countries where AI Overviews are available, with the feature first appearing in the Google app for Android and iOS, and arriving soon on Chrome for desktop and mobile devices.
[11]
Google Lens Update Lets You 'Search Your Screen' on iPhones
Google is rolling out a new search experience this week for iOS, which lets iPhone users look up information about a portion of their screen with AI. Google Lens can already perform searches with photos. This new functionality adds the ability to search a visual on the screen -- not from a new photo. It's helpful if you're reading about one topic, but something else pops up on the screen that you want a quick explanation of without having to start a disruptive new search. "Whether you're reading an article, shopping for a product, or watching a video, you can use this feature to quickly perform a visual search while browsing, without having to take a screenshot or open a new tab," says Google. It's only available within the Chrome and Google apps on iOS, and it's a part of the Lens functionality. To try it, tap the three-dot menu in the Chrome or Google app and select "Search this Screen." Then, you can highlight a portion of the screen by "using whatever gesture comes naturally," such as drawing a circle around it or tapping it. It could be considered a limited version of Google's Circle to Search feature, which debuted in January 2024 and is available on select Android devices. That feature allows you to search what's on the screen within any app, not just the Chrome or Google apps. So iPhone users are getting a taste of the Android life, but only if they choose to ditch Safari and use the Chrome or Google app. Thanks to Google's "advanced AI models," Lens can provide more information about more images, even obscure ones, than ever. It surfaces the information in the same format as the AI Overviews that have lived at the top of the typical Google search results page since last May. Like AI Overviews, it's only available for English-language users and in eligible countries. "Perhaps you come across an interesting-looking car and want to learn more about the strange texture on its hood," Google says. "Just tap the camera icon in the Search bar to snap a photo with Lens.
You'll get an AI Overview to help you quickly make sense of what you're looking at, along with links to helpful resources on the web." The latest image search capabilities from Apple came with December's iOS 18.2 update. Now, iPhone 16 owners can press the Camera Control button, snap a picture of an object, and look up information about it with Apple Intelligence.
[12]
Google rolls out visual search on iPhones, but curiously ditches 'Circle to Search' branding
Unlike the Android version, where users can long-press the navigation bar or home button to activate Circle to Search, iPhone users will be able to use Google Lens to search what's on their screen -- right within the Google app or Chrome. The feature allows iPhone users to select and search specific content from their screen using simple gestures such as tapping, highlighting, or drawing. Google avoids using the term "circle" in its press release, probably because Apple doesn't want to directly associate its devices with an AI feature that is so closely linked to its competitors' products. It could also be the other way around and Google possibly wants to keep the Circle to Search branding limited to the Android ecosystem. Then again, the limited scope of the feature on iOS could also be why it doesn't get the famous moniker.
[13]
Google Lens powering new 'Screen Search' in Chrome for iOS
Google is more deeply integrating Lens visual search into Chrome for iOS with a new "Screen Search" capability that takes after Circle to Search on Android. Meanwhile, Google Lens can now recognize more unique images. Chrome on the iPhone and iPad already has an instance of Lens that lets you take a picture or analyze a screenshot. You can now "Search Screen with Google Lens" as you're browsing the web or watching a video. Instead of manually taking a screenshot, you'll find that new shortcut in the three-dot overflow menu. A shimmer will appear over the screen and you'll be prompted to "Circle or tap anywhere to search." (You can also highlight text.) Web results, including AI Overviews, appear in a bottom sheet, with pages opening in a new tab. This is essentially Android's Circle to Search capability, and it's also coming to the Google app on iOS as "Search this Screen" in the three-dot menu (top-right corner). Google Lens Screen Search is rolling out now with Chrome 133 for iOS and the latest version of the Google (Search) app. Meanwhile, Google credits "advanced AI models" as allowing Lens to "go much further and provide information on the contents of more novel or unique images." Results will take the form of AI Overviews, with the expanded recognition meaning that users don't need to manually append a question. Perhaps you come across an interesting-looking car and want to learn more about the strange texture on its hood. Just tap the camera icon in the Search bar to snap a photo with Lens. You'll get an AI Overview to help you quickly make sense of what you're looking at, along with links to helpful resources on the web. Expanded recognition is rolling out this week for "English-language users in countries where AI Overviews are available." You'll first see this with the Google app on Android and iOS before it comes to the desktop and mobile Chrome versions.
[14]
Google Chrome for iOS Will Now Let You Search the Screen via Lens
Google Chrome and the Google app for iOS are getting a new visual lookup feature. Announced on Wednesday, the new feature uses Google Lens to let users run visual queries on whatever is visible on the screen. The new search feature works similarly to Circle to Search, which uses artificial intelligence (AI) to let users perform several actions such as looking up elements on the screen, translating text, and even identifying a song playing on the device. The Mountain View-based tech giant highlighted that the new feature will also support AI Overviews. In a blog post, the tech giant claimed that Google Lens is used for more than 20 billion visual searches every month. Now the company is expanding its functionality and offering iPhone users new ways to use the tool. Google Chrome for iOS is now being upgraded with the Google Lens-powered visual search feature. This is being called "Search Screen with Google Lens". The feature works across the app and supports all web pages. The tech giant said users can draw, highlight, or tap on an object to instantly run a visual lookup. This will eliminate the need to screenshot pages and open the Lens app separately to search. To use the feature, users will need to tap the three-dot menu at the bottom right corner of the browser. There, they will be able to see the new option in the listed items. By selecting "Search Screen with Google Lens", users can highlight any part of the screen and a bottom sheet of Google Search will pop up showing the information. The same feature is also being added to the Google app for iOS. After entering a web page via Search, users can tap the three-dot menu on the top right to find the "Search this Screen" option and quickly run a visual search. Notably, the company highlighted that AI Overviews will be automatically activated whenever a relevant query is made via Lens. This will allow users to quickly find the information without the need to scroll or tap on a link.
[15]
Google rolls out 'Screen Search' in Chrome and Google apps on iOS
Google has announced that it's bringing a new feature to iPhone that lets users search quickly while browsing the web or watching videos. Screen Search is, effectively, the iOS version of Circle to Search, which was released to select premium Android smartphones over a year ago. The feature is designed to give users a much more convenient way to search content they're actively engaging with: instead of switching screens to open a search engine, users can simply circle the thing they're interested in to launch a Google search.

The feature is available under the three-dot menu. Tapping "Search Screen with Google Lens" (in Chrome) or "Search this Screen" (in the Google app) will let you circle, tap, or scribble on whatever you're interested in. The feature is set to roll out this week for both the Chrome app and the Google app on iPhone. While Google doesn't say which iPhones will support it, the Google app currently requires iOS 15 and the Chrome app requires iOS 16.

Google also notes it's expanding AI Overviews to more Lens search results. Users can now tap the camera button in the search bar to take a photo with Lens. From there, an AI Overview will pop up, giving you a summary of whatever you're looking at, along with links to helpful resources across the web. The AI Overview expansion is set to roll out this week for English-language users in countries where AI Overviews are available. It will arrive first in the Google app on iOS and Android, and later expand to Chrome on desktop and mobile.
[16]
Google Lens just got smarter: new iPhone feature lets you search instantly
Google is rolling out two major updates to Google Lens that make searching what you see easier than ever, especially for iPhone users. The new features, available in both the Chrome app and the Google app for iOS, allow users to search their screens instantly with a simple gesture. Additionally, AI-powered search results are expanding to make Google Lens even more intuitive and capable of analyzing unique or novel images.

The instant screen search update is rolling out globally on iOS this week. If you use Chrome or the Google app on your iPhone, you can now quickly search for information without taking a screenshot or opening a new tab. Instead of manually copying and pasting text or performing a separate search, Google Lens lets you highlight, draw over, or tap on any part of your screen to start a search instantly. In Chrome, open the three-dot menu and select "Search Screen with Google Lens." In the Google app, tap the three-dot menu and select "Search this Screen." This streamlined process lets users look up unfamiliar words, products, places, or even video content in seconds, whether they're shopping, reading an article, or watching a video.

Google also announced that a Lens icon will be added to Chrome's address bar in the coming months, similar to the Lens integration that launched in desktop Chrome last summer, making the tool even more convenient to reach.

Google Lens has long been able to identify billions of objects, such as plants, products, and landmarks, by matching images against its extensive database. With the latest AI advancements, however, Lens can now analyze and provide deeper context for more unique images that may not have an obvious direct match. For example, if you come across a car with an unusual hood texture and want to learn more, you can tap the camera icon in the Search bar to snap a photo with Lens.
Instead of just matching the image to similar ones online, Google will now generate an AI Overview: a brief explanation of what you're looking at, accompanied by links to relevant resources. This means users no longer have to manually enter a search query to get meaningful insights. This AI-powered enhancement is rolling out to English-language users in countries where AI Overviews are available, starting with the Google app on Android and iOS and expanding soon to Chrome on desktop and mobile.

These updates represent a significant step forward in how we interact with visual search. The ability to instantly search your screen on iOS makes Google Lens far more integrated into everyday browsing, eliminating cumbersome workarounds like screenshots, while AI Overviews allow for a deeper understanding of unfamiliar objects, places, and concepts. Want to try it out? Update your Chrome or Google app on iOS and start searching your screen today.
[17]
Google Lens Screen Search Comes to iOS Chrome and Google App
Google is rolling out new tricks for the Lens visual search feature within its Chrome and Google apps on iOS, allowing users to search content directly from their screen without taking screenshots or opening new tabs.

In Chrome for iOS, users can access the new Lens functionality through the three-dot menu by selecting "Search Screen with Google Lens." Google says it plans to streamline this process in the coming months by adding a dedicated Lens icon to the browser's address bar, similar to the desktop Chrome implementation launched last year. The Google app for iOS is getting similar functionality: users can initiate screen searches by tapping the three-dot menu and selecting "Search this Screen." The feature supports various interaction methods, including drawing, highlighting, or tapping to select content for visual searches while browsing articles, shopping, or watching videos.

Google is also expanding its AI-powered search capabilities within Lens. The company says it is introducing AI Overviews, which will appear more often in Lens results without requiring users to formulate specific questions. As a result, Lens should be able to provide detailed information about novel or unique images beyond its existing object recognition capabilities.

The screen search features are rolling out globally this week for both Chrome and the Google app on iOS. The AI Overviews expansion will initially be available to English-language users in supported regions through the Google app for Android and iOS, with Chrome desktop and mobile support coming soon.
[18]
Google expands Lens integration with AI Overviews across apps and devices
Google has introduced two updates for Google Lens that simplify visual searches across devices and apps; as the company put it, it's now "even easier to search what you see."

For iPhone users, Google added a new Lens feature in Chrome and the Google app that lets you search items on your screen by drawing, highlighting, or tapping. Whether you're reading an article, shopping, or watching a video, you can launch a visual search without taking a screenshot or opening a new tab. In Chrome, access it by tapping the three-dot menu and selecting "Search Screen with Google Lens." In the coming months, a Lens icon will also appear in the address bar, similar to the desktop version Google rolled out last summer. In the Google app, go to the three-dot menu, select "Search this Screen," and choose what to search.

Google also enhanced Lens with advanced AI models that can analyze unique or novel images, going beyond its database of billions of objects like plants, products, and landmarks. Lens now delivers detailed information, often including AI Overviews in the results, without requiring a question. For instance, photograph a car's unusual hood texture using the Search bar's camera icon, and Lens provides an AI Overview alongside web links.
Google introduces a new search gesture feature for iPhones, similar to Android's Circle to Search, allowing users to quickly search for on-screen content using Google Lens in the Chrome and Google apps.
Google has announced a significant update to its visual search tool, Google Lens, bringing a feature similar to Android's Circle to Search to iPhone users. The new functionality allows users to quickly search for on-screen content within the Google and Chrome apps, marking a notable advancement in mobile search capabilities.
The new feature enables iPhone users to search for any content on their screen using intuitive gestures: they can circle, draw over, highlight, or tap on whatever they want to look up.
Google Lens then processes the selection and provides search results or additional information about the selected item.
While this update brings a powerful search tool to iOS users, there are some limitations compared to its Android counterpart: on Android, Circle to Search works across the entire phone, whereas on iPhone the feature is confined to the Chrome and Google apps.
Google has announced plans to further integrate this feature: a Lens icon will be added to Chrome's address bar in the coming months, and AI Overviews will appear in more Lens search results.
This update represents a significant step in bridging the gap between Android and iOS search capabilities. It offers iPhone users a more intuitive and efficient way to interact with on-screen content, potentially changing how mobile users engage with information and conduct searches.
As AI continues to play a larger role in search, users can expect more advanced features and improved accuracy in visual and contextual searches across platforms.