6 Sources
[1]
Your entire iPhone screen is now searchable with new Visual Intelligence features
Apple's Visual Intelligence will soon help you search what your camera captures and what is on your screen, including products and events. Apple Intelligence has largely fallen flat among consumers, with delayed deliveries and underwhelming performance; giving Siri context awareness was one of the AI features that Apple failed to deliver. "Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy," said Craig Federighi, Apple's senior vice president of Software Engineering. "Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems." Federighi added that Apple is giving developers access to Apple Intelligence's on-device foundation model. Visual Intelligence already lets you search objects by capturing them with your camera, like you would with Google Lens. With iOS 26, users will be able to search and ask ChatGPT questions about what is displayed on their screen. This makes it easy to look up a pair of shoes you may see in a photo while scrolling Instagram or add events to your calendar by simply using a photo of a poster. When you take a screenshot, the Visual Intelligence capabilities appear. Just as you can edit, mark up, or crop a screenshot, a new option for searching with Visual Intelligence will be available. The AI-powered feature will process what is on your screen to recognize the context and offer options accordingly, including an Add to Calendar function to easily mark your calendar with screenshotted event flyers. Developers will be able to create tools that enable search capabilities using on-screen context awareness through App Intents. Apple also announced other updates to its AI features, including Live Translation in Messages and Phone, integrations with Shortcuts, new languages coming by the end of the year, and improvements to Genmoji and Image Playground.
[2]
My new favorite iOS 26 feature is a supercharged version of Google Lens - and it's easy to use
Apple's Visual Intelligence will soon help you search what your camera captures and what is on your screen, including dates and products. Apple Intelligence has largely fallen flat among consumers, with delayed deliveries and underwhelming performance; giving Siri context awareness was one of the AI features that Apple failed to deliver. However, on Monday, the company announced at WWDC that it is expanding Visual Intelligence to include on-screen awareness, so you can search not only what your iPhone camera captures, but also what is on your screen. "Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy," said Craig Federighi, Apple's senior vice president of Software Engineering. "Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems." Federighi added that Apple is giving developers access to Apple Intelligence's on-device foundation model. Visual Intelligence already lets you search objects by capturing them with your camera, like you would with Google Lens. With iOS 26, users will be able to search and ask ChatGPT questions about what is displayed on their screen. This makes it easy to look up a pair of shoes you may see in a photo while scrolling Instagram or add events to your calendar by simply using a photo of a poster. When you take a screenshot, the Visual Intelligence capabilities appear. Just as you can edit, mark up, or crop a screenshot, a new option for searching with Visual Intelligence will be available. The AI-powered feature will process what is on your screen to recognize the context and offer options accordingly, including an Add to Calendar function to easily mark your calendar with screenshotted event flyers. Developers will be able to create tools that enable search capabilities using on-screen context awareness through App Intents. Apple also announced other updates to its AI features, including Live Translation in Messages and Phone, integrations with Shortcuts, new languages coming by the end of the year, and improvements to Genmoji and Image Playground.
[3]
Apple's Visual Intelligence Is Getting Smarter -- But It's Still Missing the Feature I Really Want
When Apple's senior vice president of software engineering, Craig Federighi, started talking about the Visual Intelligence feature in iOS 26 at WWDC 2025, I hoped for significant changes beyond its existing ability to tell you information about the places and objects you point your camera at on recent iPhones. Instead, we got the somewhat underwhelming news that Visual Intelligence options would soon be available directly in the iOS screenshot interface. I can't deny that these capabilities are practical (if a bit unexciting). But Visual Intelligence still falls short of Google's Gemini Live and Microsoft's Copilot Vision in that it can't converse with you out loud about what you see. This sort of live interactivity isn't necessarily vital, but it does feel exciting and natural to use. The foundation of Visual Intelligence is solid, but I still want Apple to push things forward in a way that aligns with its measured approach to AI.
What Does Visual Intelligence Do Well?
Like many of iOS's best features, Visual Intelligence is a core part of the OS and works seamlessly with its default apps. That means you don't need to open a separate app and upload an image to have the AI analyze it. And the new ability to access the tool whenever you snap a screenshot certainly extends its usefulness. Related options appear on the screenshot interface along the bottom: Ask, which sends the image out to ChatGPT for analysis, or Search, which keeps scans on-device. With the latter, Visual Intelligence can, for example, look for information about an event and create a calendar entry with all the important details. You can also draw over a part of the image to identify it, such as an article of clothing that catches your eye. Visual Intelligence can recognize it and either search it on Google or take you directly to its product page on a shopping app, such as Etsy. Apple is making an API available to app developers so Visual Intelligence can open dedicated apps when it detects relevant content or products.
Falling Short on the Cool Factor
All that said, I still feel like Visual Intelligence is missing a level of interactivity I can get with other tools. On either my Android phone or iPhone, I can converse back and forth with Copilot Vision or Gemini Live about what I'm looking at via the camera app. When I pointed my phone's camera out a motel window recently, for example, Gemini Live identified the tree in the courtyard as an olive tree. I could then continue to ask related questions, such as where the tree species was native. This ability to point my camera at something and simply chat with an AI about it feels orders of magnitude cooler than anything Visual Intelligence currently does. And more importantly, it feels like something I expect an AI assistant to be able to do. I understand that Apple is prioritizing on-device AI, which isn't yet capable of such feats, but it seems like it should be able to develop a similar feature given how much emphasis it puts on the Private Cloud Compute tech. We can only hope the company catches up with its competitors before their AI tools take an even greater leap ahead.
[4]
Visual Intelligence in iOS 26 makes screenshots useful, not just saved
In iOS 26, Apple Intelligence will turn screenshots into a powerful tool for shopping, planning, and asking questions. Here's how. Apple is giving iPhone users a smarter way to interact with what they see on screen. With iOS 26, the company is expanding its Visual Intelligence feature to go beyond photos and the camera app. Now, users can press the screenshot buttons to access a new tool that analyzes whatever is currently on their display. The update brings Visual Intelligence into everyday use. After taking a screenshot, users can ask ChatGPT about what they're seeing, search for similar items on sites like Google or Etsy, or pull event details straight into their calendar. For example, if someone screenshots a concert poster or flight confirmation, Apple Intelligence can automatically extract the date, time and location, then offer to create a calendar event. The goal is to make iPhones more helpful in the moment. Instead of copying text or jumping between apps, users can interact with content directly on screen. Apple says the process happens on device for speed and privacy. Visual Intelligence can also help with online shopping. If a user sees a jacket they like on social media, they can screenshot it and get visual matches from retail sites. It's also possible to highlight just part of the image to refine the search. Highlighting gives users more control and context without needing to retype or search manually. ChatGPT integration is built into the experience, letting users ask natural-language questions about what's on screen. The integration can provide definitions, background information, or even help with understanding forms and documents. The software will be released publicly in the fall of 2025 as a free update for supported devices. Visual Intelligence and other Apple Intelligence features require at least an A17 Pro chip, meaning they are only available on the iPhone 15 Pro, iPhone 15 Pro Max and newer models. A public beta will be available in July through Apple's Beta Software Program. Apple's move to integrate screen-level AI tools is part of a broader push to compete with Android devices such as Google's Pixel and Samsung's Galaxy phones, which already offer similar on-screen help features. Apple's move here feels overdue but smart. Screenshots have quietly become one of the most common ways people save information, especially when it isn't easily copyable. Until now, iOS treated all those images like any other photo. Giving screenshots a brain and a purpose is the kind of quality-of-life upgrade that makes Apple Intelligence feel useful rather than flashy. Instead of bouncing between apps or relying on clunky share sheets, you just take a screenshot and follow your curiosity. It won't be perfect, especially early on. But this is Apple leaning into the idea that the screen itself is the interface, not the app you're in. That shift might end up being more important than any of the AI buzzwords it's wrapped in.
[5]
A small update to Visual Intelligence in iOS 26 makes a giant difference in usability
Looking up AirPods from an image on AppleInsider, using Visual Intelligence.
Apple has made the smallest update to Visual Intelligence in iOS 26, and yet the impact of being able to use it on any image is huge, and at least doubles the usefulness of this one feature. Previously on Apple Intelligence... the iPhone's Visual Intelligence was arguably its most impressive feature. Just by pointing your iPhone's camera at something and pressing the Camera Control button, you could get AI to tell you all about it. This was an extension of Apple's previous ability with the Photos app to identify, say, plants. Where Photos could, say, give you the plant's Latin name, Visual Intelligence could deduce the ingredients in a complicated coffee order. Then, too, Visual Intelligence can in theory recognize dates on an event poster and put the details into your calendar. In practice it gets thrown by certain wild poster designs, and you should never point it at a musician's world tour schedule, but it's still impressive when it works. The thing with all of this, though, was that Visual Intelligence involved pointing your iPhone camera at whatever you were interested in. What Apple has done with iOS 26 is take that step away. Everything else is the same, but you no longer have to use your camera. You can instead deploy Visual Intelligence on anything on your iPhone's screen. This one thing means that researchers can find out more about objects they see on websites. And shoppers can freeze-frame a YouTube video and use Visual Intelligence to track down the bag that influencer is wearing. The catch is that there are now two different ways to use Visual Intelligence, and they require you to do two different things to start them. That's the case even though Apple removed the requirement to use the Camera Control button in 2024. To continue doing what you've been able to do since Visual Intelligence launched, you still have to first point your iPhone at the object you're interested in, and there are several ways to set that up: If your iPhone has a Camera Control button, it is already set up for Visual Intelligence and you need do nothing. Whether it has this or not, though, you can use any of the other options. To add Visual Intelligence to the Action Button, go to Settings, Action Button, and swipe to Controls. Or for Control Center, swipe down to open that, then press and hold on a blank spot. Choose Add a Control and then search for Visual Intelligence. Lastly, you can also activate the original Visual Intelligence from the Lock Screen. At that screen, press and hold until you get the option to Customize, then add the Visual Intelligence control. All of this ends up with you being able to use Visual Intelligence the way it was originally intended, and unfortunately, none of it launches the new version. This is an extra part of Visual Intelligence, not a replacement. So all of the above remains true and also useful, but to get Visual Intelligence to work on anything on screen, you have to do something completely different. When you use your finger to circle, or highlight, an area of the screen, it glows with the new Siri-style animation. You can then tap the Ask button and type a question, or swipe up to see the image search results. Some of those images will be ones for retailers.
When they are, you can tap through to buy the item directly. Apple tried to sell this as being simple, because it uses the same action you do to take regular screenshots. If you're not in the habit of taking screencaps of your iPhone, though, it's another set of buttons to learn. So Visual Intelligence is replete with different ways to use it, one of which provides a very different service from the rest. Yet being able to identify just about anything on your screen is a huge boon. Apple has increased the usefulness of Visual Intelligence simply by removing the step where you point your iPhone camera at anything.
[6]
Apple's 'Visual Intelligence' Can Actually See Your Screen Now
Last year, Apple introduced Visual Intelligence, which uses your camera to translate text, identify objects, or start searches by pointing your phone at something. But if the thing you want to know about is on your phone instead of in front of it, you've been out of luck. Until today's WWDC announcement, that is. The new Visual Intelligence update will let users take a screenshot of what's on their phone and perform searches on the image's content. If you're looking at an object, the feature can search to find where you might be able to buy it online. You can also pull up a ChatGPT box to ask questions about what you're looking at. Perhaps most useful, Visual Intelligence can suggest calendar events from images you screenshot. So, you can remember all those music festivals, conventions, or shows at your local bar that you find on Instagram. Frankly, this might be the coolest, most helpful AI feature I've seen on any platform in a while.
Apple's iOS 26 update introduces significant improvements to Visual Intelligence, allowing users to search and interact with content directly on their iPhone screens, beyond just camera captures.
Apple has announced significant improvements to its Visual Intelligence feature as part of the upcoming iOS 26 update. This AI-powered tool, which previously allowed users to search for objects captured by their iPhone's camera, will now extend its capabilities to the entire screen content [1][2].
The enhanced Visual Intelligence will enable users to search and interact with any content displayed on their iPhone screens. This includes the ability to ask ChatGPT questions about on-screen information, search for products seen in photos, and easily add events to calendars from images or screenshots [1][4].
Craig Federighi, Apple's senior vice president of Software Engineering, stated, "Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems" [1].
A key feature of the update is the integration of Visual Intelligence into the screenshot interface. When users take a screenshot, they will now have options to search or ask questions about the captured content [3]. The AI-powered feature will process the on-screen information to recognize context and offer relevant options, such as adding events to calendars from screenshotted flyers [1][2].
Apple is granting developers access to its on-device foundation model, allowing them to create tools that enable search capabilities using on-screen context awareness through App Intents [1][2]. This move aims to expand the ecosystem of AI-powered applications while maintaining Apple's commitment to on-device processing for speed and privacy [4].
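The sources do not spell out the exact API surface, but App Intents is an existing Swift framework, so one plausible integration point is an intent that exposes an app's own search action, which the system could call with text it extracts from the screen. The sketch below is illustrative only: the Product, ProductCatalog, and ProductSearchIntent names are hypothetical, and the specific hooks Visual Intelligence uses in iOS 26 may differ from this generic App Intents example.

```swift
import AppIntents

// Hypothetical in-app catalog; stands in for whatever search index an app already has.
struct Product {
    let name: String
}

struct ProductCatalog {
    static let shared = ProductCatalog()
    private let items = [Product(name: "Canvas Tote"), Product(name: "Leather Satchel")]

    func search(matching query: String) -> [Product] {
        items.filter { $0.name.localizedCaseInsensitiveContains(query) }
    }
}

// A generic App Intent exposing that search to the system. Siri, Shortcuts, or
// (per Apple's announcement) Visual Intelligence could invoke it with a query;
// exactly how iOS 26 wires this up is not detailed in the sources above.
struct ProductSearchIntent: AppIntent {
    static var title: LocalizedStringResource = "Search Products"

    @Parameter(title: "Query")
    var query: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let matches = ProductCatalog.shared.search(matching: query)
        return .result(value: matches.map(\.name).joined(separator: ", "))
    }
}
```

In this shape the actual lookup stays inside the app's own code, which would fit Apple's stated emphasis on on-device processing.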
While Apple's Visual Intelligence is making strides, some experts note that it still lacks certain features offered by competitors. For instance, unlike Google's Gemini Live and Microsoft's Copilot Vision, Apple's tool cannot yet engage in spoken conversations about what the camera sees [3].
The new Visual Intelligence features will be available as part of iOS 26, set for public release in fall 2025. However, these advanced AI capabilities will require at least an A17 Pro chip, limiting them to the iPhone 15 Pro, iPhone 15 Pro Max, and newer models [4].
This update represents a significant shift in how users interact with their devices. By making screenshots "smart" and actionable, Apple is streamlining the process of saving and using information from various apps and websites [4]. The integration of Visual Intelligence into everyday iPhone use could potentially change how users shop, plan events, and seek information, making the screen itself a more interactive interface [4][5].