Curated by THEOUTPOST
On Tue, 4 Mar, 12:03 AM UTC
5 Sources
[1]
iOS 18.4 adds a crucial Apple Intelligence feature to the iPhone 15 Pro -- and it makes your phone more powerful
Here's how to set up Visual Intelligence on an iPhone 15 Pro -- and what you can do with it

I have an iPhone 15 Pro in my possession, which means I've had access to most Apple Intelligence tools since Apple launched its suite of AI features with the iOS 18.1 release last fall. Most tools, but not all.

When Apple launched the iPhone 16 lineup last year, it also announced a new feature called Visual Intelligence. With the Camera Control button on those iPhone 16 models, you could summon Apple's answer to Google Lens and get more information -- and even a handful of actionable commands -- based on whatever you were pointing your phone's camera at.

Though the iPhone 15 Pro and iPhone 15 Pro Max have enough RAM and processing power to run Apple Intelligence features, Visual Intelligence has not been one of them. That means someone's $799 iPhone 16 could do something the phone you paid at least $999 for couldn't. And when the $599 iPhone 16e debuted last month, we learned that it, too, could access Visual Intelligence while iPhone 15 Pro and Pro Max owners remained shut out. Why, the very idea!

That's changed, though, with the arrival of the second public beta of iOS 18.4. If you're trying out that beta on an iPhone 15 Pro, you've now gained the ability to run Visual Intelligence. And while that's not necessarily a game-changing addition, it does give your older iPhone new powers it didn't have previously. And some of those powers are proving to be quite useful.

Here's a quick rundown of how iPhone 15 Pro and iPhone 15 Pro Max users can set up their devices to take advantage of Visual Intelligence, along with a reminder of just what you can use that AI-powered feature to do.

If you want to use Visual Intelligence on your iPhone 15 Pro, you'll need to find a way to launch the feature, since only iPhone 16 models come with a Camera Control button. Fortunately, you've got two other options, thanks to the iOS 18.4 update.

iOS 18.4 adds Visual Intelligence as an option for the Action button, so you can use that button on the left side of your iPhone to trigger Visual Intelligence. To set it up, head to Settings, select Action Button and swipe through the available actions until you reach Visual Intelligence. From that point, whenever you press and hold the Action button, it will launch Visual Intelligence.

If you don't want to tie up your Action button with Visual Intelligence, iOS 18.4 also adds a Visual Intelligence control that you can place in one of the customizable shortcut slots at the bottom of your lock screen. To add that control, edit your lock screen and select the shortcut you want to customize. (In our case, that's the one in the bottom left corner.) Select Visual Intelligence from the available options -- you'll find it under the Apple Intelligence & Siri controls, though you can also use the Search bar at the top of the screen to track down the control. Tap the icon to add it to your lock screen.

So your iPhone 15 Pro is now set up to launch Visual Intelligence, either from the Action button or a lock screen shortcut. What can you do with this feature? Essentially, Visual Intelligence turns the iPhone's camera into a search tool. We have step-by-step instructions on how to use Visual Intelligence, but if your experience is like mine, you'll find things very intuitive. Once Visual Intelligence launches, point your camera at the thing you want to look up -- it could be a business's sign, a poster or just about anything.
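As a brief aside for developers: actions that users can assign to the Action button are generally exposed through Apple's App Intents framework. Here's a minimal, hypothetical sketch of what that looks like for a third-party app. Visual Intelligence itself is a built-in system action, so treat this as an analogy, not Apple's implementation; the OpenScannerIntent name and behavior are invented for illustration.

```swift
import AppIntents

// Minimal sketch: how a third-party app exposes an action that users can assign
// to the Action button. Visual Intelligence is a system action, so this is only
// an analogy; "OpenScannerIntent" is hypothetical.
struct OpenScannerIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Scanner"
    static var description = IntentDescription("Opens the app's camera-based scanner.")

    // Bring the app to the foreground when the Action button fires this intent.
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // App-specific navigation to the scanner screen would go here.
        return .result()
    }
}
```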
The information that appears on your screen varies depending on what you point at. The iOS 18.3 update that arrived last month added the ability to identify plants and animals, for example. A restaurant facade might produce the hours the place is open, and you can also collect phone numbers and URLs by capturing them with Visual Intelligence.

I captured a poster for an upcoming event with my iPhone 15 Pro, and Visual Intelligence gave me the option of creating a calendar event with the date and time already filled in. It would be nice if the location were copied over, too, since that information was also on the poster, but we'll chalk that up to these being early days for Visual Intelligence.

Visual Intelligence can also get flummoxed in situations like these. When I tried to add a specific soccer match to my calendar from a schedule listing multiple dates, Visual Intelligence got confused as to which one to pick. (It seems to default to the date at the top of the list.) Having to edit incorrect data defeats most of the purpose of this particular capability, but you'd expect Apple to expand Visual Intelligence's bag of tricks over time.

You have two other options for expanding on the info Visual Intelligence gives you. If you've enabled ChatGPT in Apple Intelligence, you can share the information with ChatGPT by selecting the Ask button, or you can tap Search to run a Google search on the image you've collected.

Of those two options, ChatGPT seems to be the more fully featured in my experience. When I captured a recipe for a bean dip, ChatGPT initially summarized the article, but by asking a follow-up question, I could get the chatbot to list the ingredients and the steps, which I could then copy and paste into the Notes app on my own. To me, that's a lot more handy than having to pinch and zoom on a photo of a recipe or, worse, transcribing things myself.

Google searches of Visual Intelligence image captures can be a lot more hit and miss. A photo of a restaurant marquee near me produced search results with similarly named restaurants, but not the actual restaurant I was in front of. Google Search did a much better job when I took a picture of a book cover via Visual Intelligence, and the subsequent search results produced reviews of the book from various sites. That could really be useful the next time I'm in a book store -- it's a place that sells printed volumes, youngsters; ask your parents -- and I want to know if the book I'm thinking of buying is actually as good as its cover.

That's been my experience with Visual Intelligence so far, but my colleagues have been using it since it came out last year as everything from a virtual guide in an art museum to a navigation tool for getting out of a corn maze. If you've got an iPhone 15 Pro, you can now try out your own uses for Visual Intelligence.
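If you're curious how the poster-to-calendar trick works in principle, pulling dates, phone numbers, and URLs out of recognized text is something Foundation's NSDataDetector has handled for years. Here's a rough sketch of that idea -- not Apple's actual Visual Intelligence pipeline, just the kind of data detection that could sit behind it:

```swift
import Foundation

// Sketch: mine recognized text for dates, links, and phone numbers, similar in
// spirit to what Visual Intelligence surfaces from a poster. This uses the
// long-standing NSDataDetector API, not Apple's actual pipeline.
func detectActionableItems(in text: String) throws {
    let types: NSTextCheckingResult.CheckingType = [.date, .link, .phoneNumber]
    let detector = try NSDataDetector(types: types.rawValue)
    let range = NSRange(text.startIndex..., in: text)

    detector.enumerateMatches(in: text, options: [], range: range) { match, _, _ in
        guard let match else { return }
        switch match.resultType {
        case .date: print("Date:", match.date ?? "?")          // e.g. calendar event candidate
        case .link: print("URL:", match.url ?? "?")            // e.g. open in Safari
        case .phoneNumber: print("Phone:", match.phoneNumber ?? "?") // e.g. tap to call
        default: break
        }
    }
}
```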
[2]
iOS 18.4 beta 2 brings this AI feature to older iPhones, expands access on new models - 9to5Mac
Apple has brought its Apple Intelligence feature called Visual Intelligence to the iPhone 15 Pro and iPhone 15 Pro Max. iOS 18.4 beta 2 is the first software update to bring the Google and ChatGPT-connected visual search feature to a model outside of the iPhone 16 lineup.

Visual Intelligence was initially only invoked through a long-press of the Camera Control on iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max. However, Apple recently introduced the iPhone 16e without a Camera Control button. For this reason, Apple brought Visual Intelligence to the Action button and Control Center.

Starting with iOS 18.4 beta 2, Visual Intelligence works on iPhone 15 Pro and iPhone 15 Pro Max for the first time. These models were the first to include Apple Intelligence support (in beta), but lacked this single AI feature. Apple previously confirmed that Visual Intelligence would come to the iPhone 15 Pro, but the company didn't say which software update would bring support. iOS 18.4 beta 1 lacked support on iPhone 15 Pro. 9to5Mac was first to report that Apple was developing support for invoking Visual Intelligence without Camera Control.

iPhone 15 Pro and iPhone 16e aren't the only models to include Visual Intelligence support through the Action button and Control Center. iOS 18.4 beta 2 also includes the ability for iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max to invoke Visual Intelligence as an Action button assigned task or through a Control Center tile.

No word yet on support for the iPad, which otherwise supports Apple Intelligence on newer hardware.
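For a sense of what a Control Center tile involves under the hood, iOS 18's WidgetKit lets third-party apps publish their own controls backed by an App Intent. Here's a minimal sketch using that public API; all names here are hypothetical, and Apple's own Visual Intelligence tile is a system control rather than something built this way.

```swift
import WidgetKit
import SwiftUI
import AppIntents

// Sketch of an iOS 18 Control Center control that launches an app action,
// analogous in spirit to the Visual Intelligence tile. All names are hypothetical.
@available(iOS 18.0, *)
struct ScannerControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.app.scanner") {
            // Tapping the control runs the intent, which opens the app's scanner.
            ControlWidgetButton(action: ScannerLaunchIntent()) {
                Label("Scanner", systemImage: "camera.viewfinder")
            }
        }
    }
}

// The intent the control triggers; it opens the app when run.
struct ScannerLaunchIntent: AppIntent {
    static var title: LocalizedStringResource = "Launch Scanner"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult { .result() }
}
```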
[3]
The iPhone 15 Pro will get Visual Intelligence with iOS 18.4
What started as an Apple Intelligence feature exclusive to the Camera Control-endowed iPhone 16 line is coming to older iPhones, and soon. We already knew that the iPhone 15 Pro and Pro Max would get Visual Intelligence at some point in the future, and thanks to 9to5Mac, we now know it's one of several options you can assign to the Action Button in the second iOS 18.4 beta. That likely means the feature could end up in the final release of the update.

Visual Intelligence lets you draw on AI models from Google and OpenAI to find information (and websites) about anything you point your iPhone's camera at. You can also use the feature to add information from a flyer to your calendar and, oddly, identify dog breeds. Until recently, the feature had to be summoned with a long-press of Camera Control on an iPhone 16, but as of the release of the iPhone 16e, Apple made it possible to use an Action Button to pull it up, too. Considering the iPhone 15 Pro's A17 Pro chip offers enough RAM to enable other Apple Intelligence features, it makes sense that its Action Button shouldn't be left out of the fun.

iOS 18.4 is currently in beta and is expected to launch in early April. Alongside expanding the number of phones that can run Visual Intelligence, Apple is also using the update to launch a new recipe section in Apple News called Apple News+ Food. Previously, Bloomberg reported that iOS 18.4 was supposed to also mark the launch of Apple's upgraded Siri, which is supposed to have the ability to see and take action inside of apps, but that feature is now coming later.
[4]
Visual Intelligence comes to iPhone 15 Pro and Pro Max
Apple has introduced the Visual Intelligence feature, akin to Google Lens, in the latest iOS 18.4 developer beta for the iPhone 15 Pro and iPhone 15 Pro Max, as reported by 9to5Mac. The feature had been confirmed previously for these devices, but without a specified software update timeline. The rollout of iOS 18.4, which is expected in April, will broadly deliver this capability.

Initially launched for the iPhone 16 lineup in September and accessed via a Camera Control button, Visual Intelligence will be available to iPhone 15 Pro and Pro Max users through the Action Button or Control Center, similar to the iPhone 16e, which lacks a Camera Control button. According to 9to5Mac, the iOS 18.4 developer beta 2 update also adds the Action Button and Control Center options for Visual Intelligence across the entire iPhone 16 lineup. The update gives iPhone 15 Pro and iPhone 15 Pro Max users access to the Google- and ChatGPT-connected visual search feature for the first time outside of the iPhone 16 series.

Prior to this, Visual Intelligence was only accessible through a long-press of the Camera Control on the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max. The iPhone 16e's introduction prompted Apple to extend Visual Intelligence to the Action Button and Control Center for convenience.

Starting with iOS 18.4 beta 2, Visual Intelligence becomes functional for iPhone 15 Pro and iPhone 15 Pro Max users for the first time. While these models were the first to incorporate Apple Intelligence support in beta, they previously lacked this specific AI feature. Apple confirmed the addition of Visual Intelligence for the iPhone 15 Pro, although no specific software update was cited initially. Notably, iOS 18.4 beta 1 did not include support for the iPhone 15 Pro. 9to5Mac first reported on Apple developing options for invoking Visual Intelligence without the Camera Control button.

In addition to the iPhone 15 Pro and iPhone 16e, iOS 18.4 beta 2 allows the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max to invoke Visual Intelligence through assigned Action button tasks or Control Center tiles. There has been no announcement regarding support for iPad models, despite their compatibility with Apple Intelligence on newer devices.
[5]
iPhone 15 Pro users just got a major AI upgrade with Visual Intelligence
Owners of the iPhone 15 Pro and Pro Max can now tap into a helpful AI-powered feature, courtesy of the latest iOS 18.4 developer beta. Launched on Monday, the new beta gives users of these older phones the ability to set up and use Visual Intelligence. Previously accessible only on the iPhone 16, Visual Intelligence lets you run web searches and ask questions about the people, places, and things you view through the camera.

Beyond supporting the AI-powered feature, the new beta adds a couple of new ways to trigger it. The four iPhone 16 models use the physical Camera Control to launch Visual Intelligence, but that button doesn't exist on the iPhone 15 Pro or Pro Max. Instead, users of the iPhone 15 Pro models and the iPhone 16 can now use the Action button to activate Visual Intelligence. Since the Action button is customizable, you can set it up to perform a variety of different actions.

With the new beta, you can also launch Visual Intelligence directly from Control Center. Swiping down from the top of the screen reveals a new Apple Intelligence section with options for activating Siri, using the Type to Siri feature, and triggering Visual Intelligence.

Here's how this would work on the latest developer beta. On an iPhone 15 Pro, Pro Max, or any iPhone 16, you'd head to Settings and select the option for Action Button. Swiping through the next screen would show all the potential actions you're able to set. After finding the one for Visual Intelligence, you'd exit the Settings screen.

Now let's say you spot an animal, plant, landmark, business, or other item that strikes your curiosity. Aim your phone at the object and hold down the Action button or select Visual Intelligence from Control Center. The next screen gives you two choices. Tap Search to run a Google Search on the object. Otherwise, tap Ask, and you can pose questions about the object that ChatGPT will attempt to answer.

Last month, Apple representatives confirmed to Daring Fireball's John Gruber that the iPhone 15 Pro and Pro Max would receive Visual Intelligence. Though the company didn't specify when that would happen, Gruber said he believed it would arrive with iOS 18.4. With this version only on its second developer beta, we have a few more iterations to go until the official release arrives in April.
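Apple doesn't expose Visual Intelligence's own pipeline, but the on-device identification half of the experience (animals, plants, and the like) resembles what the Vision framework's image classifier has offered developers for a while. Here's a small sketch of that kind of lookup, offered purely as an illustration of the technique; it is not how Apple implements Visual Intelligence, and the 0.3 confidence threshold is an arbitrary choice.

```swift
import Vision
import CoreGraphics

// Illustrative sketch: on-device image classification with Vision, loosely
// similar to the "identify an animal or plant" part of Visual Intelligence.
// Not Apple's actual Visual Intelligence implementation.
func classifyObject(in image: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // Keep only reasonably confident labels; the 0.3 cutoff is arbitrary.
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .map { "\($0.identifier) (confidence: \(String(format: "%.2f", $0.confidence)))" }
}
```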
Apple's latest iOS 18.4 beta brings Visual Intelligence, an AI-powered visual search feature, to iPhone 15 Pro and Pro Max models, expanding its availability beyond the iPhone 16 lineup.
Apple's latest iOS 18.4 beta 2 update brings a significant AI feature, Visual Intelligence, to iPhone 15 Pro and iPhone 15 Pro Max models. This expansion marks a notable shift in Apple's AI strategy, extending advanced capabilities to devices beyond the latest iPhone 16 lineup [1][2].
Visual Intelligence is Apple's answer to Google Lens, allowing users to gather information and perform actions based on what they point their phone's camera at. This AI-powered feature can identify objects, plants, and animals, and even extract information from text in images [1][3].
Because older models lack the Camera Control button, Apple has introduced alternative ways to access Visual Intelligence:
- Action button: assign Visual Intelligence as the press-and-hold action in Settings [1][5]
- Control Center: trigger it from the new Apple Intelligence section of controls [2][5]
- Lock Screen: add a Visual Intelligence shortcut to the bottom of the lock screen [1]
These methods are available not only for iPhone 15 Pro models but also for the entire iPhone 16 lineup, including the recently introduced iPhone 16e [2][4].
Visual Intelligence offers a range of functionalities:
- Identifying objects, plants, and animals through the camera [1][3]
- Extracting phone numbers, URLs, and dates from signs and posters, including creating calendar events with the details pre-filled [1]
- Running a Google search on whatever you capture via the Search button [1][5]
- Posing questions about the captured image to ChatGPT via the Ask button [1][5]
The feature requires sufficient RAM and processing power, which the A17 Pro chip in the iPhone 15 Pro models can provide [3]. Still in beta for now, iOS 18.4 is expected to see its official release in early April 2025 [3][5].
This update addresses a disparity where newer, less expensive models had features unavailable on older premium devices. It demonstrates Apple's commitment to expanding AI capabilities across its product line, potentially increasing user engagement with AI features [1][4].
While Visual Intelligence is expanding to more iPhone models, there's no word yet on support for iPad devices [2]. The iOS 18.4 update is also expected to introduce other features, such as a new recipe section in Apple News called Apple News+ Food [3].
As Apple continues to develop its AI capabilities, users can expect further enhancements and integrations of Visual Intelligence and other Apple Intelligence features across its ecosystem [4][5].