Curated by THEOUTPOST
On Thu, 20 Feb, 4:05 PM UTC
12 Sources
[1]
iPhone 15 Pro is getting Apple's AI-powered Visual Intelligence
iPhone 15 Pro owners will be able to use the Action button to get details on items they capture through the camera - once iOS 18.4 expands the skill beyond the iPhone 16.

Apple's Visual Intelligence skill is expanding beyond the iPhone 16. The iPhone 15 Pro will soon get custody of the feature, which digs up details on objects you snap through the camera. In a post published on Wednesday, Daring Fireball's John Gruber said that Apple representatives told him that iPhone 15 Pro (and presumably iPhone 15 Pro Max) owners will be able to use Visual Intelligence on their devices. The company wouldn't reveal exactly when the feature would arrive beyond pointing to a "future software update." But Gruber said he believes it's destined for iOS 18.4, which should soon pop up in beta and is slated to go live in early April.

Also: How to use Visual Intelligence on an iPhone 16 to identify unknown objects

Available on all current iPhone 16 models and the upcoming iPhone 16e, Visual Intelligence is an AI-based feature designed to help you identify and learn about animals, plants, landmarks, businesses, and a host of other items you view with your phone's camera. Part of Apple Intelligence, the tool is quick and easy to use. Just aim your phone at the item you want to investigate. Upon long pressing the Camera Control on any iPhone 16 model, you can either run a Google search on the object or ask ChatGPT specific questions about it. In response, the AI presents you with the requested details.

There's only one hiccup. The four existing iPhone 16 models use the physical Camera Control to trigger Visual Intelligence. That button doesn't exist on the iPhone 15 Pro - or the iPhone 16e, for that matter. No problem: users of the 15 Pro models and the 16e will instead be able to trigger Visual Intelligence by pressing the Action button. Nestled above the volume controls, this button is customizable, so you can set it to a variety of actions.

And there's more, according to Gruber. Apple is also adding a Control Center button that will launch Visual Intelligence, meaning you'll be able to activate it just by swiping down on the screen and tapping the appropriate button. That option is headed for the iPhone 15 Pro models and presumably the iPhone 16 series.

Also: I bought an iPhone 16 for its AI features, but I haven't used them even once - here's why

I welcome the news that Apple is expanding Visual Intelligence. I often use the feature on my iPhone 16 Pro and find it quite helpful. Depending on the object I scan, sometimes the information I get is too generic, but I can ask ChatGPT more than one question to zero in on the details I want. If Visual Intelligence for the iPhone 15 Pro and the Control Center button are slated for iOS 18.4, then iOS beta users should have a chance to try out these features fairly soon.
[2]
Visual Intelligence coming to iPhone 15 Pro with a future update
Apple Intelligence is available for iPhone 15 Pro and later, but there's an exclusive feature for iPhone 16 models: Visual Intelligence. Some believed this was because the iPhone 15 Pro lacks Camera Control, but that all changed with the iPhone 16e - which supports Visual Intelligence. Now Apple has confirmed that the feature will soon be available to iPhone 15 Pro owners too.

An Apple spokesperson told Daring Fireball's John Gruber (via MacMagazine) that a future iOS update will enable Visual Intelligence on iPhone 15 Pro models through the Action Button, just like on the recently announced iPhone 16e. There are no details on when exactly this will happen, but presumably the feature will be added with iOS 18.4.

For those unfamiliar, Visual Intelligence is a feature that is part of Apple Intelligence and lets people use the camera to learn more about the places and objects around them. The feature can also detect text and provide translations and summaries, or even read the text aloud thanks to AI. In addition, Visual Intelligence lets users expand their searches with results from Google or ChatGPT.

On iPhone 16 models, Visual Intelligence is activated by pressing and holding Camera Control. For iPhone 16e users, the feature is available through the Action Button or a new Control Center toggle. The same will apply to the iPhone 15 Pro with the promised software update. 9to5Mac had already reported that Apple had been exploring this new way of opening Visual Intelligence since the iOS 18.3 beta.

Apple is expected to release the first beta of iOS 18.4 in the next few days. In addition to this change, the update will also expand Apple Intelligence support to more languages for the first time. However, according to recent rumors, the company may end up postponing the more advanced Siri with onscreen awareness due to engineering challenges.

Are you excited to have Visual Intelligence on the iPhone 15 Pro? Let us know in the comments section below.
[3]
iPhone 15 Pro gets a secret AI upgrade you didn't know about
Apple is set to bring Visual Intelligence, a feature similar to Google Lens, to the iPhone 15 Pro through a future software update, as confirmed by Apple representatives to John Gruber of Daring Fireball.

Visual Intelligence was first introduced with the iPhone 16 series in September. Initially, it was designed to be accessed via the Camera Control button. However, it was recently announced that the feature would also be available on the iPhone 16e, which lacks this button and will instead make the feature accessible through its Action Button. This development indicated that the iPhone 15 Pro, which also includes an Action Button, could support Visual Intelligence. Apple has since confirmed that users will be able to launch Visual Intelligence using the Action Button or from the Control Center on the iPhone 15 Pro.

While Apple did not specify which software update will include Visual Intelligence for the iPhone 15 Pro, Gruber speculates that it may be part of the forthcoming iOS 18.4, which could begin its developer beta rollout soon and is expected to be publicly available in early April.

We compare the iPhone 16e, iPhone 13, iPhone 14 and iPhone SE 3

iPhone 15 Pro owners will benefit from this feature, which is designed to enhance the phone's capabilities by allowing users to analyze real-world objects in real time. Visual Intelligence is part of Apple's suite of AI features and includes persistent shortcuts to tools such as ChatGPT and Google Image Search. For instance, if a user stumbles upon a unique set of towels, they can use Visual Intelligence to initiate a Google search for similar products or ask ChatGPT directly where to purchase them.

In addition to these capabilities, Visual Intelligence allows users to interact with text by translating, reading aloud, or summarizing it. Users can also obtain information about businesses by pointing their devices at them to view operational hours, menus, services, and purchase options.

As such, iPhone 15 Pro and Pro Max users can anticipate experiencing these features in the near future, particularly those willing to participate in beta testing of upcoming software updates.
[4]
This useful Apple Intelligence camera feature is coming to iPhone 15 Pro - here's how it works
iPhone 15 Pro owners will be able to use the Action button to get details on items they capture through the camera - once iOS 18.4 expands the skill beyond the iPhone 16.

Apple's Visual Intelligence skill is expanding beyond the iPhone 16. The iPhone 15 Pro will soon get custody of the feature, which digs up details on objects you snap through the camera. In a post published on Wednesday, Daring Fireball's John Gruber said that Apple representatives told him that iPhone 15 Pro (and presumably iPhone 15 Pro Max) owners will be able to use Visual Intelligence on their devices. The company wouldn't reveal exactly when the feature would arrive beyond pointing to a "future software update." But Gruber said he believes it's destined for iOS 18.4, which should soon be available in beta and is slated to go live in early April.

Also: How to use Visual Intelligence on an iPhone 16 to identify unknown objects

Available on all current iPhone 16 models and the upcoming iPhone 16e, Visual Intelligence is an AI-based feature designed to help you identify and learn about animals, plants, landmarks, businesses, and a host of other items you view with your phone's camera. Part of Apple Intelligence, the tool is quick and easy to use. Just aim your phone at the item you want to investigate. Upon long pressing the Camera Control on any iPhone 16 model, you can either run a Google search on the object or ask ChatGPT specific questions about it. In response, the AI presents you with the requested details.

There's only one hiccup. The four existing iPhone 16 models use the physical Camera Control to trigger Visual Intelligence. That button doesn't exist on the iPhone 15 Pro - or the iPhone 16e, for that matter. No problem: users of the 15 Pro models and the 16e will instead be able to trigger Visual Intelligence by pressing the Action button. Nestled above the volume controls, this button is customizable, so you can set it to a variety of actions.

And there's more, according to Gruber. Apple is also adding a Control Center button that will launch Visual Intelligence, meaning you'll be able to activate it just by swiping down on the screen and tapping the appropriate button. That option is headed for the iPhone 15 Pro models and presumably the iPhone 16 series.

Also: I bought an iPhone 16 for its AI features, but I haven't used them even once - here's why

I welcome the news that Apple is expanding Visual Intelligence. I often use the feature on my iPhone 16 Pro and find it quite helpful. Depending on the object I scan, sometimes the information I get is too generic, but I can ask ChatGPT more than one question to zero in on the details I want. If Visual Intelligence for the iPhone 15 Pro and the Control Center button are slated for iOS 18.4, then iOS beta users should have a chance to try out these features fairly soon.
[5]
No Camera Control? No problem, your iPhone 15 Pro will soon get Visual Intelligence
iPhone 16e gets Visual Intelligence! iPhone 15 Pro gets Visual Intelligence! Everybody gets Visual Intelligence! The shocks just keep on coming.

After unveiling the iPhone 16e earlier this week, Apple has now revealed that the iPhone 15 Pro is going to get Visual Intelligence despite not having the Camera Control button normally used to activate it.

The clues to this surprise, in fact, were there to be spotted at the 16e launch, as that phone is in exactly the same boat. It too does not have a Camera Control button (one of many compromises to keep down the price), yet it will support Visual Intelligence out of the box. If the iPhone 16e can offer this feature, it stands to reason that any handset with the specs to run Apple Intelligence can do likewise.

The iPhone 15 Pro obviously did not support Visual Intelligence out of the box, since the feature did not exist at that time. But as reported by John Gruber of Daring Fireball, Visual Intelligence will come to the iPhone 15 Pro "in a future software update." That's almost certain to be iOS 18.4, which is slated for an early April release but hasn't begun beta testing yet.

Visual Intelligence is one of the marquee features of Apple Intelligence. It's used to deduce information from visual images: you can point your camera at a pair of headphones, for instance, and Visual Intelligence will do its best to tell you what they are and where you can buy them. It can identify plants and animals, summarize text, create Calendar events from flyers, and get information about businesses from a glance at their frontage. You activate it by pressing and holding the Camera Control... at least, that's how you activate it for now. We don't yet know how it will work on the iPhone 16e and 15 Pro.

For all the latest information about Apple's AI features and when they will roll out in your country, check out our Apple Intelligence megaguide.
[6]
An iOS update will give iPhone 15 Pro owners Visual Intelligence
Owners of the iPhone 16 and 16 Pro can trigger Visual Intelligence with a long press of their dedicated camera button. But like the recently announced iPhone 16e (which also supports the feature), the iPhone 15 Pro and Pro Max don't have a physical camera button. So, all three phones will have to assign it to the Action button or use a Control Center shortcut, which will arrive in an upcoming software update.

Apple hasn't said which iOS version will bring the Apple Intelligence visual search feature to the iPhone 15 Pro series. However, Daring Fireball's John Gruber suspects it will be in iOS 18.4, which could arrive "any day now" for beta testers.

Part of the Apple Intelligence suite of AI features, Visual Intelligence lets you point your camera at something and use AI to analyze it in real time. It does a few things on its own, but it gets more useful with info from its persistent onscreen shortcuts to ChatGPT or Google Image Search. So, say you find a set of towels in your closet with a unique pattern. You really like those dang towels and want to buy more, but you can't remember where you got them. Activate Visual Intelligence, choose the Google search shortcut and see if your beloved rags are among the web results that pop up. Alternatively, you could ask ChatGPT about the product and where to order it.

Visual Intelligence can also do a few things without the help of Google or OpenAI. You can interact with text: translate it, read it aloud and summarize it. Or learn about a business you point your phone toward: view its hours, menu and services, or buy something from it.

So, iPhone 15 Pro and Pro Max owners should get a taste of that before long - and perhaps even sooner for those willing to brave the (sometimes rough) waters of beta software.
[7]
The iPhone 15 Pro is getting one of the best iPhone 16 Pro features for free soon
For the most part, owners of the iPhone 15 Pro and iPhone 15 Pro Max have had access to all the same Apple Intelligence features that launched or are rolling out to the iPhone 16 lineup. However, one massive AI tool, Visual Intelligence, has not been available to iPhone 15 Pro models. There was speculation that this was due to the Camera Control button, which the iPhone 15 Pro lacks. That changed with the release of the iPhone 16e, which also doesn't have this button, yet this new budget Apple phone will support Visual Intelligence.

According to MacMagazine, an Apple spokesperson told Daring Fireball that an iOS update will enable Visual Intelligence on iPhone 15 Pro phones via the Action button. This is similar to the recently announced 16e, which also features the Action button that debuted on the iPhone 15. On the iPhone 16 series, you can open Visual Intelligence by pressing and holding Camera Control. With the 16e, it's available via the Action button or a new Control Center toggle. This will also be how it works on the iPhone 15 Pro and Pro Max.

As a reminder, Visual Intelligence is Apple's version of Google Lens, letting you use your iPhone camera to learn about objects and places around you. The feature can detect text and provide translations and summaries, and it can also read text aloud. Additionally, it allows you to search for more information via Google or ChatGPT.

With Apple struggling to get Siri 2.0 off the ground and the iOS 18.4 developer beta possibly launching before the end of February, we might not see this update until iOS 18.5, which may not launch until May. Though if Apple does have Visual Intelligence ready for the 16e and iPhone 15 Pro models, we could see it materialize in mid-March or early April, when iOS 18.4 might release. Beyond Siri, iOS 18.4 should have new emojis and support for other languages, but without Siri 2.0 it's looking like a slightly less meaty update. Visual Intelligence could beef up that update a bit, at least for iPhone 15 Pro owners.
[8]
The iPhone 15 Pro could get one of my favorite Apple Intelligence features soon - and it's about time
Visual Intelligence allows you to get information on objects you point your camera towards.

Earlier today I wrote about how the iPhone 16e getting Visual Intelligence was a sucker punch to iPhone 15 Pro owners, who still didn't have access to one of Apple Intelligence's best tools. Well, it looks like iPhone 15 Pro owners are in luck, as it has now been confirmed that Visual Intelligence is coming to the 2023 smartphone.

An Apple spokesperson told John Gruber from Daring Fireball that "owners of the iPhone 15 Pro will soon be able to bind their Action Button to Visual Intelligence" and that this will come "in a future software update". Not only will iPhone 15 Pro owners get Visual Intelligence via the Action Button, but the Apple representatives also confirmed that the "Control Center button to launch Visual Intelligence is also coming to iPhone 15 Pro (and presumably iPhone 16 models, too)."

This is huge news for iPhone 15 Pro owners who have been on the fence about whether or not to upgrade to an iPhone 16, 16 Pro, or the new 16e just for Visual Intelligence. Personally, I don't think Visual Intelligence is enough to warrant an upgrade, although it was a major deciding factor for me when I purchased the 16 Pro back in September. Had I known the 15 Pro would get Visual Intelligence, I may not have opted for a new device this year. That said, if you were holding out for an upgrade and are perfectly happy with the iPhone 15 Pro (which you should be - it's an excellent device), then you've now got a major new AI tool coming in a future update.

The biggest shift with this news is the fact that the iPhone 15 Pro now matches the latest flagship iPhones in terms of Apple Intelligence functionality. In my testing of Apple AI across multiple devices, I've found the performance to be pretty equal across Mac, iPhone, and iPad. That means I fully expect iPhone 15 Pro users to get as good an experience as iPhone 16 owners when it comes to Visual Intelligence.

Visual Intelligence is an interesting Apple Intelligence feature - it allows users to point their camera at an object and get information directly from ChatGPT or Google Search. Visual Intelligence can identify plants and animals, and even add events to your Calendar. I've found Visual Intelligence to be a useful tool thanks to its dedicated launch button in the form of Camera Control: you can quickly pull your iPhone from your pocket and aim it at an object. While the iPhone 15 Pro and the iPhone 16e don't benefit from Camera Control, they're still getting an excellent new AI tool that might just come in handy when you least expect it.

Apple Intelligence is still in its infancy, and while we're still waiting for future updates like Siri with on-screen awareness and personal context (expected in iOS 18.4), there are still plenty of useful AI tools baked into the iPhone. It's worth noting that Apple Intelligence is still in beta, according to Apple, which means that the software experience is continuously improving with each update. Visual Intelligence has come on leaps and bounds since it first launched as part of iOS 18.2 in December, and now that Apple is rolling the feature out to more devices, I'm excited to see how it evolves moving forward.

If you own an iPhone 15 Pro, there's really no need to upgrade for Apple Intelligence, and that's a great thing. iPhone 15 owners, however, well... your one-year-old iPhone is already left in an AI-less past.
[9]
iPhone 15 Pro Models Could Soon Get Visual Intelligence Features
Visual intelligence can be accessed via Control Center on the iPhone 16e.

iPhone 15 Pro and iPhone 15 Pro Max could soon support visual intelligence, a new report claimed. Visual intelligence in Apple Intelligence refers to tasks that require computer vision. These are currently only available on the iPhone 16 series, despite the iPhone 15 Pro models supporting on-device artificial intelligence (AI) processing. As per the report, the decision was taken after Apple found a solution to integrate these AI features in the recently launched iPhone 16e despite the lack of the Camera Control button.

John Gruber, the creator of the Markdown markup language, said in a blog post that iPhone 15 Pro models could get visual intelligence features as soon as April. Citing unnamed Apple representatives, he claimed that owners of the smartphone will be able to bind the feature to the Action Button. This is said to be rolled out with the iOS 18.4 update.

Notably, visual intelligence allows iPhone 16 series users to long press the Camera Control button to quickly use the camera's viewfinder to look up details about a business, translate text, summarise text, or read written text aloud, as well as identify plants, animals, and more. So far, the feature was exclusive to the latest generation of the iPhone, as the only way to access it was with the Camera Control button. However, with the launch of the iPhone 16e, Apple also added visual intelligence to the Control Center and added the ability to bind the feature to the Action Button. Now, it appears the same will be added to the iPhone 15 Pro models.

Notably, visual intelligence was the only Apple Intelligence feature that was not rolled out to last year's Pro models with the iOS 18.2 update. Many speculated that this was being done to create a distinction between the two generations; however, that may not be the case. Gruber speculated that the Cupertino-based tech giant was waiting for the launch of the iPhone 16e before expanding the feature to older devices.

Interestingly, Apple has also integrated visual intelligence with OpenAI's ChatGPT, giving users an option to ask the chatbot queries about surrounding objects.
[10]
Finally, My iPhone 15 Pro Is Getting the Visual Intelligence Upgrade It Deserves
When Apple announced the iPhone 16E yesterday, it also confirmed that the new budget phone will get Apple Intelligence's "Visual Intelligence" feature, marking the first time the AI trick will come to a phone without a "Camera Control" button. While the other iPhone 16 series phones use their Camera Control buttons to access Visual Intelligence, the iPhone 16E can instead map it to its Action Button, a simple change that raises the question: why not the iPhone 15 Pro, too?

Personally, as an iPhone 15 Pro owner, I've been asking that question for months now, as I've long suspected my phone's internals were definitely capable of it -- it can run every other Apple Intelligence feature without issue. It instead seemed to me like Apple was arbitrarily holding the feature back because it wanted to tie it to a specific button press I didn't have.

Well, with the iPhone 16E adopting the Action Button workaround, it seems like Apple's finally listening. Apple representatives have now confirmed that Visual Intelligence will be coming to the iPhone 15 Pro as well, using the same strategy. Speaking to Daring Fireball's John Gruber, an Apple spokesperson said that the iPhone 15 Pro will indeed get Visual Intelligence "in a future software update," and that users could map it to the Action Button. Sweet vindication.

There's no word on when exactly that software update will come, and to be honest, I'm not sure if I'll use Visual Intelligence much, but it's encouraging to see my phone's software not get held back by an arbitrary push for hardware cohesion anymore.

For the uninitiated, Visual Intelligence brings AI to your iPhone's camera. You can point your camera at a foreign-language menu, for instance, to get a translation, or point it at a book to get a summary of what's on the page, or point it at a dog to try to find out what breed it is. It can also surface information about businesses simply by looking at their storefront or signage (in the United States only), and it works with Google and ChatGPT for extended search queries. In other words, it's similar to Google Lens, but puts AI first and is built into your operating system. Again, I've been prevented from playing around with it much, but hey, at least I now have the option.
[11]
Apple's Visual Intelligence May Come To iPhone 15 Pro: Here's How It Can Transform Your Camera With AI-Powered Real-Time Search - Apple (NASDAQ:AAPL)
Apple Inc.'s AAPL AI-driven Visual Intelligence feature is reportedly coming to iPhone 15 Pro models in an upcoming iOS update.

What Happened: Apple Intelligence, the company's suite of AI-powered features, includes Visual Intelligence, which allows users to analyze objects, translate text, and search for information in real time using their iPhone's camera. While iPhone 16 and 16 Pro users can trigger the feature using a dedicated camera button, the iPhone 15 Pro and Pro Max and the newly launched iPhone 16e lack this hardware.

See Also: Apple Stock Has Run Up Over 10% In A Month, Jim Cramer Says iPhone Maker Is 'Ramping' And The Credit Goes To Chinese Subsidies

"Apple's solution to enable visual intelligence on the iPhone 16e, despite it lacking the Camera Control button, is two-fold. First, visual intelligence is now available in the Control Center. Second, as demonstrated by Apple in the announcement video, you can assign the Action Button to visual intelligence," said Daring Fireball's John Gruber. Citing an Apple representative, Gruber further said that the iPhone 15 Pro will likewise get Visual Intelligence via the Action button or a Control Center shortcut.

Apple hasn't confirmed which iOS version will bring Visual Intelligence to iPhone 15 Pro models, but Gruber speculates it may arrive in iOS 18.4, which could enter beta testing "any day now."

Why It Matters: Apple's decision to extend Visual Intelligence to iPhone 15 Pro users reduces the pressure to upgrade to the iPhone 16 series for AI-driven camera enhancements. Since its launch in September 2023, the iPhone 15 Pro series has seen significant adoption, with the Pro and Pro Max models accounting for 5.44% and 6.83% of the market by November 2024, according to Statista.

Read Next: Apple's iPhone 16e To Drive Revenue Growth For Tim Cook's Company, Says Top Analyst

Disclaimer: This content was partially produced with the help of AI tools and was reviewed and published by Benzinga editors. Photo courtesy: Shutterstock
[12]
iPhone 16e has Apple Intelligence's best AI tool, but you don't use Camera Control to activate it - 15 Pro owners should be mad
Visual Intelligence lets you point your iPhone's camera at an object and get information on what you're looking at, whether that's through ChatGPT or via Google Search. The feature is one of the best use cases for Apple Intelligence and, until now, has been tied to Camera Control on the best iPhones (it's launched with a long press of the Camera Control toggle). On the newly announced iPhone 16e, however, Visual Intelligence is activated differently due to the lack of Camera Control. Instead, you can assign the feature to the Action Button or access it via Control Center.

While it's excellent that the 16e has full Apple Intelligence functionality thanks to the A18 chip and 8GB of RAM, missing a dedicated way to launch Visual Intelligence without assigning the feature to the customizable Action Button does slightly limit the user. The Action Button can be used to launch or perform many different shortcuts, and having Visual Intelligence assigned to a separate input like Camera Control allows iPhone 16 and iPhone 16 Pro users to get the most from their devices. On the 16e, users will have to choose between Visual Intelligence and one of the countless other shortcuts you can assign to the quick toggle. If you don't want to assign Visual Intelligence to the Action Button on the iPhone 16e, you can access the AI tool from Control Center, although that requires extra input, such as dragging down from the top of your display.

The iPhone 16e is available for preorder starting February 21 and will be available from February 28. Prices start from $599 / £599 / AU$999.

While the iPhone 16e has Visual Intelligence functionality without Camera Control, the iPhone 15 Pro does not, despite having everything else Apple Intelligence offers. Up until the iPhone 16e reveal, it was presumed that Visual Intelligence was not possible on the 15 Pro due to the lack of Camera Control, which was introduced with the iPhone 16 lineup. Instead, it appears that the iPhone 15 Pro, just like the iPhone 16e, could access Visual Intelligence in other ways, though Apple, for some reason, has decided against expanding the feature to this particular model - at least for now.

As someone who upgraded from the iPhone 15 Pro to the iPhone 16 Pro to try out Visual Intelligence, I'm disappointed that Apple could've in fact added the Google Lens competitor to the older device immediately (or so it seems). Of course, now that the latest member of the iPhone family, the 16e, has Visual Intelligence without a Camera Control toggle, iPhone 15 Pro owners might see the feature come to their devices in the future. I'm not holding my breath, though.
Apple's AI-powered Visual Intelligence feature, previously exclusive to iPhone 16 models, will soon be available on iPhone 15 Pro devices through a future software update, likely iOS 18.4.
In a surprising move, Apple has announced that its AI-powered Visual Intelligence feature will soon be available on iPhone 15 Pro models. This expansion brings advanced object recognition capabilities to a wider range of devices, bridging the gap between older and newer iPhone models in terms of AI functionality [1][2].

Visual Intelligence is part of Apple's suite of AI features, designed to help users identify and learn about objects, animals, plants, landmarks, and businesses through their device's camera. The feature also offers text detection, translation, and summarization capabilities, powered by artificial intelligence [3].

Users can activate Visual Intelligence by aiming their phone's camera at an object of interest. The AI then analyzes the image and provides detailed information about the subject. Additionally, users can perform Google searches or ask ChatGPT specific questions about the identified objects, enhancing the depth of information available [1][4].

While Apple has not specified an exact release date, industry experts, including John Gruber of Daring Fireball, speculate that Visual Intelligence will be added to iPhone 15 Pro models with the iOS 18.4 update. This update is expected to begin beta testing soon and may be publicly available in early April 2025 [1][2][5].

The expansion of Visual Intelligence to iPhone 15 Pro models demonstrates Apple's commitment to bringing advanced AI features to a broader user base. This move could potentially influence consumer decisions, as it adds significant value to slightly older iPhone models [3][5].

Apple is also adding a Control Center button to launch Visual Intelligence, making the feature more accessible across supported devices. Furthermore, iOS 18.4 is expected to expand Apple Intelligence support to more languages, although some advanced features like Siri with onscreen awareness may face delays due to engineering challenges [2][4].

As the AI landscape in mobile devices continues to evolve, Apple's expansion of Visual Intelligence to iPhone 15 Pro models represents a significant step in democratizing advanced AI features across its product line. This development not only enhances the capabilities of existing devices but also sets the stage for future innovations in mobile AI technology [3][5].
© 2025 TheOutpost.AI All rights reserved