6 Sources
[1]
8 AI features coming to iOS 26 that actually excite me (and how to try them now)
These features can do everything from mixing music to translating audio for you in near real-time.

Apple's Worldwide Developers Conference (WWDC) was expected to have little AI news -- but Apple proved everyone wrong. Even though Apple has not yet launched the highly anticipated Siri upgrade -- the company said we will hear more about it in the coming year -- it unveiled a slew of AI features at last week's event across its devices and operating systems, including iOS, MacOS, WatchOS, and iPadOS.

Also: ZDNET's WWDC 2025 recap with Sabrina Ortiz and Jason Hiner

While the features weren't the flashiest, many of them address issues that Apple users have long had with their devices or in their everyday workflows, while others are downright fun. I gathered the AI features announced and ranked them according to what I am most excited to use and what people on the web have been buzzing about.

Apple introduced Visual Intelligence last year with the launch of the iPhone 16. At the time, it allowed users to take photos of objects around them and then use the iPhone's AI capability to search for them and find more information. Last week, Apple upgraded the experience by adding Visual Intelligence to the iPhone screen. To use it, you just have to take a screenshot. Visual Intelligence can use Apple Intelligence to grab the details from your screenshot and suggest actions, such as adding an event to your calendar. You can also use the "ask button" to ask ChatGPT for help with a particular image, which is useful for tasks where ChatGPT could provide assistance, such as solving a puzzle. You can also tap "Search" to look on the web.

Also: Your entire iPhone screen is now searchable with new Visual Intelligence features

Although Google already offered the same capability years ago with Circle to Search, this is a big win for Apple users, as the feature is functional and well executed. Apple leverages ChatGPT's already capable models rather than trying to build an inferior one itself.

Since generative AI exploded in popularity, one useful application that has emerged is real-time translation. Because LLMs have a deep understanding of language and how people speak, they can translate speech not just literally but also accurately, using additional context. Apple will roll out this capability across its own devices with a new real-time translation feature.

Also: Apple Intelligence is getting more languages - and AI-powered translation

The feature can translate text in Messages and audio on FaceTime and phone calls. If you are using it for verbal conversations, you just click a button on your call, which alerts the other person that live translation is about to take place. After a speaker says something, there is a brief pause, and then you get audio feedback with a conversational version of what was said in your language of choice, along with a transcript you can follow. This feature is valuable because it can help people who speak different languages communicate, and it is easy to access because it is baked into communication platforms people already rely on every day.

The AutoMix feature uses AI to add seamless transitions from one song to another, mimicking what a DJ would do with time stretching and beat matching. In the settings app on the beta, Apple says, "Songs transition at the perfect moment, based on analysis of the key and tempo of the music."
Also: Your Apple CarPlay is getting a big update: 3 useful features coming with iOS 26

AutoMix works in tandem with the Autoplay feature, so if you are playing a song, it can, like a DJ, play another song that matches the vibe while seamlessly transitioning to it. Many users are already trying it on their devices in the developer beta and taking to social media to post the pretty impressive results.

Apple made its on-device model available to developers for the first time, and although that may seem like it would only benefit developers, it's a major win for all users. Apple has a robust community of developers who build applications for its operating systems. Tapping into that talent by allowing them to build on Apple Intelligence nearly guarantees that more innovative and useful applications using Apple Intelligence will emerge, which benefits all users, since they will be able to take advantage of them.

The Shortcuts update was easy to miss during the keynote, but it is one of the best use cases for AI. If you are like me, you typically avoid programming Shortcuts because they seem too complicated to create. This is where Apple Intelligence can help.

Also: My favorite iPhone productivity feature just got a major upgrade with iOS 26 (and it's not Siri)

With the new intelligent actions, you can tap into Apple Intelligence models either on-device or in Private Cloud Compute within your Shortcut, unlocking a new, much more advanced set of capabilities. For example, you could set up a Shortcut that takes all the files you add to your homepage and then sorts them into folders for you using Apple Intelligence. There is also a gallery where you can try out some of the features and get inspiration for building.

The Hold Assist feature is a prime example of a feature that is not over the top but has the potential to save you a lot of time in your everyday life. The way it works is simple: if you're placed on hold and your phone detects hold music, it will ask if you want it to hold your spot in line and notify you when it's your turn to speak with someone, alerting the person on the other end of the call that you will be right there.

Also: What Apple's controversial research paper really tells us about LLMs

Imagine how much time you will get back from calls with customer service. If the feature seems familiar, it's because Google has a similar "Hold for me" feature, which lets Call Assist wait on hold for you and notify you when an agent is back.

The Apple Vision Pro introduced the idea of enjoying your precious memories in an immersive experience that places you in the scene. However, to take advantage of this feature, you had to take spatial photos and record spatial videos. Now, a similar feature is coming to iOS, allowing users to transform any picture they have into a 3D-like image that separates the foreground and background for a spatial effect.

Also: iOS 26 will bring any photo on your iPhone to life with 3D spatial effects

The best part is that you can add these photos to your lock screen, and as you move your phone, the 3D element looks like it moves with it. It may seem like there is no AI involved, but according to Craig Federighi, Apple's SVP of software engineering, it can transform your 2D photo into a 3D effect by using "advanced computer vision techniques running on the Neural Engine."

Using AI for workout insights isn't new, as most fitness wearable companies, including Whoop and Oura, have implemented features of that sort before.
However, Workout Buddy is a unique feature and an application of AI I haven't seen before. Essentially, it uses your fitness data, such as history, paces, Activity Rings, and Training Load, to give you unique feedback as you are working out.

Also: Your Apple Watch is getting a major upgrade. Here are the best features in WatchOS 26

Even though this feature is part of the WatchOS upgrade -- and I don't happen to own an Apple Watch -- it does seem like a fun and original way to use AI. As someone who lacks all desire to work out, I can see how a motivational reminder could have a positive impact on the longevity of my workouts.

The list above is already pretty extensive, and yet Apple unveiled a lot more AI features. Apple released an iOS 26 beta geared toward developers. The beta gives users access to all of the latest features available in iOS 26, but the downside is that since the features are not fully developed, many are buggy. So, if you want to try the features, keep in mind that you won't get the best experience; you will want to use either a secondary phone or make sure your data is properly backed up in case something goes wrong. To get started, you need an iPhone 11 or newer running iOS 16.5 or later and an Apple ID enrolled in the Apple Developer Program. ZDNET's guide provides step-by-step instructions on downloading the beta.
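The article above notes that Apple has opened its on-device model to third-party developers for the first time. For readers curious what that looks like in practice, here is a minimal, hedged sketch of how an app might call that model through the Foundation Models framework Apple introduced at WWDC25. The framework, type, and method names used here (FoundationModels, LanguageModelSession, respond(to:)) follow Apple's WWDC25 developer materials but are shown as an assumption rather than a confirmed final API, and the call only works on Apple Intelligence-capable hardware.

```swift
import FoundationModels

// Hedged sketch: API names follow Apple's WWDC25 Foundation Models materials
// and may change before the final SDK ships. Requires an Apple Intelligence-
// capable device (iPhone 15 Pro or newer, or Apple silicon iPad/Mac).
func summarizeLectureNotes(_ notes: String) async throws -> String {
    // Open a session with the on-device system language model.
    let session = LanguageModelSession(
        instructions: "You summarize study notes into short bullet points."
    )

    // Send a prompt; inference runs on-device, so the text is not
    // sent to an external server.
    let response = try await session.respond(
        to: "Summarize the key takeaways from these notes:\n\(notes)"
    )

    // The generated text is returned in the response's content.
    return response.content
}
```

This mirrors the kind of third-party integration the article anticipates: an app developer wires a single prompt-and-response call into an existing feature rather than shipping their own model.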
[2]
All The iOS 26 iPhone Features You're Not Getting This Fall
iOS 26 is scheduled to drop this fall. Aside from the Liquid Glass design overhaul, the software fixes many pain points that have been annoying iPhone owners for many years now. While Apple didn't talk about the features that'll arrive later via point updates (iOS 26.1, and so on), we've managed to identify a few. We've also rounded up features limited to specific geographies or languages, and those requiring specific hardware.

iOS 26: Features Not Coming This Fall

The Siri debacle has reportedly taught Apple a valuable lesson about pre-announcing features well in advance that are not yet in a working state. "Apple, for the most part, will stop announcing features more than a few months before their official launch," Mark Gurman and Drake Bennett reported on Bloomberg in a piece discussing reasons for Apple's AI woes. They were right; this time around, the first developer betas of Apple's "26" operating systems included almost all announced features. There is also no asterisk next to some features in marketing materials, so we don't know whether anything is coming later in 2025 or 2026. However, if Gurman's (usually reliable) reporting is anything to go by, three features planned for iOS 26 may not be ready this fall.

An AI-Infused Siri With Personal Context Understanding

Gurman's June 12 piece claims Apple plans to unleash an AI-infused version of Siri with personal context understanding and in-app actions as part of an iOS 26.4 update coming in March or April 2026. He added that Apple's leadership team "has set an internal release target of spring 2026" for the belated Siri upgrade. The new features will permit the assistant to tap into personal data and on-device content, as well as perform complex actions by chaining together a series of in-app actions. "A further revamped voice assistant, dubbed LLM Siri internally, is still probably a year or two away -- at minimum -- from being introduced," he cautioned. A spring timeframe was semi-confirmed by software engineering chief Craig Federighi and marketing boss Greg Joswiak, whom Apple dispatched on a post-WWDC25 media tour to salvage Apple's reputation. In a series of interviews, the executives reluctantly confirmed that an AI upgrade for Siri is arriving in 2026 instead of 2025.

A Revamped Calendar App

Gurman reported ahead of WWDC25 that a redesigned Calendar app was coming to the iPhone, iPad, Mac, and Apple Watch, without offering further details. Apple's job posting later revealed that the company is seeking an engineer to "reimagine what a modern calendar can be across Apple's platforms," confirming the app is undergoing a major redesign. The French-language blog MacGeneration pointed to Apple's 2024 acquisition of the Canadian startup Mayday Labs, which makes the AI-powered Mayday calendar app, as an indication that Apple will probably leverage Mayday's technology to bolster its own Calendar app with AI smarts. As evidenced by Mayday's demo video, the app uses AI to schedule tasks and events at optimal times. Another AI feature, Calendar Shield, automatically blocks your calendar when it becomes overbooked with meetings. Gurman's pre-WWDC25 roundup states that an AI-enhanced Calendar app may not be ready at all until iOS 27. "It originally planned to introduce the software this year, but it's been delayed and is now slated for the subsequent set of operating systems," he said, adding that iOS 27 and macOS 27 are already in development.

Aside from the Liquid Glass design overhaul, there are no new features in the Calendar app on iOS 26, iPadOS 26, and macOS Tahoe 26. In fact, there haven't been any major improvements to the Calendar app in quite a few years, aside from the ability to create, view, edit, and complete reminders that was introduced in iOS 18, iPadOS 18, and macOS Sequoia.

A Revamped Health App With an AI Doctor

Gurman reported in March that Apple was going to overhaul the built-in Health app and bring in an AI-powered doctor, adding that Apple could bring "smaller changes" to the app this year. However, there are no new features in the Health app aside from Liquid Glass, as the major revamp seems to have been pushed back. "Apple has also been working on an end-to-end revamp of its Health app tied to an AI doctor-based service code-named Mulberry," he reported on the eve of the June 9 WWDC25 keynote. "Neither will be shown at WWDC and, due to delays, likely won't be released at full scale until the end of next year at the earliest, as part of Buttercup."

AI-Powered Battery Optimization

Gurman said earlier that iOS 26 would utilize Apple Intelligence to optimize battery performance on iPhones, but the feature may only become publicly available when the rumored ultra-thin iPhone 17 model drops later this year. Due to constrained space, the device should be equipped with a smaller battery, and AI could help extend run time. His May report said the feature analyzes user behavior to "make adjustments" to conserve energy. It will also display the time remaining until full charge on the Lock Screen. Apple didn't mention these features, but one of its press images shows this battery estimate on the iPhone's Lock Screen. Therefore, it's probably coming via an update to iOS 26 later this year.

3D Lock Screen Effect: iPhone 12 or Newer

iOS 26 lets you set a spatial photo as your Lock Screen wallpaper. iOS takes advantage of the underlying depth data to move different segments of an image, such as foreground subjects or objects in the background, with a 3D-like parallax effect as you tilt your device. Also, the current time grows and shrinks in size to dynamically adapt to the surrounding space. However, this feature won't work on the iPhone 11 family or on the second-generation iPhone SE and later SE models, because the 3D effect on the Lock Screen requires at least the iPhone 12.

Vision Pro Unlocking: Face ID Required

The visionOS 26 operating system lets Vision Pro owners automatically unlock their iPhone (iOS 26 required) while wearing the headset. This only works on Face ID-capable iPhones. Sorry, iPhone SE fans!

Features Requiring at Least an iPhone 15 Pro

iOS 26 also brings more than half a dozen features that rely on Apple Intelligence, machine learning, and large language models running on-device. Like the older Apple Intelligence features introduced in the past year, these capabilities require the processing power of the Apple A17 Pro chip or newer. In other words, the following features in iOS 26 and iPadOS 26 require at least the iPhone 15 Pro or iPhone 15 Pro Max (2023), seventh-generation iPad mini (2024), M1 iPad Air (2022), or M1 iPad Pro (2021).
Visual Intelligence on Screenshots

Visual Intelligence has gained onscreen awareness, meaning it can analyze anything you see on the iPhone's screen. All you need to do is take a screenshot by simultaneously pressing the volume up button and the side button, then exit the Markup interface by hitting the pen icon at the top to reveal the Visual Intelligence options. Visual Intelligence can be invoked by holding the Camera Control button on the iPhone 16 series (iOS 18.2 or higher required) or on the iPhone 16e running iOS 18.3 or newer. With the iPhone 15 Pro models, Visual Intelligence can be assigned to the Action Button or added as a Control Center toggle, which requires iOS 18.4 or later. Visual Intelligence may also highlight any objects you can tap. Or, you can draw a selection around anything on the screenshot, similar to Android's Circle to Search feature. iOS 26 bolsters Visual Intelligence with immediate identification of new types of objects, including art, books, landmarks, natural landmarks, and sculptures, not just animals and plants like before. Visual Intelligence produces relevant results via Google Image search, but Apple said it'll also be able to search using Etsy and other supported apps. You can also upload the screenshot to ChatGPT and ask the chatbot to describe it. And if Visual Intelligence finds event information in the screenshot you took, you'll see an option to add a new calendar entry.

Live Translation for Built-In Communication Apps

Apple has upgraded translation capabilities across its platforms with live translation in the built-in Phone, Messages, and FaceTime apps on the iPhone, iPad, Mac, and Apple Watch. Live translation uses Apple Intelligence; your device must meet the hardware requirements for Apple Intelligence to hear live translation spoken aloud on phone calls, see automatic translation in the Messages app as you type, and view real-time translation of foreign speakers on FaceTime calls. Real-time translation also works in the Phone, Messages, and FaceTime apps on the Apple Watch Series 9 and newer and the Apple Watch Ultra 2, provided the smartwatch is paired with an Apple Intelligence-compatible iPhone.

AI Backgrounds and Poll Suggestions in Messages

The iPhone's and iPad's built-in Messages app on iOS 26 at long last brings chat backgrounds and polls. You can pick a different wallpaper for each chat by selecting the "Backgrounds" tab on the chat info screen, create a color gradient background, or use a built-in background or your own image from the Photos library. These features don't require any special hardware and are available to all iOS 26 customers. But iOS 26's Messages app also lets you create a new AI chat wallpaper from scratch in Image Playground, which requires Apple Intelligence. In other words, Image Playground integration for AI chat background creation in Messages and poll suggestions in group chats only work on the iPhone 15 Pro and Apple silicon iPads.

Mixing Emoji Together With Genmoji

Genmoji, which lets you create custom emoji images based on people's photos, short textual descriptions, and other parameters, now lets you mix two emoji together and fine-tune them further with descriptions on iOS 26 and iPadOS 26. Plus, you can customize your generated character by changing expressions or adjusting personal attributes such as hairstyle. These new capabilities are only available on devices running Apple Intelligence.
New Image Playground Features

iOS 26 enhances Image Playground, an Apple Intelligence feature for creating AI images from prompts. You can now access new drawing styles powered by ChatGPT, including oil painting, watercolor effects, vector art, and anime. Like other Image Playground features, these new capabilities only work on the iPhone 15 Pro models and newer.

AI Workflows in the Shortcuts App

iOS 26 brings a new set of AI-powered actions in the built-in Shortcuts app, as well as dedicated actions for Apple Intelligence features, such as summarizing text with Writing Tools or creating images with Image Playground. On top of that, you can use responses from ChatGPT or Apple Intelligence as input in your own automated workflows. Apple said a student might want to tap into the Apple Intelligence model in their shortcut to compare an audio transcription of a class lecture with their notes and automatically add any missed key points or takeaways. All AI features in the Shortcuts app are only available on devices capable of running Apple Intelligence.

Order Tracking in the Wallet App

The iPhone's built-in Wallet app on iOS 26 can track orders by analyzing emails in your inbox sent by merchants or delivery carriers, without requiring their participation. Apple Intelligence provides a summary of your orders along with tracking information, progress notifications, and other useful details, right within the Wallet app. This feature is unavailable on devices incompatible with Apple Intelligence.

Reminder Suggestions and Automatic Categorization

Apple Intelligence on iOS 26 and iPadOS 26 powers suggestions for new tasks and grocery items in the built-in Reminders app. Apple Intelligence draws on emails you receive, websites you visit, notes you write, text found in Messages, and more. Based on those signals, the feature can automatically categorize task lists into organized sections in the Reminders app, if your device supports Apple Intelligence.

Features With Limited Language Support

Before we dive into those, remember that you may be able to use features unavailable in your language or country simply by making adjustments in Settings > General > Language and Region. I've long used this trick to reveal the News app, which isn't available where I live. To use Apple Intelligence in one of the currently supported languages -- English, French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, or Chinese (Simplified) -- be sure to set both your device and Siri language accordingly. Apple Intelligence will pick up support for more languages, including Vietnamese, over the course of the year.

Live Translation in Phone, Messages, and FaceTime

Real-time translation of your Messages chats on the iPhone, iPad, Mac, and Apple Watch is restricted to Chinese (Simplified), English (UK, U.S.), French (France), German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish (Spain). Live spoken translation in the Phone and FaceTime apps on the iPhone, iPad, and Mac is limited to one-on-one calls in English (UK, U.S.), French (France), German, Portuguese (Brazil), and Spanish (Spain).

Call Screening

iOS 26, iPadOS 26, watchOS 26, and macOS Tahoe 26 bring screening tools in the Phone, FaceTime, and Messages apps on the iPhone, iPad, Apple Watch, and Mac to help eliminate distractions. Call screening springs into action when you receive a call from an unknown number in the Phone app, asking the caller for their name and why they're calling.
You can see this information on the calling screen as it's gathered, helping you decide whether this is an important call you should answer or something you can safely ignore. However, call screening only works for calls in these languages and regions: Cantonese (China mainland, Hong Kong, Macao), English (US, UK, Australia, Canada, India, Ireland, New Zealand, Puerto Rico, Singapore, South Africa), French (Canada, France), German (Germany), Japanese (Japan), Korean (Korea), Mandarin Chinese (China mainland, Taiwan, Macao), Portuguese (Brazil), and Spanish (U.S., Mexico, Puerto Rico, Spain).

Hold Assist

Your iPhone (and your iPad and Mac) can hold the line on your behalf so you can do other tasks, a feature called Hold Assist. It kicks in automatically when you're placed on hold during a phone call, like a session with a live agent on a support call. iOS listens in the background for a live person to become available, then notifies you so you can switch to the live call. Hold Assist is restricted to calls in English (U.S., UK, Australia, Canada, India, Singapore), French (France), Spanish (US, Mexico, Spain), German (Germany), Portuguese (Brazil), Japanese (Japan), and Mandarin Chinese (mainland China).

Visited Places in Maps

Apple Maps on iOS 26 detects and remembers places you've visited. Locations you've been to, like restaurants and shops, are automatically saved in a new section titled "Visited Places." This feature is launching in beta in the U.S., UK, Australia, Canada, Malaysia, and Switzerland.

Lyrics Translation

The built-in Music app on iOS 26 brings lyrics translation to help you understand music in a foreign language. Lyric translation is available for "select songs" in the Apple Music catalog in the following language translation pairs: English to Chinese (Simplified), English to Japanese, Korean to Chinese (Simplified), Korean to English, Korean to Japanese, and Spanish to English.

Digital ID in Wallet

iOS 26 lets you create a digital ID in the Wallet app using your passport that you can use at TSA checkpoints and in apps, but it will initially be available only with U.S. passports.

New Genmoji and Image Playground Capabilities

Like the existing Genmoji and Image Playground features, mixing two emoji in Genmoji and accessing ChatGPT-powered drawing styles in Image Playground are limited to English, French, German, Italian, Portuguese (Brazil), Spanish, and Japanese.

Detecting Calendar Events on Screenshots

The new (and previous) Visual Intelligence features in iOS 26 are available on the iPhone 15 Pro and later models. However, adding an event detected on a screenshot to the built-in Calendar app only works for content in English.

Workout Buddy

Workout Buddy is a watchOS 26 feature, but we're listing it here as it requires an Apple Intelligence-supported iPhone and will be available "starting in English." As the name suggests, the feature encourages you to achieve your fitness goals with spoken motivation (wireless headphones required) and suggests music in the Workout app. Workout Buddy is available on the Apple Watch Series 6 or newer, the second-generation Apple Watch SE, and all Apple Watch Ultra models.

Apple will release iOS 26 and other updates in the fall, probably ahead of new iPhones in September like it usually does, with the Liquid Glass design the most prominent change.
[3]
Many iOS 26 features can't be used without Apple Intelligence -- what you need to know
iOS 26 is going to bring all kinds of fun and useful new abilities to compatible iPhones when it launches this fall, or earlier if you decide to download a beta version. However, be aware that even if your phone can run iOS 26, it may not be able to run all of its features. This is all because of Apple Intelligence's requirements. Several of the headline features of iOS 26 tap into Apple's AI systems to work, meaning you need an iPhone 15 Pro or iPhone 15 Pro Max, or any iPhone 16 model, to use them. After going through Apple's iOS 26 info, here's the list of all the features that will not work, either in part or at all, on the iPhone 15, iPhone 15 Plus, and all previous iPhone models.

The iPhone 16's signature AI feature can now tell users about what's on-screen, rather than just what's through the camera lens. Visual Intelligence was recently updated to work on the iPhone 15 Pro as well, but it requires a different command to activate since these phones don't have a Camera Control key.

You can still add backgrounds and make polls on any iPhone compatible with iOS 26. But if you want AI-generated backgrounds or some artificial intelligence help with poll options, that will require Apple Intelligence.

Audio and text translation for messages and calls, or FaceTime captions, is another Apple Intelligence-powered feature. Users on older iPhones will have to rely on third-party apps to communicate in different languages, or just learn enough Spanish to get by on their European vacation.

You still can't try out the iPhone's emoji and image-generating tools without Apple Intelligence. That includes the new changes that enable emoji blending, new ChatGPT-powered style options, and more.

Perhaps unsurprisingly, using AI features like text summaries, image creation, or other Apple Intelligence actions via the Shortcuts app is locked to the iPhone 15 Pro or an iPhone 16. The standard Shortcuts app is available on all iPhones, though, so you can still automate things the old-fashioned way.

Don't forget that if you're a keen user of the iPhone's built-in task manager app, iOS 26's auto-suggested tasks or sub-tasks, and the app's ability to categorize them itself, rely on Apple Intelligence.

Perhaps the most surprising member of this list: the built-in order tracker for Apple Wallet needs Apple Intelligence. This is because the system works by taking information from your emails, rather than from a formal tracking link.

It's a shame that so many of the upgrades Apple highlighted on stage for iOS 26 are locked away from users with older iPhones. But on the bright side, even if you can't use these particular features, you can still try out iOS 26 changes like the new Liquid Glass design, Spatial Scene photos (if you have an iPhone 12 or later), upgraded CarPlay, call screening, or the resurrected Games app.

Like we mentioned earlier, iOS 26 is in beta at the time of writing, and we can tell you how to download the iOS 26 beta here. A public beta should be available in a few weeks' time, while the stable public version will arrive in fall.
[4]
iOS 26 is bringing some big Apple Intelligence upgrades -- but will they be enough to justify an iPhone upgrade?
Getting an updated operating system can be like getting a brand new phone sometimes, especially if that update brings vital new features to your old device. And certainly, the recently previewed iOS 26 promises big changes, starting with an entirely new look for your iPhone. But as we've previously noted, not every advertised iOS 26 feature is going to reach every iPhone capable of running the software update. Several major additions depend on Apple Intelligence, meaning you'll need an iPhone 15 Pro or later to reap the full benefits of iOS 26.

And that means anyone with an iPhone that's been out for two years or more will have to make a decision come the fall: Is it worth upgrading to a new iPhone capable of supporting all that iOS 26 has to offer?

It's the same question iPhone owners had to ask themselves a year ago when Apple Intelligence first debuted as part of iOS 18. Certainly, the answer was a bit more clear-cut back then. While Apple Intelligence introduced some promising new tools, there was no can't-miss feature, making it easier to hold on to your current iPhone if you weren't fully ready to upgrade.

Will iOS 26 yield a different answer? It's hard to say at this point, as the software is only available as a developer beta. The iOS 26 public beta follows in July, and by then, we can get a better sense of what the new features bring to the table, including the ones that require Apple Intelligence support. Still, it doesn't hurt to start thinking about these things now, especially if you're on the fence about upgrading to a new model once the iPhone 17 arrives in a few months' time. With pricing up in the air, you're going to want time to prep -- and potentially save -- to cover the cost of an upgrade, should those iOS 26 changes requiring Apple Intelligence prove to be irresistible.

I've been spending a little bit of time with the iOS 26 developer preview, though I've yet to install it on a test device that supports Apple Intelligence. Nevertheless, I have gotten a sense of which Apple Intelligence-powered additions to iOS 26 figure to be the most noteworthy -- and the ones you're going to want to pay attention to as beta testing picks up steam this summer.

As a quick reminder, here's a rundown of the iOS 26 features that require Apple Intelligence. To use these capabilities, you're going to want an iPhone 15 Pro, iPhone 15 Pro Max, any iPhone 16 model including the iPhone 16e, or one of the new models Apple introduces later this year. That's just a list of features where Apple Intelligence support is specified. There are other iOS 26 capabilities that may only work on the iPhone 15 or later. For example, iOS 26 Maps is adding a capability where on-device intelligence recognizes the routes you take regularly so that it can alert you to conditions like traffic -- it's unclear if that requires Apple Intelligence support, but it sure sounds like it to me.

A quick glance at this list tells me that not all of these features are going to swing the needle toward an iPhone upgrade. If the initial release of Genmoji didn't move you to get a new phone, I'm going to guess expanding the kinds of emoji you can generate with text prompts isn't going to up the ante. Other additions, like incorporating Apple Intelligence into Reminders, sound a bit more promising, but the basic functionality of that app will continue to work just fine on your older iPhone.
To my eye, there are two potential Apple Intelligence additions in iOS 26 that could spark serious upgrade talk -- Visual Intelligence and Live Translation.

Visual Intelligence is already the best of the Apple Intelligence features in my book. It essentially turns your camera into a search engine of its own, as you can snap photos and then have Apple Intelligence scour the web for more information online. I particularly like that I can point my iPhone's camera at a restaurant's sign and have Visual Intelligence pull up information like the menu. I can also capture info from a flier to auto-populate a calendar entry. The iOS 26 update brings those capabilities to your iPhone's screen. Now you can take a screenshot and perform those same actions you would from capturing an image with your camera, including filling in calendar dates and times for an event mentioned in an email someone sent you. To put it another way, Visual Intelligence is already a very useful feature, and iOS 26 is expanding those uses.

Live Translation is a bit more up in the air. Certainly, the concept is promising (and familiar to anyone who's spent time with a recent Pixel phone). When you place a call to someone who speaks a different language than you do, Apple Intelligence will provide on-the-fly translations so that you can have a relatively seamless conversation. FaceTime conversations should work the same way, and if you text with someone who's using a different language in Messages, you'll get automatic translations of the messages you're sending and receiving. The value of Live Translation will depend on how much you interact with people who speak a different language than you. Travelers and people who do a lot of cross-border business figure to reap the largest benefit. But whether or not Live Translation is worth the upgrade will depend on just how well the feature performs -- and that's something we'll get a better idea about as more people use the beta.

There's one other possibility would-be upgraders need to keep in mind -- that there may be an Apple Intelligence addition in iOS 26 that Apple won't disclose until the iPhone 17 launch in the fall. The aforementioned Visual Intelligence started out life as an iPhone 16 exclusive before a subsequent iOS 18 update added iPhone 15 Pro compatibility.

The nice thing about a summer-long beta process is it gives you a chance to see how new features perform and whether or not they get fine-tuned ahead of the arrival of new hardware in the fall. That's especially helpful with something like Apple Intelligence, which continues to be a work in progress. A lot of factors go into deciding whether to upgrade to a new phone or not -- the status of your current model, whether the hardware itself makes compelling improvements, and how much that new handset is going to run you. But Apple Intelligence features also figure to play a role in that decision, and the iOS 26 beta will give us some idea of how Apple's AI efforts are progressing.
[5]
Your Older iPhone Might Not Run These New iOS 26 Features
Apple announced a ton of new features and changes across its products during the company's big WWDC 2025 event. iOS 26, in particular, is chock full of updates, with a long list of iPhones that support the newest update. However, some of these new features will only run on newer iPhones. Even if your iPhone can update to iOS 26, you might be missing out on the full experience.

Most of the features on this list are powered by Apple Intelligence, Apple's suite of generative AI features. As it happens, Apple Intelligence requires an iPhone 15 Pro or newer to work. That includes, of course, the iPhone 15 Pro Max, but also the iPhone 16, 16 Plus, 16 Pro, and 16 Pro Max. If you don't have one of those six phones, you can't use any Apple Intelligence features -- both the existing ones released throughout the iOS 18 era, as well as any of the new features in iOS 26. It's a little confusing, since there are a lot of iPhones that can run iOS 26 that aren't these iPhones -- basically iPhone 11 and newer -- but that's the way Apple wants it, unfortunately. Here's what you're missing out on if you don't have a newer iPhone:

Apple's Live Translation features for iOS 26, iPadOS 26, and macOS Tahoe seem great. Live Translation uses AI to translate both text and audio in real time across various apps and services. So, you might see translations appear next to messages sent in a language you can't read, or hear translations when on a call with someone who speaks a language you don't understand.

Visual Intelligence itself isn't new this year. The Apple Intelligence feature debuted with iOS 18.2, and lets Apple's AI analyze anything in your camera feed. The feature started as an iPhone 16 exclusive, but Apple did bring it to 15 Pro and 15 Pro Max models as well. With iOS 26, Visual Intelligence is now able to analyze the contents of your iPhone's screen as well, via screenshots. In addition to pointing your camera at something for Visual Intelligence to analyze, you can take a screenshot, and the AI will be able to process its contents.

It's always a bummer when you spend the day taking photos on your phone, only to notice when you're home that those photos came out smudgy -- all because your camera was dirty. iOS 26 has a (presumably AI-powered) feature to fix that. Apple didn't announce this one, but beta testers discovered that when the camera is dirty, the new Camera app now warns you to clean the lens before taking a photo.

Genmoji and Image Playground are getting updates with iOS 26 -- the AI features rolled out last year, allowing you to generate emojis and images with text-based prompts. This year, Apple is adding the ability to mix together different Genmoji and emoji to create brand new icons, which introduces an Emoji Kitchen-like feature to the iPhone. In addition, Image Playground now lets you add ChatGPT styles, like anime, oil painting, and watercolor, to your generations.

Shortcuts has been available on all iPhones since iOS 12, but with iOS 26, Apple is integrating AI features into the automation tool. These shortcuts will let you summarize text, generate images, or utilize Apple Intelligence models to power your automations. If your iPhone can't access AI tools at all, you probably won't feel the absence of these upgrades much anyway.

Messages is getting some useful new features this year. Two key features are polls and backgrounds: You'll be able to make polls for your friends and family to answer, and choose from a set selection of backgrounds for your chats.
Other chat apps have had these features for years, of course, but it's cool they're finally in the Messages app -- and both are compatible with all iPhones running iOS 26. That said, each feature has AI integrations that will only work on the iPhone 15 Pro and newer.

First, polls: In addition to choosing your own poll questions, newer iPhone owners will also see AI-generated poll suggestions. (I'm not entirely sure why you'd want this, but it's there.) In my opinion, Backgrounds' AI feature is more useful, since you can use Apple Intelligence to generate a background for your chats. I'd prefer to choose my own image to set as a background, but this at least opens up the background possibilities.

If you have one of these iPhones compatible with Apple Intelligence, you'll notice iOS 26 suggests new reminders based on the contents of your messages and emails. If you're chatting with your roommate about needing to buy more coffee, for example, you might see a suggested reminder to add it to your grocery list. Plus, iOS 26 will categorize task lists for you using Apple Intelligence. There's a new "Auto-Categorize" setting, complete with an Apple Intelligence logo, if you want to outsource that task to Apple's AI.

On iPhone 15 Pro and newer, iOS 26 will use AI to automatically pull data from your emails to show you up-to-date order tracking in Wallet. The app already has non-AI support for order tracking, but it's not widely used by stores and merchants.

iOS 26 has exactly one non-AI feature that isn't supported on all compatible devices: The Lock Screen has a new 3D effect that automatically adjusts the size and position of the time depending on the photo you choose, prioritizing the subject of the photo. It's neat, and only works on iPhone 12 or newer. If you have an iPhone 11, you're out of luck.
[6]
Your iPhone Might Not Get All of iOS 26's Features: Here's How to Know
While iOS 26 brings Apple's biggest UI design change in years, it also packs a bunch of new features (as you'd expect). However, not all iOS 26 features are coming to every iPhone, even if it supports the new OS.

Which iPhones Support iOS 26?

As with every iOS release, Apple has dropped support for a few iPhone models in iOS 26. Not all iOS 18-compatible iPhones will support iOS 26. Here's the official list of iPhones getting the iOS 26 update this fall:

iPhone 16 series: iPhone 16, plus iPhone 16e/Plus/Pro/Pro Max
iPhone 15 series: iPhone 15, plus iPhone 15 Plus/Pro/Pro Max
iPhone 14 series: iPhone 14, plus iPhone 14 Plus/Pro/Pro Max
iPhone 13 series: iPhone 13, plus iPhone 13 mini/Pro/Pro Max
iPhone 12 series: iPhone 12, plus iPhone 12 mini/Pro/Pro Max
iPhone 11 series: iPhone 11, plus iPhone 11 Pro/Pro Max
iPhone SE (2nd generation and later)

If you own any of these iPhone models, you'll be able to install iOS 26 later this year (around September) when Apple rolls it out to the public. Notably, the iPhone X, XS, XR, and XS Max are some of the older models that won't receive the update.

Which Features Are Exclusive to Newer iPhones?

Just because you get the new OS doesn't mean your phone can support everything that goes with it. While Apple is bringing the new "Liquid Glass" design language to all iPhones, some iOS 26 features (mainly Apple Intelligence-related) will remain exclusive to newer iPhone models. These features will only be available on the iPhone 16 series, iPhone 15 Pro, and iPhone 15 Pro Max. Of course, we can expect the upcoming iPhone 17 line (or whatever Apple names it) to support them too.

Visual Intelligence With Search

Apple has finally added a Circle to Search-like feature in iOS 26. Now, when you take a screenshot, Visual Intelligence can analyze the screen and offer suggestions. For example, if it detects a date, it may prompt you to add it to your Calendar. You can also search for products from screenshots via services like Etsy and Google, or ask ChatGPT questions about the image directly from the Visual Intelligence screen. All the reasons to love Circle to Search will come into play here, too.

Live Translation

Similar to Samsung Galaxy smartphones, your iPhone will now offer real-time text and voice translation across apps like FaceTime, Messages, and even phone calls. However, this will only be available on newer iPhone models.

AI-Integrated Shortcuts Actions

The Shortcuts app is getting a major AI upgrade in iOS 26. You can now create shortcuts that use Apple's AI models to perform AI-related tasks, like summarizing text, generating images, and more. These features will not be available on iPhones without Apple Intelligence support.

Advanced Messages Features

The Messages app is gaining several updates in iOS 26, including typing indicators in group chats and automatic filtering of texts from unknown senders. In addition, Messages is gaining support for custom chat backgrounds. You can set your own photos or choose from pre-made options, and Apple has also added the ability to create backgrounds using Image Playground, which leverages Apple Intelligence. Messages is also gaining the ability to create polls, making it easier to reach decisions in group chats. Using AI, the app can even suggest when to create a poll. However, both poll suggestions and AI-generated backgrounds will be limited to newer iPhone models.
Enhanced Reminders

No, the Reminders app is not becoming exclusive to newer iPhones with iOS 26. However, Apple is adding a new feature that suggests reminders based on the contents of your messages and emails. Since this feature relies on Apple Intelligence, it will only be available on the newer iPhone models.

Other Features

Several other iOS 26 features will require an iPhone 15 Pro, iPhone 15 Pro Max, or newer. These include automatic order tracking in the Apple Wallet app and the new Genmoji update, which lets you create custom emojis just by typing a description.

Why Every iOS 26 Feature Isn't Available on All iPhone Models

The reason these features aren't available on older iPhones comes down to hardware limitations. Many of the new iOS 26 and Apple Intelligence features rely on advanced neural processing, large language models, and machine learning capabilities that are only supported by the latest A-series chips (A17 Pro, A18, and A18 Pro). These chipsets include a powerful Neural Engine capable of handling these tasks efficiently, without slowing down overall performance. Since older devices lack the computational power and necessary hardware to run these AI-powered features, Apple has limited them to newer models. Thankfully, Apple will still bring the new design language to all supported iPhones, so even older models can enjoy the fresh UI -- though their iOS 26 experience will be more limited. If you don't care about Apple Intelligence, this might even be a plus.
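Because so many iOS 26 features hinge on whether a given device can run Apple Intelligence, third-party apps that want to use the on-device model have to check for it at runtime rather than assume it is present. Below is a minimal, hedged sketch of such a check; the type and property names (SystemLanguageModel, availability) follow Apple's WWDC25 Foundation Models framework materials and are shown here as an assumption, not a confirmed final API.

```swift
import FoundationModels

// Hedged sketch: names follow Apple's WWDC25 Foundation Models materials
// and may differ in the shipping SDK.
struct AIFeatureGate {
    /// Returns true only when the on-device model can actually run,
    /// i.e. the device has Apple Intelligence-class hardware (A17 Pro,
    /// A18, or Apple silicon), the user has enabled Apple Intelligence,
    /// and the model assets are downloaded and ready.
    static var isAppleIntelligenceReady: Bool {
        if case .available = SystemLanguageModel.default.availability {
            return true
        }
        return false
    }
}
```

An app would consult a gate like this before showing AI-powered options, which mirrors how Apple itself hides features such as AI chat backgrounds or poll suggestions on hardware that cannot support them.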
Apple's iOS 26 update introduces several AI-powered features, but many are exclusive to newer iPhone models with Apple Intelligence capabilities.

Apple's upcoming iOS 26 update is set to introduce a range of new features, many of which are powered by artificial intelligence. However, not all iPhone users will be able to access these new capabilities, as several key features require Apple Intelligence support, which is only available on newer iPhone models [1][2]. The most advanced features of iOS 26 will be exclusive to the iPhone 15 Pro, iPhone 15 Pro Max, and all iPhone 16 models. This limitation is due to the hardware requirements of Apple Intelligence, the company's suite of generative AI features [3]. Users with older iPhones, even those compatible with iOS 26, will miss out on these AI-powered enhancements.

Several headline features of iOS 26 depend on Apple Intelligence [1][4]:

Live Translations: Real-time translation of text and audio in messages, calls, and FaceTime [3][4].
Genmoji and Image Playground: Enhanced AI-generated emojis and images with new mixing capabilities and style options [3][5].
AI-Powered Shortcuts: New capabilities for summarizing text, generating images, and utilizing Apple Intelligence models in the Shortcuts app [3][5].
Intelligent Reminders: AI-suggested reminders based on message and email content, plus auto-categorization of tasks [5].

While many features require Apple Intelligence, iOS 26 does include updates accessible to a wider range of devices [2][4]:

3D Lock Screen Effect: Available on iPhone 12 or newer models [5].
CarPlay Upgrades: Improved car integration features [4].
Call Screening: Enhanced call management capabilities [4].

The exclusivity of certain features to newer iPhone models presents a dilemma for users with older devices. They must decide whether the AI-powered capabilities justify upgrading to a newer iPhone model [4]. This decision may become more pressing as Apple continues to develop and integrate AI features into its ecosystem.

Apple's approach to AI integration suggests a strategic focus on leveraging advanced hardware capabilities. While this may drive innovation, it also creates a clearer distinction between newer and older iPhone models in terms of feature availability [1][2][4]. As iOS 26 rolls out, first in beta and then to the general public in the fall, users will have the opportunity to evaluate these new features and decide if they warrant an upgrade to a newer iPhone model capable of supporting Apple Intelligence [4][5].

Summarized by Navi