5 Sources
[1]
The best new Apple Intelligence features from today's updates - 9to5Mac
While this is not the day Apple will release a revamped, LLM-powered Siri, today's updates bring multiple welcome additions to the Apple Intelligence feature set. Here are some highlights.

During WWDC25, Apple made sure to (almost passive-aggressively) highlight all the Apple Intelligence features it had already released, as it tried to counter the fact that it is behind on AI. Then it proceeded to announce new Apple Intelligence features, making sure to showcase only the ones it was confident it could deliver in the first wave of betas, or just after. Here are some of the coolest Apple Intelligence features you can use as soon as you update your Apple Intelligence-compatible devices starting today.

This is by far the most significant change in today's updates. Apps can now plug directly into Apple's on-device model, which means they can offer AI features without relying on external APIs or even the web. For users, this means more private AI-based features, since prompts never leave the device. For developers, it means a faster and more streamlined experience, since they can leverage the same on-device power Apple uses for its own AI offerings. For now, developers won't have access to Apple's cloud-based model; this is on-device only. Still, the fact that any developer can adopt AI features at no extra cost should make for extremely interesting features and use cases. (A minimal code sketch of what this looks like for developers appears at the end of this article.)

Shortcuts and automation play a big part in today's updates, particularly on macOS Tahoe 26. You can now bake Apple Intelligence right into your shortcuts, which in practice means including Writing Tools, Image Playground, and even Apple's own models or ChatGPT as steps in your workflow to get things done faster. If you've never dabbled in automation before, this may feel overwhelming. But I highly recommend you poke around and try to find ways to bring AI-powered automation into your workflow. Depending on your line of work, this may truly change how you get things done.

Although the headline feature is Live Translation with AirPods, which automatically translates what a person may be telling you in a different language, Live Translation also permeates Messages, Phone, and FaceTime. In Messages, it automatically translates incoming and outgoing text messages, while during a FaceTime call, it displays live translated captions on screen. On phone calls, you'll hear spoken translations, much like the Live Translation with AirPods feature.

Here's Apple's fine print on which languages are compatible with which features:

- Live Translation with AirPods works on AirPods 4 with Active Noise Cancellation, or AirPods Pro 2 and later with the latest firmware, when paired with an Apple Intelligence-enabled iPhone, and supports English (U.S., UK), French (France), German, Portuguese (Brazil), and Spanish (Spain).
- Live Translation in Messages is available in Chinese (Simplified), English (UK, U.S.), French (France), German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish (Spain) when Apple Intelligence is enabled on a compatible iPhone, iPad, or Mac, as well as on Apple Watch Series 9 and later and Apple Watch Ultra 2 and later when paired with an Apple Intelligence-enabled iPhone.
- Live Translation in Phone and FaceTime is available for one-on-one calls in English (UK, U.S.), French (France), German, Portuguese (Brazil), and Spanish (Spain) when Apple Intelligence is enabled on a compatible iPhone, iPad, or Mac.
And speaking of phone calls, Apple Intelligence now provides voicemail summaries, which appear inline with your missed calls.

Apple has extended its visual search feature beyond the camera; it now also supports what is on the screen. This means that when you take a screenshot, iOS will recognize the content and let you highlight a specific portion, search for similar images, or even ask ChatGPT about what appears in the screenshot. Visual Intelligence also extracts date, time, and location from screenshots, and proactively suggests actions such as adding events to the calendar, provided the image is in English.

You can now combine two different emoji (and, optionally, a text description) to create a new Genmoji. You can also add expressions, or modify personal attributes such as hairstyle and facial hair, for Genmoji based on people from your Photos library.

In Image Playground, the biggest news is that you can now use OpenAI's image generator, picking from preset styles such as Watercolor and Oil, or sending a text prompt of what you'd like to have created. You can also use Image Playground to create Messages backgrounds, as well as Genmoji, without having to leave the app.

It is very obvious that Apple is struggling to get a handle on Siri when it comes to AI. Between multiple high-profile departures and Tim Cook's pep talk following Apple's financial results, the company is clearly in trouble. That said, some of today's additions to its Apple Intelligence offerings are truly well thought out and implemented, and may really help users get things done faster and better.

I've been using Apple Intelligence with my shortcuts, and it has really improved parts of my workflow, something that until very recently felt out of reach if I wanted to keep things native. Apple still has a long, long way to go, but the set of AI-powered features it is releasing today is a good one, and it may be worth checking out, even if you've dismissed them in the past.
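Since third-party access to the on-device model is the biggest developer-facing change here, below is a minimal sketch of what calling it can look like with Apple's Foundation Models framework. The summarization instructions and the fallback behavior are illustrative assumptions for this sketch, not Apple's sample code.

```swift
import FoundationModels

// A minimal sketch: ask Apple's on-device model to summarize text.
// Prompts never leave the device; no API key or network call is involved.
func summarize(_ text: String) async throws -> String {
    // Bail out gracefully if Apple Intelligence is off or the model
    // isn't ready on this device (illustrative fallback).
    guard case .available = SystemLanguageModel.default.availability else {
        return text
    }

    // The instructions string here is an assumption for this sketch.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in two sentences."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```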
[2]
Live Translation in Messages is the best AI feature in iOS 26, and Apple isn't even talking about it
When Apple announced the new Apple Intelligence features in iOS 26 earlier this year, there was one in particular that really piqued my interest. You see, I'm Scottish with Italian grandparents, an Italian wife, and I moved to France when I was 10 years old. Combined, this makes for lots of different languages on my iPhone, and a lot of dialect shuffling in my brain to make conversing easy. I speak fluent French, understand most Italian, and can make myself understood through a mix of my upbringing, speaking to my partner regularly in the language, and adding Os and As to French words.

So while I don't necessarily need the AI feature I'm about to talk about, using it for the last week or so has significantly improved my communication experience on iPhone, and I think this might be one of the best iOS 26 features, full stop. If you haven't guessed yet, or aren't up to speed with the new Apple Intelligence features available in iOS 26 (there's a lot; check out this article), I'm of course talking about Live Translation. Now, before you click off from this article because a translation tool sounds boring, hear me out: I swear this is the kind of technology that will transform the way we communicate. Just keep reading.

I've been trying Live Translation for a while now, and I think it will transform how smartphone users communicate. It's a pretty seamless tool. In iOS 26, you can open any conversation in Messages and select a primary language to translate from. The language will be downloaded, and then any incoming messages will seamlessly be translated from the language of the message to your preferred one. I tested this with my Italian in-laws as well as my wife, and I was thoroughly impressed with how seamless the translation was. There's no waiting; someone just messages you, and it arrives in two languages, allowing you to understand exactly what is being written and see the original message too.

Live Translation doesn't only work in Messages; it also removes language barriers from phone calls and FaceTime. And if you have a pair of AirPods Pro 3 (or AirPods Pro 2, or AirPods 4 with ANC), the feature is even better. It's the voice communication side of Live Translation that I'm most excited about, as it now allows me to take part in conversations with my Italian family without any struggle. Live Translation translates in real time as the person on the other side of the phone speaks, and can do the same for them, allowing one of you to speak in English and the other in Italian without any hiccups.

Sound cool? This is the kind of built-in feature I've been waiting for for years, and I'm so excited to finally have my hands on it. iOS 26 is now available, and Apple Intelligence-compatible iPhones get access to everything Live Translation has to offer. Live Translation, along with the other new Apple Intelligence features such as Visual Intelligence for screenshots and ChatGPT styles in Image Playground, is among the best AI we've seen on iPhone to date, and you don't need an iPhone 17 to use it. In my time testing Live Translation, I've been thoroughly impressed with its accuracy and just how quickly it works. I've written conversations in French and English, and received replies in Italian, all absolutely seamlessly.
Live Translation might not have as much of an impact on your life if you rarely converse in other languages, but even then, the capability to eliminate language barriers across Apple devices is the first step in making the world more connected, and that's an incredibly exciting and powerful thought.
[3]
iOS 26 is here - 5 of the best Apple Intelligence features to try right now
iOS 26 is finally here, and your iPhone software just got a gorgeous redesign. While the headline feature will be Liquid Glass, installing iOS 26 will also give you access to some new Apple Intelligence features that might improve your life in subtle, yet meaningful, ways. I've whittled down the list of new AI-powered features available in iOS 26 (and iPadOS 26) and selected my five favorites. From in-app translation to new Apple Intelligence abilities in Shortcuts, there are plenty of reasons to install the new update right away, rather than wait for the iPhone 17 to launch later this week.

After using iOS 26 for a few months, I've found Live Translation to be my personal favorite new Apple Intelligence feature. Built into Messages, FaceTime, and the Phone app, Live Translation lets you automatically translate messages and add translated live captions to FaceTime, and on a phone call, the translation is spoken aloud throughout the conversation, completely removing language barriers using AI. I've written about how Live Translation has drastically improved my ability to communicate with my Italian in-laws, and trust me, if you regularly speak multiple languages, you'll also find this new Apple Intelligence feature to be a game-changer.

Apple launched Genmoji and Image Playground as part of the first wave of Apple Intelligence features, and now the company has improved its generative AI image tools. Users can now turn text descriptions into emoji, as well as mix emoji and combine them with descriptions to create something new. You can also change expressions and adjust personal attributes of Genmoji made from photos of friends and family members. Image Playground has also gained ChatGPT support, giving users access to brand-new styles such as oil painting and vector art. Apple says, "users are always in control, and nothing is shared with ChatGPT without their permission." I've enjoyed my time using ChatGPT in Image Playground. While it's still not as good as some of the other best AI image generators out there, it improves the Image Playground experience and is a step in the right direction for Apple's creative AI tool.

Visual Intelligence might already have been the best Apple Intelligence feature, but now it's even better. In iOS 26, Visual Intelligence can scan your screen, allowing you to search and take action on anything you're viewing across apps. You can ask ChatGPT questions about content on your screen via Apple Intelligence, and this new capability is accessed by taking a screenshot. When you press the screenshot buttons in iOS 26, you are now asked whether to save the screenshot, share it, or explore more with Visual Intelligence. Visual Intelligence can even pull information from a screenshot and add it to your calendar; it's super useful. If you're like me and take screenshots regularly to remember information, Visual Intelligence in iOS 26 could be the Apple Intelligence feature you've been waiting for.

While this entry isn't a feature per se, this iOS 26 addition is a big one for the future of Apple Intelligence: developers now have access to Apple's Foundation Models. What does that mean exactly? Well, app developers can now "build on Apple Intelligence to bring users new experiences that are intelligent, available when they're offline, and that protect their privacy, using AI inference that is free of cost." Apple showcased an example at WWDC 2025 of an education app using the Apple Intelligence model to generate a quiz from your notes, without any API costs.
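To make that quiz example concrete, here's a hedged sketch of what guided generation with the Foundation Models framework can look like in Swift. The QuizQuestion type, its fields, and the prompt are hypothetical illustrations rather than Apple's actual sample code, and the exact API details may differ.

```swift
import FoundationModels

// Sketch of guided generation: the on-device model fills in a typed
// Swift struct instead of returning free-form text. `QuizQuestion`
// and the prompt below are hypothetical, not Apple's WWDC sample.
@Generable
struct QuizQuestion {
    @Guide(description: "A short multiple-choice question about the notes")
    var question: String

    @Guide(description: "Four possible answers, exactly one correct")
    var options: [String]

    @Guide(description: "Index (0-3) of the correct answer in options")
    var correctIndex: Int
}

// Generate quiz questions from a user's notes, entirely on device,
// with no per-request API costs.
func makeQuiz(from notes: String) async throws -> [QuizQuestion] {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write five quiz questions based on these notes:\n\(notes)",
        generating: [QuizQuestion].self
    )
    return response.content
}
```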
This framework could completely change the way we interact with our favorite third-party apps, which can now tap into Apple's AI models and make the user experience even more intuitive. At the moment, I've not seen many apps take advantage of this new capability, but give it a few weeks, and we'll be sure to list the best examples of Apple Intelligence in third-party applications.

Last but not least, Apple Intelligence is now available in the Shortcuts app. This is a major upgrade to one of the best apps on Apple devices, allowing users to "tap into intelligent actions, a whole new set of shortcuts enabled by Apple Intelligence." I've tried Apple Intelligence-powered shortcuts, and just like the Shortcuts app itself, the true power here will come down to user creations and how people tap into this new ability. As someone who uses Shortcuts daily, I'm incredibly excited to see how the fantastic community of people who create powerful shortcuts and share them online will tap into Apple Intelligence's capabilities. This is not an AI improvement everyone is going to use, but if you choose to delve into the world of the Shortcuts app and learn how to get the most from it, this new iOS 26 addition might be the best of the lot.

iOS 26 is now available as a free update for iPhone, and while only Apple Intelligence-compatible devices (iPhone 15 Pro and newer) have access to the new AI features, there's plenty here to be excited about. Whether you want a fresh coat of paint and a lock screen redesigned with Liquid Glass, or the idea of Live Translation makes you excited to use your smartphone on vacation, iOS 26 is a huge step forward for iPhone and well worth the upgrade.
[4]
New Apple Intelligence Features Focus on Daily Use Over AI Gimmicks - Phandroid
Apple's taking a measured approach to rolling out new Apple Intelligence features, and the September 15 update shows exactly what that looks like. Instead of dumping a dozen half-baked AI tools on users all at once, Apple's selectively adding Live Translation, Visual Intelligence, and Workout Buddy to the mix. These aren't flashy party tricks; they're designed to solve real problems you might run into during your day.

The standout feature has to be Live Translation. Apple's not just throwing translation into one app and calling it a day. This thing works across Messages, FaceTime calls, regular phone calls, and even with your AirPods for in-person conversations. In Messages, it translates as you type. During FaceTime calls, you get live captions while still hearing the original voice. The AirPods integration lets you trigger translation by pressing both stems, asking Siri, or using the Action button.

Visual Intelligence lets you search and interact with whatever's on your iPhone screen. See something you want to buy? Press the screenshot buttons, highlight the item, and search across Google, eBay, or Etsy. You can translate text directly from your screen, summarize articles, or add events from flyers to your calendar with a single tap.

Workout Buddy analyzes your fitness data and delivers personalized spoken motivation during workouts. It looks at your heart rate, pace, and Activity rings to generate real-time encouragement using voices from Fitness+ trainers.

Apple's also opened up its on-device AI models to developers. Apps like Streaks now suggest tasks automatically, while CARROT Weather provides unlimited AI-powered weather conversations. The new Apple Intelligence features are available now with iOS 26 across all Apple devices, with support for nine languages and eight more coming soon.
[5]
These New AI Features Are Coming to Your Updated iPhone, iPad and Mac
With Visual Intelligence, users can search their iPhone screen

Apple released its major software update for compatible iPhone, iPad, Apple Watch, Mac, and Vision Pro devices on Monday. The new updates come with multiple new features, customisation options, and the new Liquid Glass design language. Alongside continuing to push on artificial intelligence (AI), the Cupertino-based tech giant has also added several new Apple Intelligence features and upgraded a few existing ones. Among them is the new Live Translation feature, which allows two-way translation of conversations in supported languages.

All the New Apple Intelligence Features Available to Users

On Monday, the tech giant released the iOS 26, iPadOS 26, watchOS 26, macOS Tahoe, and visionOS 26 updates for all eligible devices, bringing new features and upgrades. The Apple Intelligence suite received special attention from the company, even though it was not mentioned much during the "Awe Dropping" event.

Live Translation, as mentioned above, is now available across the Messages, FaceTime, and Phone apps. The feature instantly translates text and voice conversations in supported languages. It can also be accessed via AirPods Pro 3 to translate in-person conversations, as long as the earbuds are paired with an Apple Intelligence-enabled iPhone. While using AirPods, users can activate Live Translation by pressing both earbuds' stems together, saying "Siri, start Live Translation," or pressing the Action Button on the paired iPhone. Active noise cancellation (ANC) lowers the volume of the speaker so that the user can easily hear the translation. Apple says the AI feature will support Chinese (Mandarin, Simplified and Traditional), Italian, Japanese, and Korean by the end of the year.

Visual Intelligence is also getting an upgrade. Users can now use the feature to search, take action, and answer questions about the content on their iPhone screen. Apple has integrated with several partners, such as Google, eBay, Poshmark, and Etsy, to let users quickly look up information online. Additionally, Visual Intelligence supports ChatGPT, enabling it to answer user queries.

A quirky new feature is also coming to Genmoji. Users can now mix emojis and combine them to create a new Genmoji. While making these custom emojis, users can also add a text description to control the output. Apple is now also letting individuals change expressions and modify attributes such as hairstyle when generating Genmoji or images via Image Playground. Additionally, Image Playground now supports ChatGPT, and it can be used to explore new art styles such as watercolour and oil painting.

Workout Buddy is another new Apple Intelligence experience, which analyses users' workout data and fitness history to offer personalised, spoken motivation throughout their session. It is available on the Apple Watch with paired Bluetooth headphones and an Apple Intelligence-enabled iPhone nearby. The feature is available in English for multiple workout types.

Finally, the Shortcuts app is being integrated with Apple Intelligence, allowing users to create actions that use the on-device AI features. Users can summarise text with Writing Tools or create images with Image Playground. Alternatively, individuals can opt for Private Cloud Compute (PCC)-based AI models to access more advanced shortcuts.
"For example, users can create powerful Shortcuts like comparing an audio transcription to typed notes, summarising documents by their contents, extracting information from a PDF and adding key details to a spreadsheet, and more," the company said in a post.
Apple's latest software updates introduce Apple Intelligence features that prioritize practical AI applications like live translation, visual intelligence, and fitness assistance. These innovations aim to seamlessly integrate AI into daily life while maintaining a strong focus on user privacy.

Apple's latest software updates (iOS 26, iPadOS 26, watchOS 26, macOS Tahoe, visionOS 26) introduce Apple Intelligence, a suite of AI features focused on practical, privacy-centric enhancements for daily use [1][2]. This strategy aims to seamlessly integrate advanced AI, boosting user productivity and interaction across the ecosystem [3].

Live Translation breaks language barriers in real time across Messages, FaceTime, and the Phone app. It also facilitates in-person conversations via AirPods Pro 3 (triggered by the stems, voice, or the Action Button) for smoother global communication [2][5].

Visual Intelligence receives a major upgrade, enabling on-screen content search, actions, and questions on iPhone. Integrating with partners like Google, eBay, and Etsy, plus ChatGPT support, it offers deep content understanding and query answering [1][3].

Creative AI tools like Genmoji and Image Playground are enhanced. Users can now combine emoji with text descriptions for unique Genmoji, and Image Playground, with ChatGPT integration, introduces new art styles (e.g., watercolor, oil painting) for versatile digital creation [1][4].

Workout Buddy is a new AI-powered personal trainer on Apple Watch that provides personalized, spoken motivation during workouts based on fitness data [5]. It requires Bluetooth headphones and an Apple Intelligence-enabled iPhone, promoting consistency and improved performance [4].

Apple is opening its on-device AI models to developers without API costs, fostering innovation while prioritizing user privacy [1][4]. Most AI processing occurs directly on device, minimizing cloud reliance and keeping sensitive data secure, in line with Apple's core principles [3].

The Shortcuts app now deeply integrates with Apple Intelligence, empowering users to create powerful, AI-driven automations for customized, efficient workflows [5]. These features, available with iOS 26, initially support nine languages, with eight more planned, underscoring Apple's commitment to practical AI utility for a global audience [4].

Summarized by Navi