4 Sources
[1]
How developers are using Apple's local AI models with iOS 26 | TechCrunch
Earlier this year, Apple introduced its Foundation Models framework at WWDC 2025, which allows developers to use the company's local AI models to power features in their applications. The company touted that with this framework, developers gain access to AI models without worrying about inference costs. Plus, these local models have capabilities such as guided generation and tool calling built in. As iOS 26 rolls out to all users, developers have been updating their apps to include features powered by Apple's local AI models.

Apple's models are small compared with leading models from OpenAI, Anthropic, Google, or Meta. That is why local-only features largely improve quality of life in these apps rather than introducing major changes to an app's workflow. Below are some of the first apps to tap into Apple's AI framework.

The Lil Artist app offers various interactive experiences to help kids learn different skills like creativity, math, and music. Developers Arima Jain and Aman Jain shipped an AI story creator with the iOS 26 update, which lets users select a character and a theme and have the app generate a story. The developers said the story's text generation is powered by the local model.

The developer of the daily planner app Daylish is working on a prototype that automatically suggests emojis for timeline events based on their titles.

Finance tracking app MoneyCoach has two neat features powered by local models. First, the app shows insights about your spending, such as whether you spent more than average on groceries for a particular week. The other feature automatically suggests categories and subcategories for a spending item for quick entries.

The word-learning app LookUp has added two new modes using Apple's AI models. A new learning mode leverages a local model to create example sentences for a word and asks users to explain the word's usage in a sentence. The developer is also using on-device models to generate a map view of a word's origin.

Like a few other apps, the Tasks app uses local models to automatically suggest tags for an entry. It also uses these models to detect recurring tasks and schedule them accordingly, and it lets users dictate a few items and have the local model break them down into separate tasks without using the internet.

Automattic-owned journaling app Day One uses Apple's models to surface highlights and suggest titles for your entries. The team has also implemented a feature that generates prompts nudging you to dive deeper and write more based on what you have already written.

Recipe app Crouton uses Apple Intelligence to suggest tags for a recipe and assign names to timers. It also uses AI to break a block of text into easy-to-follow cooking steps.

Digital signing app Signeasy uses Apple's local models to extract key insights from a contract and give users a summary of the document they are signing.

Background sound app Dark Noise uses Apple's local models to let users describe a soundscape in a few words and generate one based on that description. You can adjust the levels of the soundscape's different elements once it is created.

Lights Out is a new app for tracking the F1 season and grands prix from Shihab Mehboob, developer of Twitter client Avery and Mastodon client Mammoth. The app uses on-device AI models to summarize commentary during a race.
Note-taking app Capture uses local AI to show category suggestions as users type notes or tasks. Sun- and weather-tracking app Lumy now shows neat weather-related suggestions using AI.

CardPointers helps you track credit card expenses and suggests the best ways to earn points from the cards you have. The app's new version uses AI to let users ask questions about their cards and offers.

Guitar Wiz is a guitar-learning app using the Foundation Models framework in a few ways. Users see an explanation of a chord while learning it, and advanced players get insights based on time intervals. Plus, the AI model is helping the developer support more than 15 languages.

The SmartGym app uses local AI to convert a workout description into a step-by-step set with rep counts, intervals, and equipment. It also gives users workout summaries with monthly progress, routine breakdowns, and individual exercise performance.

Journaling app Stoic uses Apple's models to give users personalized prompts based on their mood logging. The models also help users summarize posts, search past entries, and organize them.

Another app helps players of racquet sports such as tennis and pickleball improve their form based on video recordings; its makers are now using the Foundation Models framework to give actionable, specific feedback.

India-based productivity suite company Zoho is using local models to power summarization, translation, and transcription across its apps like Notebook and Tables.
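Many of these features (the tag suggestions in Tasks and Capture, the structured stories in Lil Artist) lean on the framework's guided generation, where the model fills in a Swift type instead of returning free-form text. Below is a rough sketch, assuming the FoundationModels APIs Apple showed at WWDC 2025 (the @Generable and @Guide macros, LanguageModelSession, and respond(to:generating:)); the TagSuggestions type and suggestTags helper are illustrative, not taken from any of the apps above.

```swift
import FoundationModels

// Hypothetical output type for illustration: guided generation decodes
// the model's response into this @Generable struct instead of raw text.
@Generable
struct TagSuggestions {
    @Guide(description: "Three short, lowercase tags for the note")
    var tags: [String]
}

func suggestTags(for note: String) async throws -> [String] {
    // Sessions run entirely on device; no API key or network call is needed.
    let session = LanguageModelSession(
        instructions: "You suggest concise organizational tags for notes."
    )
    let response = try await session.respond(
        to: "Suggest tags for this note: \(note)",
        generating: TagSuggestions.self
    )
    return response.content.tags
}
```

Because the response arrives as a concrete type, the app never has to parse or validate free-form model output before showing the tags.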
[2]
Apple highlights third-party apps using the new Foundation Models framework in iOS 26 - 9to5Mac
iOS 26 includes multiple new Apple Intelligence features, but one of the biggest changes is that Apple has opened up its AI models to third-party developers, allowing third-party apps to plug directly into Apple's on-device Foundation Models. In a new press release today, Apple highlights several popular third-party developers who have leveraged the new Foundation Models framework to power new features in their apps.

Here's Susan Prescott, Apple's vice president of Worldwide Developer Relations: "We're excited to see developers around the world already bringing privacy-protected intelligence features into their apps. The in-app experiences they're creating are expansive and creative, showing just how much opportunity the Foundation Models framework opens up. From generating journaling prompts that will spark creativity in Stoic, to conversational explanations of scientific terms in CellWalk, it's incredible to see the powerful new capabilities that are already enhancing the apps people use every day."

Here's the developer of the task manager Stuff: "The Foundation Models framework in iOS 26 has been a game changer for new workflows in Stuff. Running entirely on device, it's powerful, predictable, and remarkably performant. Its simplicity made it possible for me to launch both Listen Mode and Scan Mode together in a single release -- something that would've taken much longer otherwise."

Here's Tim Davison, the developer behind CellWalk: "The on-device model has great performance. Our visuals have always been interactive, but with the Foundation Models framework, the text itself comes alive. Scientific data hidden in our app becomes a dynamic system that adapts to each learner, and the reliable structured data produced by the model made integration with our app seamless."

Matt Abras, the developer of SmartGym, said the framework has allowed developers to "deliver on-device features that were once impossible."
[3]
AppleInsider.com
Proving Apple Intelligence's worth, third-party developers are now using it to make apps more personal to users, and users more productive. Apple opened up Apple Intelligence to third-party developers as part of WWDC 2025, giving them access to what it calls the Foundation Models framework. It means developers can address the same Apple Intelligence that runs on-device on iPhones, iPads, and Macs. Now that iOS 26, iPadOS 26, and macOS Tahoe have been officially released, developers are updating their apps to take advantage of the AI features provided via Apple Intelligence.

"We're excited to see developers around the world already bringing privacy-protected intelligence features into their apps," said Susan Prescott, Apple's vice president of Worldwide Developer Relations, in a statement. "From generating journaling prompts that will spark creativity in Stoic, to conversational explanations of scientific terms in CellWalk, it's incredible to see the powerful new capabilities that are already enhancing the apps people use every day."

What developers get

By using the Foundation Models framework, developers get an API through which they can pass prompts to Apple Intelligence. It's done privately, and with specific limitations -- but also specific freedoms:

* No limit on user requests
* No tokens or API keys for the user to install
* Access to the same Apple Intelligence on device

That last part is significant, because it is both a limitation and a guarantee of privacy. Developers can't use the full Apple Intelligence LLM in the cloud, nor can they use extensions such as directly making requests of ChatGPT.

Example features

Nonetheless, developers have been implementing Apple Intelligence across a wide range of apps. Apple has now picked out more than 20 to champion, ranging from to-do apps to mental health ones. Significant examples include SmartGym, which uses the framework to learn from a user's workouts and recommend changes.

"The Foundation Models framework enables us to deliver on-device features that were once impossible," said Matt Abras, SmartGym's CEO. "It's simple to implement, yet incredibly powerful in its capabilities."

Similarly, the Stoic mental health app can automatically respond with a compassionate, encouraging message if a user logs a low mood.

"What amazed me was how quickly we could build these ideas," said Maciej Lobodzinski, Stoic's founder. "[It] let our small team deliver huge value fast while keeping every user's data private, with all insights and prompts generated without anything ever leaving their device."

Education and productivity

CellWalk takes users on a 3D journey around molecules. Now it can automatically tailor its explanations to the user's level of knowledge.

"Our visuals have always been interactive, but with the Foundation Models framework, the text itself comes alive," said CellWalk's Tim Davison. "Scientific data hidden in our app becomes a dynamic system that adapts to each learner, and the reliable structured data produced by the model made integration with our app seamless."

Then Apple says the task manager Stuff lets a user write "Call Sophia Friday," and it fills in the right details. That's the kind of natural language processing that has long been in calendar apps, but is now in a basic to-do app. More heavyweight to-do apps such as OmniFocus are adopting Apple Intelligence too; it can generate whole projects and suggested tasks to help a user get started.
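Because the models ship with the OS rather than sit behind an API key, the main thing an app has to handle is whether Apple Intelligence is actually enabled and supported on the current device. A minimal sketch, assuming the SystemLanguageModel availability API from the FoundationModels framework (the makeSessionIfAvailable helper is hypothetical):

```swift
import FoundationModels

// Check on-device model availability before exposing an AI feature.
// SystemLanguageModel.default represents the system's on-device model;
// its availability reflects device support and the user's settings.
func makeSessionIfAvailable() -> LanguageModelSession? {
    switch SystemLanguageModel.default.availability {
    case .available:
        // No tokens, API keys, or request limits: sessions are free to create.
        return LanguageModelSession()
    case .unavailable(let reason):
        // Fall back gracefully, e.g. hide the feature or show a hint.
        print("On-device model unavailable: \(reason)")
        return nil
    }
}
```

Handling the unavailable case up front is what lets these features degrade quietly on older hardware instead of failing at request time.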
[4]
Apple's Foundation Models explained: A new way to build with AI
Small teams can now build AI features quickly without heavy infrastructure. With iOS 26, iPadOS 26, and macOS 26, Apple has quietly pushed a new tool into the hands of developers: the Foundation Models framework. It sits at the center of Apple Intelligence, giving apps access to an on-device large language model that runs privately, offline, and without additional cost.

Instead of flashy AI hype, the framework is being tested in more grounded ways. Health and fitness apps are using it to make training feel less mechanical. SmartGym, for instance, turns a user's rough description of a workout into a structured routine, then explains why it suggests adjusting reps or weights. It also generates monthly summaries, personal notes, and even dynamic greetings tied to fitness progress.

In journaling, Stoic uses the model to suggest prompts that respond to a user's state of mind. Log poor sleep, and the app responds with a compassionate nudge rather than a generic notification. The prompts are generated entirely on-device, keeping personal entries private. Similarly, Gratitude and Motivation are finding ways to transform journal entries into affirmations or mood-based reminders.

Education apps have also jumped in. CellWalk, which visualises cells in 3D, now layers conversational explanations of scientific terms on top of its graphics, tailoring answers to a learner's knowledge level. Grammo builds on-the-fly grammar exercises, while Vocabulary automatically sorts saved words into themes like "verbs" or "anatomy."

Creativity and productivity apps are experimenting in different directions. Stuff, a lightweight task manager, can parse natural language like "Do laundry tonight" or "Call Sophia Friday," instantly filing them into the right lists. VLLO, a video editor, analyses scenes to recommend background music and stickers, reducing the small frictions that slow down new creators. Apps like Signeasy, Agenda, and OmniFocus are using the framework to generate summaries, surface relevant notes, or propose next steps.

What makes this noteworthy isn't just the features, but how quickly smaller teams are building them. The framework is tightly integrated with Swift, designed to slot into existing code, and supports guided generation so responses arrive in predictable formats. That lowers the barrier to experimenting with AI features without requiring heavy infrastructure or sending sensitive data to the cloud.

The Foundation Models framework won't grab headlines the way new hardware does, but it may prove more significant in the long run. By making AI feel like a natural extension of apps rather than a separate destination, Apple is nudging the ecosystem toward intelligence that's subtle, personal, and private, the kind that works best when you barely notice it's there.
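The "Call Sophia Friday" style of parsing in Stuff maps naturally onto guided generation: the model is asked to populate a typed task record rather than produce prose. As an illustrative sketch only, assuming the same @Generable and respond(to:generating:) APIs as above (the TaskItem type and its field names are hypothetical, not Stuff's actual model):

```swift
import FoundationModels

// Hypothetical task record for illustration; guided generation
// guarantees the response decodes into exactly this shape.
@Generable
struct TaskItem {
    @Guide(description: "A short task title, e.g. 'Call Sophia'")
    var title: String
    @Guide(description: "The due day if one is mentioned, e.g. 'Friday', else empty")
    var dueDay: String
    @Guide(description: "A fitting list name, such as 'Errands' or 'Calls'")
    var list: String
}

func parseTask(from input: String) async throws -> TaskItem {
    let session = LanguageModelSession(
        instructions: "Extract a structured task from the user's text."
    )
    // The model fills in TaskItem's fields rather than returning prose,
    // so "Call Sophia Friday" arrives pre-parsed and ready to file.
    let response = try await session.respond(
        to: input,
        generating: TaskItem.self
    )
    return response.content
}
```

Keeping the due day as a plain string sidesteps date handling in the sketch; a real app would resolve it against the user's calendar.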
Apple's new Foundation Models framework in iOS 26 enables developers to integrate on-device AI capabilities, enhancing app functionality across various categories while maintaining user privacy.
Apple has unveiled its Foundation Models framework as part of iOS 26, iPadOS 26, and macOS Tahoe, marking a significant advancement in on-device AI capabilities for third-party developers. This new framework, introduced during WWDC 2025, allows developers to leverage Apple's local AI models to enhance their applications without incurring inference costs or compromising user privacy.

The Foundation Models framework offers several advantages for developers:

* No limit on user requests, and no inference costs
* No tokens or API keys to manage
* Access to the same on-device Apple Intelligence models, with guided generation and tool calling built in
* Privacy by default, since prompts and responses never leave the device
Susan Prescott, Apple's vice president of Worldwide Developer Relations, emphasized the framework's potential: "We're excited to see developers around the world already bringing privacy-protected intelligence features into their apps."
Developers are leveraging the Foundation Models framework to enhance various types of apps:

* Health and fitness: SmartGym converts workout descriptions into structured routines and generates monthly progress summaries
* Journaling and mental health: Stoic and Day One generate personalized prompts, highlights, and entry titles
* Education: CellWalk tailors scientific explanations to each learner, while LookUp and Vocabulary build word-learning exercises
* Productivity: Stuff, Tasks, and OmniFocus parse natural language into structured to-dos, tags, and projects
* Finance and documents: MoneyCoach surfaces spending insights, and Signeasy summarizes contracts before signing

Developers have responded positively to the new framework:

* "The Foundation Models framework enables us to deliver on-device features that were once impossible," said Matt Abras, SmartGym's CEO. "It's simple to implement, yet incredibly powerful in its capabilities."
* "What amazed me was how quickly we could build these ideas," said Maciej Lobodzinski, Stoic's founder, noting that his small team could deliver value fast "while keeping every user's data private."
* "The reliable structured data produced by the model made integration with our app seamless," said Tim Davison, the developer behind CellWalk.
The Foundation Models framework is poised to significantly impact the iOS app ecosystem. As developers continue to explore its possibilities, users can expect increasingly intelligent and responsive apps across various categories, all while maintaining the privacy and security that Apple prioritizes.