3 Sources
[1]
Apple highlights third-party apps using the new Foundation Models framework in iOS 26 - 9to5Mac
iOS 26 includes multiple new Apple Intelligence features, but one of the biggest changes is that Apple has opened up its AI models to third-party developers. This allows third-party apps to plug directly into Apple's on-device Foundation Models. In a new press release today, Apple highlights several popular third-party developers who have leveraged the new Foundation Models framework to power new features in their apps.

Here's Susan Prescott, Apple's vice president of Worldwide Developer Relations: "We're excited to see developers around the world already bringing privacy-protected intelligence features into their apps. The in-app experiences they're creating are expansive and creative, showing just how much opportunity the Foundation Models framework opens up. From generating journaling prompts that will spark creativity in Stoic, to conversational explanations of scientific terms in CellWalk, it's incredible to see the powerful new capabilities that are already enhancing the apps people use every day."

Here's the developer behind the task manager Stuff: "The Foundation Models framework in iOS 26 has been a game changer for new workflows in Stuff. Running entirely on device, it's powerful, predictable, and remarkably performant. Its simplicity made it possible for me to launch both Listen Mode and Scan Mode together in a single release -- something that would've taken much longer otherwise."

Here's Tim Davison, the developer behind CellWalk: "The on-device model has great performance. Our visuals have always been interactive, but with the Foundation Models framework, the text itself comes alive. Scientific data hidden in our app becomes a dynamic system that adapts to each learner, and the reliable structured data produced by the model made integration with our app seamless."

Matt Abras, the developer of SmartGym, said the framework has allowed developers to "deliver on-device features that were once impossible."
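For context on what "plugging into" the on-device model involves: the basic request flow in the Swift API Apple documents for the framework is only a few lines. A minimal sketch (the prompt text is illustrative, and the call must run in an async context):

    import FoundationModels

    // Create a session backed by Apple's on-device language model.
    let session = LanguageModelSession()

    // Send a prompt and await the generated text. This runs entirely
    // on device: no API key, token, or network round trip is needed.
    let response = try await session.respond(
        to: "Suggest a short journaling prompt about gratitude."
    )
    print(response.content)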
[2]
AppleInsider.com
Proving Apple Intelligence's worth, third-party developers are now using it to make apps more personal to users, and users more productive.

Apple opened up Apple Intelligence to third-party developers as part of WWDC 2025, giving them access to what it calls the Foundation Models framework. It means developers can address the same Apple Intelligence that runs on-device on iPhones, iPads, and Macs. Now that iOS 26, iPadOS 26, and macOS Tahoe have been officially released, developers are updating their apps to take advantage of the AI features provided via Apple Intelligence.

"We're excited to see developers around the world already bringing privacy-protected intelligence features into their apps," said Susan Prescott, Apple's vice president of Worldwide Developer Relations, in a statement. "From generating journaling prompts that will spark creativity in Stoic, to conversational explanations of scientific terms in CellWalk, it's incredible to see the powerful new capabilities that are already enhancing the apps people use every day."

What developers get

By using the Foundation Models framework, developers get an API through which they can pass prompts to Apple Intelligence. It's done privately, and with specific limitations -- but also specific freedoms:

* No limit on user requests
* No tokens or API keys for the user to install
* Access to the same Apple Intelligence on device

That last part is significant, because it is both a limitation and a guarantee of privacy. Developers can't use the full Apple Intelligence LLM in the cloud, nor can they use extensions such as directly making requests of ChatGPT.

Example features

Nonetheless, developers have been implementing Apple Intelligence across a wide range of apps. Apple has now picked out more than 20 to champion, ranging from to-do apps to mental health ones.

Significant examples include SmartGym, which uses the feature to learn from a user's workouts and recommend changes. "The Foundation Models framework enables us to deliver on-device features that were once impossible," said Matt Abras, SmartGym's CEO. "It's simple to implement, yet incredibly powerful in its capabilities."

Similarly, the Stoic mental health app can automatically respond with a compassionate, encouraging message if a user logs a low mood. "What amazed me was how quickly we could build these ideas," said Maciej Lobodzinski, Stoic's founder. "[It] let our small team deliver huge value fast while keeping every user's data private, with all insights and prompts generated without anything ever leaving their device."

Education and productivity

CellWalk takes users through a 3D journey around molecules. Now it can automatically tailor its explanations to the user's level of knowledge. "Our visuals have always been interactive, but with the Foundation Models framework, the text itself comes alive," said CellWalk's Tim Davison. "Scientific data hidden in our app becomes a dynamic system that adapts to each learner, and the reliable structured data produced by the model made integration with our app seamless."

Then Apple says the task manager Stuff lets a user write "Call Sophia Friday," and it fills in the right details. That's the kind of natural language processing that has long been in calendar apps, but is now in a basic to-do app. More heavyweight to-do apps such as OmniFocus are adopting Apple Intelligence too; OmniFocus can generate whole projects and suggested tasks to help a user get started.
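Because the model ships with the operating system rather than with the app, Apple's API also exposes an availability check, so an app can fall back gracefully on devices where Apple Intelligence is disabled or unsupported. A minimal sketch using the documented SystemLanguageModel type:

    import FoundationModels

    // Check whether the on-device model can be used right now.
    switch SystemLanguageModel.default.availability {
    case .available:
        // Safe to create a LanguageModelSession and send prompts.
        print("On-device model is ready")
    case .unavailable(let reason):
        // For example: the device isn't eligible, Apple Intelligence
        // is turned off, or the model assets are still downloading.
        print("Model unavailable: \(reason)")
    }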
[3]
Apple's Foundation Models explained: A new way to build with AI
Small teams can now build AI features quickly without heavy infrastructure.

With iOS 26, iPadOS 26, and macOS 26, Apple has quietly pushed a new tool into the hands of developers: the Foundation Models framework. It sits at the center of Apple Intelligence, giving apps access to an on-device large language model that runs privately, offline, and without additional cost.

Instead of flashy AI hype, the framework is being tested in more grounded ways. Health and fitness apps are using it to make training feel less mechanical. SmartGym, for instance, turns a user's rough description of a workout into a structured routine, then explains why it suggests adjusting reps or weights. It also generates monthly summaries, personal notes, and even dynamic greetings tied to fitness progress.

In journaling, Stoic uses the model to suggest prompts that respond to a user's state of mind. Log poor sleep, and the app responds with a compassionate nudge rather than a generic notification. The prompts are generated entirely on-device, keeping personal entries private. Similarly, Gratitude and Motivation are finding ways to transform journal entries into affirmations or mood-based reminders.

Education apps have also jumped in. CellWalk, which visualises cells in 3D, now layers conversational explanations of scientific terms on top of its graphics, tailoring answers to a learner's knowledge level. Grammo builds on-the-fly grammar exercises, while Vocabulary automatically sorts saved words into themes like "verbs" or "anatomy."

Creativity and productivity apps are experimenting in different directions. Stuff, a lightweight task manager, can parse natural language like "Do laundry tonight" or "Call Sophia Friday," instantly filing them into the right lists. VLLO, a video editor, analyses scenes to recommend background music and stickers, reducing the small frictions that slow down new creators. Apps like Signeasy, Agenda, and OmniFocus are using the framework to generate summaries, surface relevant notes, or propose next steps.

What makes this noteworthy isn't just the features, but how quickly smaller teams are building them. The framework is tightly integrated with Swift, designed to slot into existing code, and supports guided generation so responses arrive in predictable formats. That lowers the barrier to experimenting with AI features, without requiring heavy infrastructure or sending sensitive data to the cloud.

The Foundation Models framework won't grab headlines the way new hardware does, but it may prove more significant in the long run. By making AI feel like a natural extension of apps rather than a separate destination, Apple is nudging the ecosystem toward intelligence that's subtle, personal, and private: the kind that works best when you barely notice it's there.
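The guided generation mentioned above is worth unpacking, since it is the part most likely to matter to small teams: instead of parsing free-form text, a developer declares a Swift type with the framework's @Generable macro, and the model's output is constrained to decode directly into that type. A sketch loosely modeled on the vocabulary-sorting use case; the type and its fields are illustrative, not taken from any shipping app:

    import FoundationModels

    // @Generable asks the framework to produce output that decodes
    // directly into this type; @Guide hints describe each field.
    @Generable
    struct WordClassification {
        @Guide(description: "The word being classified")
        var word: String

        @Guide(description: "A single theme, such as 'verbs' or 'anatomy'")
        var theme: String
    }

    let session = LanguageModelSession()
    let result = try await session.respond(
        to: "Classify the word 'femur' into a theme.",
        generating: WordClassification.self
    )
    print(result.content.theme) // e.g. "anatomy"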
Apple introduces the Foundation Models framework in iOS 26, enabling third-party developers to leverage on-device AI for enhanced app features. This new tool is transforming various app categories, from fitness and mental health to education and productivity.
Apple has unveiled a groundbreaking development in AI-powered app creation with the introduction of the Foundation Models framework in iOS 26, iPadOS 26, and macOS Tahoe. This new tool, part of Apple Intelligence, allows third-party developers to leverage on-device AI models, opening up a world of possibilities for enhanced app features while maintaining user privacy [1].
Susan Prescott, Apple's vice president of Worldwide Developer Relations, expressed excitement about the new framework: "We're excited to see developers around the world already bringing privacy-protected intelligence features into their apps. The in-app experiences they're creating are expansive and creative, showing just how much opportunity the Foundation Models framework opens up" [2].

The Foundation Models framework offers several advantages to developers [3]:

* No limit on user requests
* No tokens or API keys for the user to install
* Access to the same Apple Intelligence model that runs on device
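In code, those advantages translate to a session that needs no credentials at all: a feature typically just creates one, optionally with instructions that set the model's role for every prompt it receives. A minimal sketch along the lines of Stoic's compassionate-response feature (the instruction and prompt text are illustrative):

    import FoundationModels

    // Instructions steer every prompt sent through this session.
    // No token or API key is configured anywhere.
    let session = LanguageModelSession(
        instructions: "You are a supportive journaling coach. Reply with one or two encouraging sentences."
    )

    let reply = try await session.respond(
        to: "I logged a low mood today after sleeping badly."
    )
    print(reply.content)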
The introduction of the Foundation Models framework has already begun to revolutionize various app categories:
SmartGym, a fitness app, now uses AI to analyze user workouts and suggest personalized improvements. Matt Abras, SmartGym's CEO, stated, "The Foundation Models framework enables us to deliver on-device features that were once impossible" [2].

Stoic, a mental health app, leverages the framework to generate compassionate responses to users' mood logs. Maciej Lobodzinski, Stoic's founder, noted, "What amazed me was how quickly we could build these ideas... [It] let our small team deliver huge value fast while keeping every user's data private" [2].
CellWalk, an educational app that provides 3D molecular visualizations, now offers tailored explanations based on users' knowledge levels. Tim Davison, CellWalk's developer, explained, "Our visuals have always been interactive, but with the Foundation Models framework, the text itself comes alive" [1].

Task management apps like Stuff and OmniFocus are implementing natural language processing to interpret user inputs and generate structured tasks. For instance, Stuff can now understand a command like "Call Sophia Friday" and automatically fill in the appropriate details [2].
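Apple hasn't published how Stuff implements this, but the framework's guided generation makes the pattern straightforward to sketch: constrain the model's output to a task-shaped Swift type and let it extract the fields. The ParsedTask type and its fields below are hypothetical, purely for illustration:

    import FoundationModels

    // Hypothetical shape for a parsed to-do item -- not Stuff's actual model.
    @Generable
    struct ParsedTask {
        @Guide(description: "A short action title, such as 'Call Sophia'")
        var title: String

        @Guide(description: "The due day exactly as the user wrote it, or empty if none")
        var dueDay: String
    }

    let session = LanguageModelSession()
    let task = try await session.respond(
        to: "Extract a task from: 'Call Sophia Friday'",
        generating: ParsedTask.self
    )
    print(task.content.title, task.content.dueDay) // "Call Sophia" "Friday"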
One of the most significant impacts of the Foundation Models framework is its accessibility to smaller development teams. The framework's design allows for quick implementation of AI features without extensive infrastructure or cloud-based solutions. This democratization of AI capabilities enables even small teams to deliver sophisticated, privacy-focused AI features in their apps [3].

As Apple continues to refine and expand its AI offerings, the Foundation Models framework represents a significant step toward more intelligent, personalized, and privacy-conscious apps in the Apple ecosystem.
Summarized by Navi