5 Sources
[1] Apple lets developers tap into its offline AI models | TechCrunch
Apple is launching what it calls the Foundation Models framework, which the company says will let developers tap into its AI models in an offline, on-device fashion. Onstage at WWDC 2025 on Monday, Apple VP of software engineering Craig Federighi said that the Foundation Models framework will let apps use on-device AI models created by Apple to drive experiences. These models ship as a part of Apple Intelligence, Apple's family of models that power a number of iOS features and capabilities. "For example, if you're getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging," Federighi said. "And because it happens using on-device models, this happens without cloud API costs [...] We couldn't be more excited about how developers can build on Apple intelligence to bring you new experiences that are smart, available when you're offline, and that protect your privacy." In a blog post, Apple says that the Foundation Models framework has native support for Swift, Apple's programming language for building apps for its various platforms. The company claims developers can access Apple Intelligence models with as few as three lines of code. Guided generation, tool calling, and more are all built into the Foundation Models framework, according to Apple.
[2] Apple opens its foundational AI models to developers
It's safe to say Apple Intelligence hasn't landed in the way Apple likely hoped it would. However, that's not stopping the company from continuing to iterate on its suite of AI features. During its WWDC 2025 conference on Monday, Apple announced a collection of new features for Apple Intelligence, starting with the decision to bring its foundational models to developers. According to Craig Federighi, the company's senior vice president of software engineering, Apple's new Foundation Models framework will allow third-party developers to tap into the large language models that power Apple Intelligence.
[3] Apple Intelligence opened up to all developers with Foundation Models Framework
As rumored, Apple has announced that developers will soon be able to access the on-device large language models that power Apple Intelligence in their own apps through the Foundation Models framework. Through the newly unveiled Foundation Models framework, Apple is giving developers the chance to use native AI capabilities in their own apps. Third-party apps will be able to use the features for image creation, text generation, and more. Like Apple Intelligence itself, the on-device processing will allow for AI features that are fast, powerful, focused on privacy, and available without an internet connection. Rumors that Apple would be opening up its Apple Intelligence platform first circulated earlier this year. In May, Bloomberg reported that Apple would take the first steps toward making its intelligence systems accessible to third-party apps, though it noted that apps wouldn't be able to access the models themselves -- just AI-powered features. Along with opening up Apple Intelligence to other apps, the company also announced that it is expanding the number of languages that its AI platform supports, and making the generative models that power it "more capable and more efficient."
[4] Apple Opens Its On-Device AI Model to Developers
It's unclear whether Apple is using its older 3B model or an improved AI model for on-device inference. Today, at WWDC 2025, Apple announced the Foundation Models framework to allow developers to leverage the power of Apple's on-device AI models. Developers can use the new API to integrate AI-powered features into their apps. The new framework utilizes Apple's in-house AI models locally while preserving data privacy. During the announcement, Apple's senior VP of software engineering, Craig Federighi, said: "We're also taking the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence, allowing them to tap into intelligence that is powerful, fast, built with privacy, and available even when users are offline." With the new Foundation Models API, developers don't have to rely on third-party vendors like OpenAI and Google to power AI features in their apps. The best part is that AI features will work even offline, since Apple's AI models run locally on the device, and there is no AI inference cost for developers. That said, Apple has not demonstrated the capabilities of its in-house models. Last year, Apple showcased an on-device AI model with 3 billion parameters, which was close to Google's Gemma-1.1-2B and Microsoft's Phi-3-mini in terms of performance. It's unclear whether Apple is using the same AI model or whether the company has trained an improved model for the local AI stack. Apple also addressed Siri, saying, "We're continuing our work to deliver the features that make Siri even more personal. This work needed more time to reach our high-quality bar, and we look forward to sharing more about it in the coming year." So the upgraded, AI-powered Siri appears to be coming next year. Apple Intelligence features, however, are coming to more languages, including French, German, Italian, Spanish, and more.
[5] Apple's Foundation Models Framework Empowers Third-Party Developers With Direct Access To On-Device Apple Intelligence, Enabling Seamless Integration Of Fast, Private, And Powerful AI Features In Their Apps
WWDC 2025 revolved around the new design language Apple adopted across its systems for a more unified experience, along with the Apple Intelligence features woven in to make that experience more personal. While the keynote kicked off with AI-powered capabilities and how they remain at the core, Apple also introduced a new framework, Foundation Models, meant to help developers access AI features more seamlessly without compromising performance or privacy. The announcement amplifies the company's broader AI strategy: it opens access to Apple Intelligence to third-party developers without the need to develop or host their own models. It also signals a greater emphasis on local execution, since apps would run on-device AI models powered by Apple Intelligence. This would allow third-party apps to integrate AI-driven features without relying on cloud infrastructure, giving developers access to capabilities such as image generation, summarization, and other key features without internet connectivity. Because the framework follows an on-device architecture, Apple's core values around user privacy remain intact: user data stays private and does not leave the device. As the tech giant moves further into AI integration, the move is significant both for the developer tools it offers and for the groundwork it lays toward a secure ecosystem. It also marks a new direction for Apple on AI: this is the first time developers will access the on-device foundation models, extending Apple Intelligence capabilities to third-party applications and setting the direction of future app development with on-device AI as the foundation. The framework will debut with iOS 26 and other platform updates, highlighting Apple's commitment to bringing deeply integrated AI across its ecosystem.
Apple introduces the Foundation Models framework at WWDC 2025, allowing developers to access on-device AI models for enhanced app functionality while prioritizing privacy and offline capabilities.
At the Worldwide Developers Conference (WWDC) 2025, Apple made a significant announcement that could reshape the landscape of AI integration in mobile applications. The tech giant unveiled its new Foundation Models framework, a groundbreaking initiative that allows third-party developers to access Apple's on-device AI models [1].
The Foundation Models framework is designed to let developers tap into Apple's AI models in an offline, on-device fashion. This approach aligns with Apple's longstanding commitment to user privacy while still offering powerful AI capabilities. Craig Federighi, Apple's senior vice president of software engineering, emphasized that this framework will enable apps to use on-device AI models created by Apple to drive experiences without compromising user data [2].
Apple has made the integration process remarkably straightforward for developers. The Foundation Models framework offers native support for Swift, Apple's programming language, and the company claims that developers can access Apple Intelligence models with as few as three lines of code [1].
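To make that concrete, here is a minimal Swift sketch of what such an integration could look like. The sources above do not show the API itself, so the session type and method names below are assumptions based on Apple's description of the framework and may differ in the shipping SDK.

```swift
import FoundationModels

// A minimal sketch of the "few lines of code" integration described above.
// LanguageModelSession and respond(to:) are assumed names; the shipping
// FoundationModels SDK may use slightly different types or labels.
func summarize(_ notes: String) async throws -> String {
    // Create a session backed by Apple's on-device model; no cloud call is made.
    let session = LanguageModelSession()
    // Ask the local model for a response. This works offline and carries no
    // per-request API cost for the developer.
    let response = try await session.respond(
        to: "Summarize these notes in three bullet points:\n\(notes)"
    )
    return response.content
}
```

Guided generation and tool calling, which source [1] says are built into the framework, would presumably build on the same kind of session object.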
The framework opens up a range of possibilities for third-party apps, including image creation, text generation, and more [3]. Importantly, these features will be available even when users are offline, as the AI models run locally on the device. This not only ensures privacy but also eliminates cloud API costs for developers [4].
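As a sketch of how an app might handle that on-device, offline behavior, the snippet below gates an AI feature on local model availability. The SystemLanguageModel name and its availability cases are assumptions drawn from Apple's framing of the framework rather than from the sources above.

```swift
import FoundationModels

// Sketch: only surface AI features when the on-device model can actually run.
// SystemLanguageModel.default and its availability cases are assumed names;
// the real SDK may expose this differently.
func canOfferAIFeatures() -> Bool {
    switch SystemLanguageModel.default.availability {
    case .available:
        return true   // model is ready; generation happens locally, even offline
    default:
        return false  // e.g. Apple Intelligence disabled or device unsupported
    }
}
```

Checking availability up front lets an app fall back to a non-AI code path instead of failing when Apple Intelligence is turned off or unsupported on a given device.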
While the specifics of Apple's AI model remain unclear, the company has been steadily expanding its AI capabilities. Apple Intelligence, the family of models powering various iOS features, is set to support more languages, including French, German, Italian, and Spanish [4].
This move by Apple represents a significant shift in its AI strategy. By providing developers with direct access to on-device foundational models, Apple is laying the groundwork for a new era of app development centered around on-device AI [5]. The Foundation Models framework is expected to debut with iOS 26 and other platform updates, signaling Apple's commitment to deeply integrating AI across its ecosystem.
As the tech industry continues to grapple with the balance between AI advancement and user privacy, Apple's approach with the Foundation Models framework could set a new standard for responsible AI integration in consumer technology.
Summarized by Navi