8 Sources
[1]
Apple to Reportedly Pay Google $1 Billion a Year for Siri's Custom Gemini AI Model
Imad is a senior reporter covering Google and internet culture. Hailing from Texas, Imad started his journalism career in 2013 and has amassed bylines with The New York Times, The Washington Post, ESPN, Tom's Guide and Wired, among others.

Apple is inking a deal with Google to make a custom Gemini AI model to power the next version of its virtual assistant Siri for spring 2026, according to a Nov. 2 report by Bloomberg's Mark Gurman. A subsequent report on Wednesday states Apple will pay $1 billion a year for this new 1.2 trillion parameter model. Apple had reportedly been weighing whether to use Google or AI competitor Anthropic for the next version of Siri. While Apple will pay Google $1 billion a year, it would have cost $1.5 billion per year with Anthropic, according to another Bloomberg report.

This custom Gemini model will run on Apple's Private Cloud Compute servers. Apple's own models will continue to run on devices for personal data, while Gemini will operate on servers for more complex tasks. Apple doesn't plan to highlight Google's involvement in the company's marketing, Bloomberg reports. Google declined to comment. Apple and Anthropic didn't respond to requests for comment.

Why Apple doesn't build its own AI

With major tech companies pivoting toward AI, Apple has largely been left behind. The company was slow to adopt the technology and hasn't been able to develop competitive AI models. It instead turned to companies like OpenAI, the creator of ChatGPT, to help add generative capabilities on top of existing Apple systems. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.) Even so, the promise of a truly agentic Apple Intelligence has failed to materialize, although it has improved.
Apple CEO Tim Cook also hasn't ruled out the possibility of acquiring an AI company. Why Apple hasn't made significant strides in AI, despite being one of the richest companies in the world, likely comes down to prioritization. Apple, as a company, focuses on privacy for its customers. Because data-intensive AI systems rely on cloud servers, Apple has been keener on local AI models, ones that can run on a device. The problem is that AI is computationally intensive, and smaller local models simply can't compete with server stacks. Training foundational models also requires a lot of expertise and expenditure. While OpenAI hasn't given exact numbers, estimates put the training cost of GPT-5 at over $1 billion. AI researchers also command high salaries and like to openly publish their research. For Apple, partnering with existing AI companies to integrate their technology better aligns with its culture.

Apple also has a close relationship with Google. The search giant pays Apple $20 billion annually to remain the default search engine on Apple devices. It's a relationship that nets Apple money and prevents it from building a competitor to Google Search, a claim Apple denies. This arrangement was a key factor in the Department of Justice's case against Google, in which a judge ruled that the company was operating an illegal monopoly. Despite falling behind on AI, Apple is doing well financially. Last month, it surpassed a $4 trillion market cap.
[2]
Apple reportedly doesn't want you to know Siri will be running partly on Google's Gemini
Last week's reports revealed that Apple will pay Google around $1 billion a year for a custom-built Gemini AI model to power the next version of Siri, expected to debut with iOS 26.4 next spring. It's a huge deal between two rivals, but according to a new report, Apple isn't planning to draw attention to who's behind the technology. Bloomberg's Mark Gurman says Apple has no plans to publicly acknowledge that Gemini will partly power Siri. Internally, the Gemini-based model is referred to as AFM v10, short for "Apple Foundation Model version 10," a name chosen to obscure the partnership and avoid confusing staff and customers. Gurman adds that Apple prefers to present it as part of its own system, even though much of the core model isn't. The partnership helps explain how Apple has managed to speed up a project that was reportedly falling behind schedule. Earlier this year, internal data suggested the company's AI tools weren't gaining traction and that its major Siri overhaul could be years away. Borrowing Google's model may be the quickest path to closing that gap, even if it means leaning on its biggest competitor in the process. Apple will run the Gemini-based system on its own Private Cloud Compute servers, keeping user data away from Google, and still hopes to replace it with a trillion-parameter in-house model at a later time. Until that happens, Apple's long-awaited AI comeback will partly rely on Gemini, but the company won't be shouting it from the rooftops.
[3]
New Siri rumor could give Apple and users exactly what they need - 9to5Mac
Siri's delayed AI upgrades are still expected to ship next spring in iOS 26.4, and a new report this week about Google's tech powering the features might be exactly what Apple and its users need. Apple's inability to ship the Siri upgrades it promised in iOS 18 has been embarrassing for the company. But all signs point to those enhancements coming soon, supplemented by even better improvements than what was promised.

As Bloomberg reported: "Apple Inc. is planning to use a 1.2 trillion parameter artificial intelligence model developed by Alphabet Inc.'s Google to help power its long-promised overhaul of the Siri voice assistant, according to people with knowledge of the matter. The iPhone maker is banking on Google's help to rebuild Siri's underlying technology, setting the stage for a new slate of features next year. The Google model's 1.2 trillion parameters -- a measure of the AI software's complexity -- would dwarf the level of Apple's current models."

Reactions to this news have been all over the place. Some are celebrating, others are concerned about privacy implications, and still others see this deal as yet another sign of Apple's AI weakness. Personally, I lean toward optimism. While it would be great if Apple were able to deliver the best possible AI-infused Siri without outside help, that's clearly not the case right now. But by striking this deal, Apple could give itself and its users exactly what's needed.

Whether you agree with the sentiment or not, there's a common belief that Apple is behind in AI. Tim Cook defended his company's AI position last year with a common Apple refrain: "Not first, but best." It's not unusual for Apple to take extra time refining and perfecting its work while competitors ship subpar offerings. Clearly though, Apple Intelligence hasn't gotten anywhere close to that "best" standard yet. But by striking this deal with Google, Apple has bought itself more time. The company has been under immense pressure to deliver its long-promised AI version of Siri.
With Google's help, it will do so in just a few months. Shipping the new Siri will benefit Apple, since it can continue working away on its own in-house models with lower stakes. And it will also benefit users, who just want the best Siri product. Most users likely won't have any idea that Google's tech is powering the new Siri features -- and they wouldn't care if they did know. Apple's deal with Google reportedly ensures the company holds to its high privacy standards. So from a user standpoint, all that will matter is Siri becoming more powerful and reliable than ever. Apple can ship its major AI overhaul for Siri, delivering what users have long wanted, while continuing to work behind the scenes to improve its in-house LLMs. Everybody wins. What's your take on the Apple-Google deal to power the new Siri? Let us know in the comments.
[4]
Apple's $1B Google AI deal will be great for iPhone users. Until it's not
Apple is making the right move to get Siri back on track, but this is a core technology Apple has to be able to make, not buy. The news of Apple's pending deal with Google says a lot. While neither Apple nor Google will likely ever publicly acknowledge it outside of necessary financial disclosures, Apple fans should take note. It's a very good thing for users. And also very concerning. We should be equally pleased and worried. The deal, as it has been reported (Apple has not officially acknowledged it), will see Apple paying Google about $1B a year to use a customized version of its Gemini AI model for the new Siri, which should be released to users in the spring. The model is big and advanced, with a reported 1.2 trillion parameters, and will run on Apple's own Private Cloud Compute servers so neither Google nor anyone else gets to scoop up your data. The partnership seemingly was struck after Apple evaluated Google's AI, along with others from Anthropic and OpenAI, against its own internally developed LLM technology. As a user, this is all pretty good news. The latest versions of Gemini are among the top LLMs in the industry -- benchmarks vary, and Apple's version might not be the same as Gemini 2.5 Pro, but it's clear that Apple isn't going with a second-rate model here. Of course, talking to the new Siri won't be just like talking to Google's Gemini. For one, the voices will sound different, but they'll also have different priorities and tuning, and Siri will have access to the private data stored on your phone. You could think of it as two completely different cars that have the same engine but different options and chassis. The fact that Apple was willing to break out the checkbook to use a core technology from another company for one of its most important (and oft-maligned) features speaks volumes about a change of mindset in Cupertino. When Apple needs new core technology, it usually builds it or buys a company that already has it (often both).
That Apple is willing to step away from its homegrown mentality to deliver a new Siri that doesn't disappoint is worthy of applause. But Apple fans should also be wary. I'm generally critical of Apple's "Not Invented Here" ethos, where it seemingly needs to own or build everything, whether it's good for its users or not. There are lots of examples of that working out well -- it took a long time, but Apple's cellular modems and N1 networking chips give an experience at least as good as the Qualcomm and Broadcom parts did -- but there are instances where Apple's stubborn reliance on in-house tech didn't make sense. For example, when OpenGL outlived its usefulness as a graphics API, Apple could have moved to the open Vulkan standard that replaced it, helping shape its future. Instead, it developed its own graphics API, Metal, and I'm not convinced that it was better for developers or users. I don't think Apple needed its own lossless audio format. Perhaps most notably, Google pays Apple some $20 billion a year to be the default search engine for Safari. And yes, there are other search engine options, but we all know almost nobody strays from the default, which is why it's worth so much to Google. That hasn't been good for users. Google has been steadily degrading its search results experience while using the data from all those searches to consolidate its control over search and web advertising. If there's anything Apple should have invested in years ago, it's building its own privacy-minded, ad-free web search. And we all know about the Apple Maps fiasco. The company's attempt to stop relying on a third-party mapping service resulted in a terrible product, ironically because that product was built with a mishmash of data that it didn't own or control. It took years for Apple to build a Maps experience using all its own data, and now that it has, the experience is top-tier.
So it's clear that some core technologies Apple needs to build for itself and have total control over, while others it can and probably should find outside solutions for. A foundation AI large language model is definitely in the former category. As the years roll on, AI models are going to be part of so much more than chatbots. AI models are all over Apple's products, from cameras recognizing your gestures to image editing to notification summaries and more. But the most important AI model in the stack is the big foundational LLM that interacts with users and does everything from controlling our devices to gathering information about the world. Apple having its own top-tier LLM is as important as Apple controlling any other major piece of its technology stack. It's arguably going to be more important than Apple having its own web browser. It's great news that Apple recognized that its own internally developed LLM isn't good enough right now, and is willing to go to Google to solve the problem. But in the end, Apple desperately needs to catch up to or surpass the technology it is buying. And it's not clear if it has the ability to do that, as competitors' LLMs continue to improve and Apple seemingly loses AI talent every week. As a user, you should be glad about the Apple-Google-Gemini-Siri AI deal. As long as it doesn't last.
[5]
Apple's AI play may hinge on Google
The big picture: Using Google's Gemini under the hood could create a stronger pocket assistant, but much will come down to how well Apple can stitch together its voice assistant with Google's technology.

Driving the news: Bloomberg reports Apple is nearing a $1 billion-a-year agreement to run a custom version of Gemini on Apple servers.

Between the lines: Apple laid out a compelling vision for Apple Intelligence and an improved Siri last year, but failed to deliver.
* Siri still relies on Apple's in-house tech, but can hand off some questions to ChatGPT with user permission.
* Apple has promised since 2024 that Siri would be able to take on a broader range of queries -- using users' private information, but only for the purpose of answering the question, and in ways that even Apple itself wouldn't be able to access.
* Running a custom version of Gemini on its own servers opens a big question of how much data, if any, Google will be able to access.

Yes, but: Just because Gemini gets built into Siri doesn't mean that's the assistant all iPhone users will choose, even if it's the default choice.
* If Gemini falls short in its capabilities, interface or privacy, people may choose -- as many do today -- to use ChatGPT or another assistant.
* Expect all the AI players to keep pushing their apps, including OpenAI, Meta and even Google, which would still prefer people to use the version of Gemini that runs on its own servers, giving it far more data.

Reality check: Apple and Google aren't commenting. Tim Cook suggested on a previous earnings call that the company needed its own frontier models, but said he considered that information proprietary.
* Apple says the new Siri will arrive next year, but it has already faced multiple delays.
* "We're making good progress on it, and as we've shared, we expect to release it next year," Cook said last week during a call to discuss the company's quarterly earnings.

Our thought bubble: Partnering with Google signals Apple may see less value in competing head-on in foundation model development.

The bottom line: Siri, pop us some popcorn, this could get interesting.
[6]
Apple Could Pay Google $1 Billion a Year to Use its AI Model for Siri
Apple eventually wants to develop a trillion-parameter model for Siri

Apple will reportedly pay Google about $1 billion a year to access a custom Gemini artificial intelligence (AI) model that will power the promised features in Siri. As per the report, the Cupertino-based tech giant does not want to go public with any announcement, likely to avoid any visible connection with its rival. Earlier, a report claimed that Apple was only planning to use Gemini to partially power Siri; however, the new report suggests that the AI model might play a larger role.

Apple to Reportedly Power Siri With Gemini

Much has been written about Apple's promised Siri upgrade, which was first unveiled at the Worldwide Developers Conference (WWDC) 2024. Nearly one and a half years later, it has still not been delivered. The tech giant has tried everything to reduce public scrutiny, from delaying the release date to going quiet about the feature entirely, but its CEO is still regularly questioned about it by investors. Reported reasons for the delay include engineering struggles, the departure of key employees working on the project, and the Apple Foundation Models simply not being capable enough. It appears the company has now acknowledged this and is looking outside to fix the Siri problem. Bloomberg's Mark Gurman claimed in the Power On newsletter that Apple has struck a deal worth about $1 billion a year with Google to use its Gemini model for Siri. It is said to be a custom AI model with 1.2 trillion parameters, a behemoth compared to Apple's 150 billion-parameter models. For context, parameters are the numerical values an AI model learns from its training data; they encode what the model has learned and govern how it interprets context and generates responses. Broadly, more parameters give a model greater capacity for nuance and accuracy, though at a higher computational cost.
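To make the reported scale gap concrete, here's a rough back-of-the-envelope sketch in Python. The 1.2 trillion and 150 billion parameter counts come from the reports; the storage precisions (16-bit weights, 4-bit quantization) are illustrative assumptions, not anything Apple or Google has stated, but they show why a model this size lives on servers rather than a phone.

```python
# Back-of-the-envelope look at the reported model sizes. Parameter counts
# are from the reports; precision/quantization figures are illustrative
# assumptions, not disclosed details.

GEMINI_PARAMS = 1.2e12  # reported custom Gemini model: 1.2 trillion parameters
APPLE_PARAMS = 150e9    # reported Apple Foundation Model: 150 billion parameters

def memory_gb(params: float, bytes_per_param: float) -> float:
    """Raw weight storage in GB, ignoring activations and serving overhead."""
    return params * bytes_per_param / 1e9

for name, params in [("Custom Gemini", GEMINI_PARAMS), ("Apple AFM", APPLE_PARAMS)]:
    fp16 = memory_gb(params, 2)    # 16-bit weights
    int4 = memory_gb(params, 0.5)  # aggressive 4-bit quantization
    print(f"{name}: {params / 1e9:,.0f}B params, "
          f"~{fp16:,.0f} GB at fp16, ~{int4:,.0f} GB at int4")

# The scale gap the reports describe as "dwarfing" Apple's current models:
print(f"Parameter ratio: {GEMINI_PARAMS / APPLE_PARAMS:.0f}x")
```

Even heavily quantized, weights alone for a 1.2 trillion-parameter model run to hundreds of gigabytes, which is consistent with the reports that the Gemini-based model will run on Private Cloud Compute servers while smaller on-device models handle personal data.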
Interestingly, the report claims that Apple is trying to prevent the development from becoming public knowledge. It reportedly does not want people to know that it is relying on a rival's technology, to the point that it is calling the Google-built model Apple Foundation Model v10 (AFM v10). This is said to be done to avoid confusion among employees and customers and to emphasize that the company's internal architecture is still the driving force behind the upgraded Siri. The report also adds that the company eventually wants to stop relying on Google and power Siri with its in-house models. The plan is reportedly to develop a one trillion-parameter AFM that will power Apple Intelligence features, including Siri. However, when that model might be ready remains unclear.
[7]
Apple and Google join forces: Siri to get a major AI upgrade with Gemini integration
Apple is set to revolutionise Siri with the integration of Google's Gemini AI model, marking one of the biggest collaborations in tech history. The move could transform the iPhone experience, making Siri smarter, more conversational, and context-aware. Here's how Apple plans to use Google's advanced AI to redefine digital assistance and why this partnership could reshape the AI race in 2025. In a surprising yet game-changing move, Apple is reportedly partnering with Google to power its next generation of Siri using Google's Gemini AI model. This collaboration, confirmed through various industry sources, signals a major strategic shift for Apple, which has historically relied on its own in-house technologies. The integration of Gemini into Siri aims to bring deeper intelligence, faster response times, and a far more natural user experience to iPhone users worldwide. According to the report, Apple will leverage Google's Gemini model to enhance Siri's capabilities across devices, making it more capable of understanding context, emotions, and complex user queries. This could mark the biggest leap for Siri since its launch in 2011. With Gemini's generative AI technology, Siri will not just respond, it will think, summarize, and assist more proactively. Imagine asking Siri to draft emails, summarize articles, or even help plan your day seamlessly across apps; that's the level of intelligence Apple is aiming for. This move follows Apple's earlier announcement of its "Apple Intelligence" system, which blends on-device processing with cloud-based AI features for privacy-focused performance. The decision to tap into Google's Gemini suggests Apple's willingness to combine its secure ecosystem with external AI muscle to stay competitive in the rapidly evolving landscape of intelligent assistants. Google's Gemini, known for its multimodal capabilities, can process text, images, voice, and even video simultaneously. 
By embedding this technology into Siri, Apple could turn its voice assistant into a powerful AI companion capable of creative reasoning and multitasking. For users, this means Siri could finally catch up to, and possibly outshine, rivals like ChatGPT and Alexa. For instance, Siri may soon be able to generate creative ideas, explain complex topics, or provide conversational summaries of ongoing discussions, all in real time. However, this partnership also raises questions about data privacy and Apple's long-standing stance on user protection. Apple has built its brand around safeguarding user information, while Google's AI models typically rely on large datasets for training. To balance this, reports suggest Apple will use a hybrid approach: running certain AI processes on-device for privacy while relying on Gemini's cloud features for more advanced computations. Industry experts view this collaboration as a turning point in the AI landscape. By integrating Gemini, Apple is acknowledging that generative AI innovation has moved faster than its in-house development could match. This could also signal a shift toward more open, cross-platform cooperation among tech giants, an approach that benefits consumers who are increasingly seeking smarter, more seamless digital experiences. Apple is expected to roll out the new Siri with Gemini integration in upcoming iOS updates, likely starting with select regions before expanding globally. The timing aligns with the company's broader vision to redefine personal AI on iPhones, iPads, and Macs, making Siri not just a digital assistant but a true AI partner that evolves with user behaviour and preferences. As Apple and Google join forces, the line between rivalry and collaboration blurs, hinting at a new era of AI-powered innovation. The next time you say, "Hey Siri," you might just be talking to the smartest version of her yet, powered by the very AI that defines Google's technological edge.
[8]
Apple Buys the Brain: Google Gemini to Power Siri in $1 Billion AI Deal
Apple is undertaking a significant initiative to redefine Siri, its voice assistant, by integrating advanced artificial intelligence (AI) capabilities, according to recent rumors. At the heart of this effort lies a new $1 billion annual partnership with Google, using its Gemini AI technology. This collaboration is designed to address Siri's long-standing limitations while staying true to Apple's core principles of privacy and user trust. Despite the ambitious nature of this project, delays have pushed the full rollout to 2026, reflecting Apple's cautious and meticulous approach to innovation. Apple's decision to partner with Google's Gemini AI underscores its commitment to delivering innovative functionality. Gemini's advanced capabilities are set to transform Siri into a more intuitive and capable assistant, allowing it to perform tasks that were previously out of reach. These enhancements aim to address criticisms of Siri's limited utility and make it a more indispensable tool for users. Importantly, Apple will process all data on its private cloud infrastructure, making sure that user privacy remains a top priority. While the partnership with Google is a cornerstone of Apple's strategy, the company is also exploring other avenues to enhance Siri's capabilities. Apple is actively testing large language models (LLMs) from OpenAI and Anthropic, in addition to advancing its own in-house AI research. This diversified approach allows Apple to evaluate a variety of technologies, balancing factors such as performance, speed, and privacy. By doing so, Apple ensures that it selects the most effective solutions to drive Siri's evolution forward. Revamping Siri's architecture is a complex and ambitious undertaking. Apple's focus is on improving personalization, context awareness, and deeper app integration to create a more intelligent and responsive assistant.
However, these lofty goals have led to delays in the project timeline. The revamped Siri is now expected to debut in spring 2026, coinciding with the release of iOS 26.4. Apple's cautious approach reflects its dedication to delivering a product that is both reliable and secure, even as competitors in the AI space continue to push forward with their advancements. Apple's emphasis on privacy remains a defining characteristic of its approach to AI development. By prioritizing on-device processing and minimizing reliance on cloud computing, Apple ensures that user data is kept secure and private. While this privacy-first philosophy has slowed the pace of AI development compared to some competitors, it aligns with Apple's broader mission to safeguard user trust. For tasks that require cloud computing, Apple will operate within its stringent privacy framework, making sure that even the most complex processes adhere to its high standards of data protection. The upcoming version of Siri is expected to introduce several advanced features designed to enhance its usability and relevance in everyday life, aiming to position Siri as a more proactive and indispensable digital assistant, capable of anticipating user needs and delivering meaningful support. Despite its ambitious plans, Apple has faced criticism over Siri's perceived stagnation and delays in delivering promised improvements. Public dissatisfaction and legal challenges have highlighted the pressure on Apple to meet user expectations. To address these concerns, Apple has adopted a cautious and methodical approach, focusing on delivering features that are both reliable and impactful. This strategy reflects Apple's commitment to quality over speed, making sure that the revamped Siri meets the high standards expected by its users. Looking ahead, Apple's long-term vision includes replacing Google's Gemini AI with its own trillion-parameter AI model by 2026.
This in-house development is expected to further enhance Siri's capabilities while reducing Apple's reliance on external partners. The enhanced Siri, featuring advanced personalization, deeper app integration, and improved context awareness, is set to debut alongside iOS 26.4 in spring 2026. This milestone represents a significant step forward in Apple's efforts to redefine Siri's role in the rapidly evolving AI landscape.
Apple will reportedly pay Google $1 billion annually for a custom 1.2 trillion parameter Gemini AI model to power Siri's major overhaul, expected to launch in spring 2026. The partnership addresses Apple's AI development challenges while maintaining privacy through Apple's own servers.
Apple is reportedly finalizing a landmark deal with Google that will see the iPhone maker pay approximately $1 billion annually for a custom version of Google's Gemini AI model to power the next generation of Siri [1]. The partnership, expected to debut with iOS 26.4 in spring 2026, represents a significant shift in Apple's approach to artificial intelligence development and marks one of the largest AI licensing deals in the industry.
The custom Gemini model will feature 1.2 trillion parameters, significantly dwarfing Apple's current AI capabilities [3]. Apple reportedly evaluated multiple AI providers, including Anthropic, before settling on Google's offering. The decision was partly financial: while Google's solution costs $1 billion annually, Anthropic's alternative would have required $1.5 billion per year [1].
Despite the substantial collaboration, Apple has no plans to publicly acknowledge Google's role in powering Siri's enhanced capabilities [2]. Internally, the company refers to the Gemini-based system as "AFM v10," short for "Apple Foundation Model version 10," a deliberate naming choice designed to obscure the partnership and avoid confusing staff and customers.

This secretive approach aligns with Apple's preference to present AI capabilities as part of its own ecosystem, even when relying heavily on external technology. The company aims to maintain its brand image of technological self-sufficiency while addressing urgent competitive pressures in the AI space [2].

To address privacy concerns, Apple will run the Gemini-based system on its own Private Cloud Compute servers, ensuring user data remains separate from Google's systems [1]. Apple's own AI models will continue handling personal data processing on individual devices, while the more powerful Gemini model will manage complex server-side tasks.

This hybrid approach represents Apple's attempt to balance computational power with its privacy-first philosophy. The arrangement allows Apple to deliver advanced AI capabilities without compromising its commitment to user data protection, a key differentiator in the competitive smartphone market [5].
The partnership highlights Apple's acknowledgment that it has fallen behind in AI development despite being one of the world's most valuable companies [4]. Industry observers view the deal as both pragmatic and concerning: while it addresses immediate competitive needs, it also signals Apple's dependence on external AI technology for core features.

The arrangement builds upon Apple and Google's existing relationship, in which Google already pays Apple approximately $20 billion annually to remain the default search engine on Apple devices [1]. This financial interdependence between the two tech giants continues despite their competition in various market segments.

Apple CEO Tim Cook has maintained that the company prefers to be "not first, but best" in technology adoption, but the delayed rollout of promised AI features has created pressure to accelerate development timelines [3].

Summarized by Navi