11 Sources
[1]
In Google earnings, analysts want answers on Apple's Siri-Gemini deal
Alphabet reports earnings Wednesday, and investors will be looking for more details on the company's deal with Apple to revamp the Siri virtual assistant with Google's Gemini artificial intelligence technology. While Wall Street is expecting Google to report a 15% year-over-year increase in its fourth-quarter revenue, much of the attention during the company's earnings call will be on any new details about the Siri-Gemini deal, analysts told CNBC. The earnings on Wednesday will be the first time Alphabet's leadership addresses shareholders since Apple in January announced that it had chosen Gemini as the AI technology to power the company's Siri overhaul this year.

The deal is one of the most prominent yet for Gemini, and the scale of Apple's user base -- 2.5 billion active devices -- is important for Google, even if the search giant doesn't get specific user data, analysts said. "They'll have critical mass, and even if they're not going to get consumer information, maybe they'll be able to see what queries are being asked, which could help Google train its AI models," said Gil Luria, managing director at technology research firm D.A. Davidson.

However, there are still many unanswered questions about what the partnership entails. What the companies have said is that the multiyear partnership will lean on Google's Gemini and cloud technology for future Apple Foundation Models. A joint statement from the companies said Apple had "determined that Google's AI technology provides the most capable foundation for Apple Foundation Models," adding that the company was "excited about the innovative new experiences" that Gemini "will unlock for Apple users." The companies said the Apple Intelligence suite of features will continue to run on Apple devices and the iPhone maker's private cloud. Apple Intelligence includes Siri as well as Apple's AI writing tools and Genmoji, the company's emoji creation feature.

Apple CEO Tim Cook discussed the partnership with CNBC's Steve Kovach last week when the company reported its latest quarterly earnings. Cook said the Gemini-powered Siri will be personalized for users, but he added that it won't know users' "Gmail and stuff like that." Cook added that Siri will "know a lot" and said Apple plans to share more details when the updated Siri is released.
[2]
Google suggests Gemini-powered Siri will run on Google's servers - 9to5Mac
During Alphabet's Q4 2025 earnings call, CEO Sundar Pichai added to the confusion regarding where, exactly, the upcoming Gemini-powered Siri will run. Here's what he said.

Since Apple confirmed that Google's Gemini would power new Siri features, there has been a lingering question about the privacy aspects of what Tim Cook refers to as a "collaboration" between the two companies. While many assumed that Google would have access to user data, Apple vaguely countered this notion with its usual privacy-first speech. Here's Apple's original statement on the collaboration: "After careful evaluation, we determined that Google's technology provides the most capable foundation for Apple Foundation Models and we're excited about the innovative new experiences it will unlock for our users."

Currently, Apple's foundation models run either on-device or on Private Cloud Compute (PCC), Apple's cloud AI infrastructure that maintains user privacy when data needs to be uploaded for inference that goes beyond what on-device models can deliver. While Apple's statements made it seem like the Gemini-powered Siri would run on its own infrastructure, Bloomberg reported a few days later that this would likely not be the case: "In a potential policy shift for Apple, the two partners are discussing hosting the chatbot directly on Google servers running powerful chips known as TPUs, or tensor processing units. The more immediate Siri update, in contrast, will operate on Apple's own Private Cloud Compute servers, which rely on high-end Mac chips for processing."

A few days after that, during Apple's fiscal Q1 2026 earnings call, Tim Cook volunteered the following information when analyst Ben Reitzes asked how Apple had decided to partner with Google, and if there was "an opportunity (...) to share in revenue too": "Yeah, we basically determined that Google's AI technology would provide the most capable foundation for Apple Foundation Models. And we believe that we can unlock a lot of experiences and innovate in a key way due to collaboration. We'll continue to run on the device, and run in Private Cloud Compute, and maintain our industry-leading privacy standards in doing so. In terms of the arrangement with Google, we're not releasing the details of that."

This seemed to reaffirm (albeit vaguely) the notion that the Gemini-powered Siri would run on Apple's infrastructure, and that Cook was simply refusing to discuss the financial aspects of the partnership. Which brings us to today. During Alphabet's Q4 2025 earnings call moments ago, CEO Sundar Pichai's prepared remarks included the following statement: "We are collaborating with Apple as their preferred cloud provider and to develop the next generation of Apple Foundation Models, based on Gemini technology." A few moments later, Chief Business Officer Philipp Schindler made an almost identical statement during his own prepared remarks: "I would start by joining Sundar in how pleased I am that we are collaborating with Apple as their preferred cloud provider and to develop the next generation of Apple Foundation Models, based on Gemini technology."

Putting all the recent vague statements together, it increasingly looks like the new Siri will rely on Google's infrastructure in some form. So far, neither Apple nor Google has given a definitive answer on where exactly the Gemini-powered Siri will run, so there must be a reason for that. It could be because, per Bloomberg's report, the companies are still ironing out the details, and the rollout may be phased, with just "the more immediate Siri update" running on Apple's PCC.
Earlier tonight, Bloomberg's Mark Gurman doubled down on his previous report, adding that Apple might be referring to Siri and Apple Intelligence as separate systems running on separate infrastructures. Be that as it may, it is clear that neither company is ready to directly address the technical aspects of the so-called collaboration. Every statement about where the new Siri will run has been vague and conflated with seemingly unrelated aspects and issues. As things stand, it increasingly appears that the Gemini-powered Siri will run on Google's infrastructure, despite both companies' refusal to actually confirm that. Whether Apple will address the privacy aspects of the collaboration when it officially announces the Gemini-powered Siri remains to be seen.
[3]
No, Google didn't just admit that Siri will run on its servers
But Sundar Pichai's alarming-sounding statement did create some unnecessary confusion. Ever since Apple and Google jointly announced a partnership to base the Siri voice assistant on Gemini tech, questions have been raised about user privacy. Apple has done its part to offer reassurance by insisting that Gemini-powered Siri will continue to run on-device or on the company's own Private Cloud Compute (PCC) servers, rather than on infrastructure owned by Google. But doubts remain.

At its triumphant Q1 earnings call late last month, CEO Tim Cook responded to a question about the Gemini partnership by saying, in part: "We'll continue to run on the device, and run in PCC, and maintain our industry-leading privacy standards in doing so. In terms of the arrangement with Google, we're not releasing the details of that." As 9to5Mac points out, that's worded in such a way that it could mean other Apple Intelligence features will be hosted in the way described, whereas the new version of Siri will be handled in some other way. I don't think he was saying that (the "arrangement with Google" almost certainly refers to the financial aspect of the question), but it's one possible, albeit tenuous, interpretation.

Those doubts resurfaced on Wednesday following another earnings call, this time by Google. The company's CEO, Sundar Pichai, mentioned the partnership in his remarks in the following way: "I'm pleased that we are collaborating with Apple as [its] preferred cloud provider and to develop the next generation of Apple Foundation Models, based on Gemini technology." Hearing Google refer to itself as Apple's "preferred cloud provider" while discussing the Gemini partnership is undoubtedly alarming, and I don't blame pundits for drawing negative conclusions. One news site (9to5Mac, link above) interpreted this as Google "suggesting" Gemini-powered Siri will run on its servers; another (AppleInsider) said Apple and Google's statements were "seemingly contradictory." But once again, Pichai's statement contains vague implications rather than a clear meaning, and I don't think we should take this as a direct contradiction of Apple's privacy claims, something which would in effect be a declaration of PR war.

Much like Cook's statement, there are multiple interpretations of what Pichai said. One is that Apple was lying through its teeth and Gemini-powered Siri will run on Google servers in a way that gives Google access to user data. Another is that Google is simply going to lease server hardware to Apple as part of the arrangement. They will be Apple Park servers running Apple software, but the hardware is "provided" by Google. That would likely fulfil Apple's privacy pledge, even if the optics might not be great. And another is that Pichai was, clumsily and at the worst possible time, referring to the fact that Google already provides extensive cloud services to Apple in other areas, and that this latest partnership is merely one more example of the two tech giants working fruitfully together.

In other words, the whole thing is frustratingly vague. What we have at the moment is a partnership with many details still to be decided, and the two partners have very different priorities in terms of public messaging. Apple wants to reassure its users at every point that their privacy will be safeguarded, while Google just wants everyone to know that its Gemini AI platform is more important and successful than ChatGPT. So the result is a lot of statements that differ in implication but don't actually say very much.
There is a fourth option, proposed by Bloomberg's Mark Gurman. This is that Gemini-powered Siri will run, as Apple has consistently said, on-device or on Apple servers, but that another, more radical version of the voice assistant, a chatbot codenamed Campos, will launch later and run on Google servers. That's going to be a tough sell, assuming Gurman is correct (and he says only that the two companies are discussing the idea). But Apple will at least have some more time to make the case to its users.
[4]
Apple Explains How Gemini-Powered Siri Will Work
Apple CEO Tim Cook yesterday reiterated the structure of the company's partnership with Google to use Gemini AI models for the next-generation version of Siri. During the company's Q1 2026 earnings call yesterday, Cook and CFO Kevan Parekh were asked several questions about Apple Intelligence and the company's recently announced deal with Google to power the personalized version of Siri using Gemini.

"We basically determined that Google's AI technology would provide the most capable foundation for AFM (Apple Foundation Models), and we believe that we can unlock a lot of experiences and innovate in a key way due to the collaboration. We'll continue to run on the device and run in Private Cloud Compute and maintain our industry-leading privacy standards in doing so. In terms of the arrangement with Google, we're not releasing the details of that."

That description closely matches language from Apple and Google's earlier joint announcement, which said that Apple Intelligence would continue to operate on Apple hardware and Private Cloud Compute. Cook also addressed Apple's own artificial intelligence development efforts, noting that the company continues to build its own technology alongside the Gemini partnership, but clarified that those efforts do not replace Google's role in the personalized Siri system.

"You should think of it as a collaboration. And we'll obviously independently continue to do some of our own stuff, but you should think of what is going to power the personalized version of Siri as a collaboration with Google."

When asked about monetization and return on investment, Cook framed Apple Intelligence as a feature integrated across Apple's platforms rather than a discrete revenue driver.

"We're bringing intelligence to more of what people love and we're integrating it across the operating system in a personal and private way, and I think that by doing so, it creates great value, and that opens up a range of opportunities across our products and services. And we're very happy with the collaboration with Google as well, I should add."

Neither Cook nor Parekh disclosed how many users currently have access to Apple Intelligence features or whether those capabilities are driving hardware upgrades. Apple previously acknowledged that Apple Intelligence is limited to devices with sufficient memory and processing capacity, which constrains availability somewhat.
[5]
You could get a much smarter Siri on your phone this year
Let's be honest: for a long time, Siri has been the "dumbest" smart assistant in the room. We all have that shared experience of asking Siri a simple question, only to be met with "Here's what I found on the web," or worse, total misunderstanding. It has been years of incremental, barely noticeable updates while competitors raced ahead. But if recent reports are accurate, Apple is finally ready to stop the bleeding and give Siri the brain transplant it desperately needs - and they are doing it with help from an unlikely source.

The "Frenemy" Collaboration

In a move that likely raised eyebrows across Silicon Valley, Apple is reportedly teaming up with its biggest rival, Google. The company has confirmed a multi-year partnership to power the next generation of Apple Intelligence using Google's Gemini AI models. It is a rare moment of pragmatism from Apple. They seem to have realized that their siloed, "we-do-everything-in-house" approach wasn't working fast enough in the generative AI arms race. By leveraging Gemini's existing cloud infrastructure and advanced models, Apple is effectively hitting the turbo button to catch up with the rest of the industry.

Phase One: Spring Cleaning for Siri

We won't have to wait long to see if this gamble pays off. The first wave of these changes is expected to arrive as early as this spring with iOS 26.4. Currently slated for beta testing in February, this update isn't going to turn Siri into a full-blown chatbot just yet, but it is going to make it significantly less frustrating. Think of this as the "context" update. The goal here is to fix Siri's notorious inability to understand what is actually happening on your screen or in your life. Instead of just being a glorified timer-setter, the Gemini-powered Siri will reportedly have better "on-screen awareness" and deeper control within apps. It means Siri might finally understand that when you ask about "that email," you are referring to the one currently open on your screen. This hybrid approach will run on Apple's Private Cloud Compute system, aiming to balance Google's brainpower with Apple's obsession with user privacy.

Phase Two: The Chatbot Era

The real revolution, however, is being saved for later in the year. Apple is reportedly working on a second, much more ambitious phase likely to debut around iOS 27 -- potentially showcased at WWDC later in 2026. This is where Siri is expected to transform into a true conversationalist, capable of the kind of back-and-forth dialogue we see today with ChatGPT or Gemini 3. Imagine a Siri that doesn't just answer a question but remembers the context five minutes later, proactively suggests tasks based on your habits, and handles complex, multi-step requests without stumbling. This isn't just a patch; it's a reimagining of the assistant as a genuine digital companion rather than a voice-activated remote control.

For iPhone and Mac users, this overhaul is long overdue. The landscape of AI has shifted dramatically, and simple command-and-control voice assistants feel archaic compared to modern generative models. By swallowing its pride and partnering with Google, Apple is signaling that it is finally serious about making Siri useful again. If they pull this off, 2026 might be the year we finally stop shouting at our phones in frustration and start actually conversing with them.
[6]
Apple confirms Gemini-powered Siri will use Private Cloud Compute - 9to5Mac
During today's fiscal Q1 2026 earnings call, Apple CEO Tim Cook and CFO Kevan Parekh fielded questions, albeit vaguely, regarding the recent deal to have Gemini power the next-gen Siri. Here's what they said.

Today's fiscal Q1 2026 conference call had two main themes: memory constraints, and AI. And while some questions focused on how Apple plans to monetize Apple Intelligence, others pressed executives on the technical and business aspects of the 'collaboration,' as Cook repeatedly put it. As expected, Apple declined to provide any specific details as to the terms of the deal.

Still, Cook did share one piece of information that cleared up some of the confusion that followed Apple and Google's initial joint announcement, as to where exactly the models would run: "We believe that we can unlock a lot of experiences and innovate in a key way due to the collaboration. We'll continue to run on the device and run in Private Cloud Compute, and maintain our industry-leading privacy standards in doing so."

Interestingly, Cook also mentioned that Apple will "obviously independently continue to do some of [its] own stuff," but made it clear that Apple's own in-house developments will not affect the Gemini deal: "The personalized version of Siri is a collaboration with Google."

As to the question regarding Apple's expected return on investment when it comes to Apple Intelligence, Cook said: "Well, let me just say that we're bringing intelligence to more of what people love and we're integrating it across the operating system in a personal and private way. And I think that by doing so, it creates great value and that opens up a range of opportunities across our products and services. (...) And we're very happy with the collaboration with Google as well."

When asked about the percentage of iPhone users that currently have access to Apple Intelligence-powered features, and how much that is driving sales and upgrades, Cook and Parekh declined to comment.
[7]
Alphabet details Gemini-Siri partnership with Apple on earnings call
Alphabet executives will discuss their new artificial intelligence partnership with Apple during the company's fourth-quarter earnings call on Wednesday, following Apple's January announcement selecting Google's Gemini AI to overhaul Siri. Analysts anticipate roughly 15 percent year-over-year revenue growth for Google in the fourth quarter, but much of their attention will center on any new details about the Siri-Gemini collaboration. This marks the first occasion for Alphabet's leaders to address shareholders since the partnership's reveal.

The arrangement positions Gemini centrally within Apple's forthcoming foundation models, which underpin upcoming Apple Intelligence capabilities, including a more personalized Siri slated for release later this year. Bloomberg reporting indicates that Apple will pay approximately $1 billion annually for a tailored Gemini model featuring 1.2 trillion parameters, a substantial step up from Apple's existing 150 billion-parameter system. The multi-year agreement secures Google's advantage against rivals such as OpenAI and Anthropic. Michael Nathanson of MoffettNathanson stated to CNBC, "There was significant concern on Wall Street that Apple might lean toward OpenAI or Perplexity, but that now seems less likely." Apple opted for Google in part because Anthropic demanded about $1.5 billion per year.

The deal builds upon an established financial tie between the firms: Google pays Apple an estimated $20 billion each year to serve as the default search engine across Apple devices. This ongoing payment underscores the depth of their commercial interdependence, now extending into artificial intelligence infrastructure.

Bloomberg journalist Mark Gurman indicates Apple intends to introduce the Gemini-enhanced Siri during the latter half of February, with the update expected in the iOS 26.4 beta. The revised assistant gains improved capacity to interpret on-screen content, draws upon personal context for task execution, and generates responses in a more conversational style. Apple outlined these functionalities at WWDC 2024, though technical obstacles postponed their implementation.

During Apple's Q1 2026 earnings call last week, CEO Tim Cook referenced the partnership, affirming that "the more personalized Siri is coming this year" while confirming the collaboration with Google. Cook specified that Siri won't know users' "Gmail and stuff like that." Apple Intelligence operations persist on Apple hardware and Private Cloud Compute servers, with both companies emphasizing adherence to "industry-leading privacy standards" in these deployments.

Certain analysts raise concerns regarding Apple's growing dependence on Google's cloud systems, positing that it could enable the search company to obtain more user data over time. The deal also prompts questions about Apple's prior integration of OpenAI's ChatGPT with Siri for intricate queries. Apple informed CNBC of "no alterations to the existing agreement," though details on how the partnerships interact remain unspecified. One analyst characterized the arrangement as positioning ChatGPT in a "supporting role" on Apple platforms.
[8]
Apple confirms Gemini runs on Private Cloud Compute for privacy - Phandroid
Apple CEO Tim Cook just addressed privacy concerns about the company's Gemini partnership during Apple's fiscal Q1 2026 earnings call. With Google's AI now powering the next generation of Apple Intelligence features, some users worried about data sharing with Google. Cook made it clear that won't happen. "We believe that we can unlock a lot of experiences and innovate in a key way due to the collaboration," Cook said. "We'll continue to run on the device and run in Private Cloud Compute, and maintain our industry-leading privacy standards in doing so." The statement confirms that despite using Google's Gemini models, user data stays within Apple's ecosystem. The Apple Gemini deal reportedly costs around $1 billion per year and will power a smarter Siri coming later this year. But Apple's not compromising on privacy to make it happen. Apple Private Cloud Compute is designed to handle AI tasks too complex for on-device processing. Unlike typical cloud AI services that store conversation history and build user profiles, Apple treats each interaction as isolated. No data gets stored on Google's servers. The system anonymizes requests before any processing happens. For simpler tasks, Apple's on-device AI handles everything locally without touching the cloud. Only more complex queries that need Gemini's power get sent to Apple Private Cloud Compute. However, even then identifiers linking back to users are removed. This hybrid approach gives Apple the benefits of Google's advanced AI without the usual privacy tradeoffs. Analyst Ming-Chi Kuo calls the partnership a temporary solution while Apple builds its own AI infrastructure, but Cook's comments show the company is committed to maintaining privacy standards throughout.
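The hybrid routing described here (simple requests handled on-device, heavier requests anonymized and sent to Private Cloud Compute) can be illustrated with a short sketch. The Swift below is a minimal, purely hypothetical illustration of that flow: the types, the word-count heuristic, and the anonymization step are assumptions of this sketch, not Apple APIs or the actual gating logic.

```swift
import Foundation

// Hypothetical sketch only: these types, the complexity heuristic, and the
// identifier-stripping step are illustrative assumptions, not Apple APIs.

enum InferenceTarget {
    case onDevice
    case privateCloudCompute
}

struct AssistantRequest {
    let text: String
    let userIdentifier: String   // e.g. an account or device identifier
}

struct AnonymizedRequest {
    let text: String             // the user identifier is deliberately dropped
}

// Stand-in for whatever gating logic decides when a query exceeds what the
// on-device model can handle; here it is just a crude word-count threshold.
func chooseTarget(for request: AssistantRequest) -> InferenceTarget {
    request.text.split(separator: " ").count > 20 ? .privateCloudCompute : .onDevice
}

// Mirrors the "identifiers are removed" step described in the article.
func anonymize(_ request: AssistantRequest) -> AnonymizedRequest {
    AnonymizedRequest(text: request.text)
}

func handle(_ request: AssistantRequest) {
    switch chooseTarget(for: request) {
    case .onDevice:
        print("Handled locally by the on-device model: \(request.text)")
    case .privateCloudCompute:
        let payload = anonymize(request)
        print("Sent to Private Cloud Compute without a user identifier: \(payload.text)")
    }
}

handle(AssistantRequest(text: "Set a timer for ten minutes",
                        userIdentifier: "user-123"))
handle(AssistantRequest(text: "Summarize the long email thread on my screen and draft a reply that mentions the key dates and proposes a meeting for early next week",
                        userIdentifier: "user-123"))
```

In the arrangement the article describes, only the second, more complex request would ever leave the device, and even then without anything tying it back to a specific user, which is the distinction Cook's comments lean on.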
[9]
Perhaps the Gemini deal isn't the Apple AI partnership we should be focused on?
Apple may be basing the Siri revamp on Google Gemini, but internally, Apple reportedly prefers a different AI chatbot. Apple 'runs on Anthropic', the AI software company behind the consumer-facing Claude chatbot, according to a well-connected Apple insider. After Apple signed a wide-ranging agreement with Google for Gemini to power the new-and-improved Siri assistant coming this year, new claims from Bloomberg's Mark Gurman shed light on the internal AI tools Apple uses for product development tasks.

In an interview with TPBN (via 9to5Mac), Gurman revealed Apple has deployed custom versions of the Claude chatbot across its internal servers. Interestingly, Gurman says, Apple was so enamoured with Claude, it was initially planning to base the Siri revamp on Anthropic's platform. However, Google was cheaper. Much cheaper. Gurman says: "Apple runs on Anthropic at this point. Anthropic is powering a lot of the stuff Apple's doing internally in terms of product development and a lot of their internal tools." "They have custom versions of Claude running on their own servers internally, too. This Google deal just came together a few months ago. They were not going to use Google. Apple actually was going to rebuild Siri around Claude. But Anthropic was holding them over a barrel. They wanted a ton of money from them, several billion dollars a year, and at a price that doubled on an annual basis for the next three years." You can watch the clip in full below...

Apple and Google announced their alliance to power the new conversational Siri in January and we may see it previewed as soon as this February. The statement read: "Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalised Siri coming this year. "After careful evaluation, Apple determined that Google's AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users. Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards."
[10]
Google Creates A Privacy Conundrum By Declaring It Has Become Apple's "Preferred Cloud Provider"
Apple launched its Private Cloud Compute framework with much fanfare back in 2024, placing privacy front and center in its overarching AI strategy. However, this critical differentiating factor - one that clinched plaudits from consumers - is now seemingly under existential threat, as per the latest comments emerging from Google's earnings call today.

We already know that Apple is gearing up to launch a revamped version of Siri, likely with the upcoming iOS 26.4 update, bringing the much-delayed in-app actions, personal context awareness, and on-screen awareness to its bespoke voice assistant, enabling a wide variety of agentic actions across apps, based on personal data and on-screen content. To do so, Apple is planning to deploy a gigantic 1.2-trillion-parameter custom Gemini AI model on its cloud servers to power AI features under the ambit of its Private Apple Intelligence - where relatively simple AI tasks would be performed by using on-device models and the computational resources of the device itself, while the more complex tasks would be offloaded to Apple's private cloud servers using encrypted and stateless data for subsequent inference. This arrangement maintains Apple's privacy credentials by relying heavily on the iPhone maker's Private Cloud Compute framework.

However, the prolific tipster, Mark Gurman, recently unveiled the first chink in Apple's privacy armor by disclosing that Apple is planning to launch a dedicated Siri chatbot that will run on Google's own TPUs and cloud infrastructure, possibly leased by Apple. According to Gurman, the Siri chatbot will launch with iOS 27 as a baked-in solution rather than debuting in the form of a standalone app, allowing the new Siri to search the web, generate content, including images, provide coding assistance, summarize and analyze information, as well as upload files. It will be able to use personal data to complete tasks and sport a substantially improved search feature. Apple is also designing a feature that will let the Siri chatbot view open windows and on-screen content, as well as adjust device features and settings.

For understandable reasons, Gurman's report created some consternation among privacy-valuing tech enthusiasts. Even so, it was still possible to sweep this tidbit under the proverbial rug, given its unofficial status. Now, however, comments within Google's latest earnings call appear to be hinting at a much more expansive role for Google's cloud infrastructure within Apple's AI strategy. For one, Google has apparently become Apple's "preferred cloud provider," which is a term that goes well beyond merely hosting iCloud data.

Of course, Mark Gurman has attempted to explain away the inherent dichotomy between valuing privacy and then overly relying on a third-party cloud service provider by theorizing that Apple might leverage its Private Cloud Compute framework for on-device Apple Foundation models and other AI tasks, while relying on Google's infrastructure for Siri-related inferences. However, as the new Siri gains overarching importance with iOS 27, possibly relegating Apple Foundation models to an afterthought, one could argue that Apple's Private Cloud Compute framework is itself falling victim to expediency.
[11]
Siri's Big Upgrade: Apple and Google's AI Partnership Explained
Apple has taken a significant step forward in the realm of artificial intelligence by partnering with Google to enhance Siri through the integration of the Gemini AI model. This collaboration is designed to address long-standing criticisms of Siri's limited functionality while maintaining Apple's steadfast commitment to user privacy. By using Google's advanced AI technology, Apple aims to transform Siri into a more intelligent, versatile, and user-friendly virtual assistant capable of meeting the evolving demands of modern users. The video below from GregsGadgets gives us more details on what Apple has planned for Siri.

Siri, once a pioneer in the virtual assistant space, has faced increasing challenges in keeping up with competitors like Google Assistant and Amazon Alexa. Despite its early success, Siri's capabilities have often been criticized for lagging behind industry standards. Apple's strong focus on privacy, while commendable, has historically limited its ability to develop AI models as sophisticated as those of its rivals. This partnership with Google represents a pragmatic shift, allowing Apple to integrate innovative AI technology while preserving its core privacy principles. By addressing these limitations, Apple is positioning Siri to reclaim its status as a leader in virtual assistance.

At the heart of this collaboration lies Google's Gemini AI model, a state-of-the-art technology known for its advanced natural language processing and contextual understanding. Gemini enables Siri to perform more complex tasks with greater accuracy and efficiency. For example, users will be able to ask Siri to analyze an email, extract key dates, and update their calendar -- all in a single command. This level of multi-app integration aligns with Apple's vision of creating a virtual assistant that not only understands user needs but also acts on them seamlessly. The Gemini AI model also enhances Siri's ability to interpret nuanced language, making interactions more conversational and intuitive. This improvement is expected to significantly enhance the user experience, making Siri a more proactive and reliable tool in daily life.

The upgraded Siri introduces a range of new features designed to improve its functionality and usability, shifting it from a reactive assistant to a proactive tool that anticipates and addresses user needs efficiently.

Despite incorporating Google's AI technology, Apple remains unwavering in its dedication to user privacy. Siri will not share personal data with Google or use it to train external AI models. All data processing will occur securely on-device or within Apple's ecosystem, making sure that user information remains private and under their control. This privacy-first approach continues to set Apple apart in an industry often scrutinized for data misuse. By maintaining these stringent privacy standards, Apple reinforces its reputation as a trusted technology provider.

While this collaboration marks a significant milestone, it also highlights some challenges for Apple. The reliance on Google's AI technology underscores Apple's ongoing difficulties in independently developing competitive AI models. Additionally, balancing the integration of advanced AI capabilities with strict privacy standards presents a complex technical challenge.
However, this partnership demonstrates Apple's willingness to adapt and prioritize user experience, even if it requires seeking external expertise. By embracing this collaborative approach, Apple is taking a bold step toward redefining Siri's role in the AI landscape. For users, these advancements translate to a smarter, more capable Siri that can handle a broader range of tasks with greater efficiency. Whether it's managing schedules, integrating health data, or providing contextual assistance, the enhanced Siri promises to make daily interactions with technology more seamless and intuitive. Importantly, these improvements come without compromising the privacy and security that Apple users have come to expect. The integration of Gemini AI ensures that Siri remains a reliable and secure assistant, tailored to meet the needs of its users. The beta release of the upgraded Siri is anticipated in the coming months, signaling the beginning of a new era for Apple's virtual assistant. Beyond the initial enhancements, Apple plans to deepen the integration of AI across its ecosystem. Future applications could include personalized health insights, proactive reminders, and even predictive assistance based on data from Apple's Health app. These developments aim to position Siri as a central component of Apple's ecosystem, offering users a more connected and intelligent experience. As Apple continues to refine and expand Siri's capabilities, the collaboration with Google serves as a testament to its commitment to innovation. By combining advanced AI technology with an unwavering focus on privacy, Apple is setting a new standard for virtual assistants. The enhanced Siri is poised to not only meet but exceed user expectations, paving the way for a smarter, more secure, and more intuitive digital future.
Apple and Google announced a multiyear partnership to power Siri with Gemini AI technology, but conflicting statements from CEOs Tim Cook and Sundar Pichai have created confusion about where the assistant will run and what it means for user privacy. With 2.5 billion active Apple devices at stake, analysts are demanding clarity on server infrastructure, data access, and the technical details both companies have been reluctant to share.
The Apple-Google partnership to integrate Gemini AI into Siri has become one of the most scrutinized AI collaborations in recent memory, with investors and analysts demanding answers about technical implementation and user privacy. During Alphabet's Q4 2025 earnings call, the first since the partnership announcement, CEO Sundar Pichai described Google as Apple's "preferred cloud provider" for developing Apple Foundation Models based on Gemini technology [2]. This statement immediately sparked debate about whether Gemini-powered Siri would run on Google's servers, contradicting Apple's earlier assurances about maintaining its privacy standards through on-device processing and Private Cloud Compute [1].
Source: Wccftech
Tim Cook addressed the partnership structure during Apple's fiscal Q1 2026 earnings call, stating that Apple determined "Google's AI technology would provide the most capable foundation for Apple Foundation Models" and emphasizing that the company would "continue to run on the device and run in Private Cloud Compute and maintain our industry-leading privacy standards" [4]. However, neither Cook nor Pichai has provided definitive answers about the server infrastructure, producing what some commentators describe as "seemingly contradictory" statements from the two executives.

The partnership represents a significant shift for Apple Intelligence, which encompasses Siri along with AI writing tools and Genmoji features. With 2.5 billion active Apple devices in the ecosystem, the deal gives Google unprecedented scale for its Gemini technology, even if the search giant doesn't receive direct access to user data [1]. Gil Luria, managing director at D.A. Davidson, noted that Google could gain "critical mass" and potentially observe query patterns to train its AI models, providing valuable insights without compromising individual privacy [1].
Source: Macworld
During the earnings call, analysts sought clarity on monetization and return on investment for the multiyear partnership. Cook framed Apple Intelligence as a feature integrated across Apple's platforms rather than a discrete revenue driver, saying that bringing intelligence to more of what people love "creates great value" and "opens up a range of opportunities across our products and services" [4]. Neither company disclosed financial details of the arrangement, leaving Wall Street to speculate about revenue-sharing models and the true cost of the collaboration [1].

The vague statements from both companies have fueled speculation about a phased rollout with different privacy implementations. Bloomberg reported that the companies are discussing hosting conversational chatbot features directly on Google servers running tensor processing units, while "the more immediate Siri update" would operate on Apple's own Private Cloud Compute servers [2]. This suggests Apple might be treating Siri and Apple Intelligence as separate systems with distinct server infrastructures.
Source: Geeky Gadgets
The first phase of changes is expected to arrive with iOS 26.4, currently slated for beta testing in February, focusing on improving Siri's on-screen awareness and in-app control [5]. This hybrid approach aims to balance Google's generative AI capabilities with Apple's commitment to privacy. A more ambitious second phase, potentially debuting around iOS 27 and showcased at WWDC later in 2026, would transform Siri into a true conversational digital assistant capable of complex, multi-step requests and contextual dialogue [5].
The technical details both companies refuse to confirm suggest ongoing negotiations about infrastructure and data handling. Cook told CNBC that the Gemini-powered Siri will be personalized for users but won't know users' "Gmail and stuff like that," promising to share more details when the updated assistant launches [1]. Whether Google's role as a cloud provider means leasing hardware to Apple or actually processing user data on its own server infrastructure remains the central unanswered question.

This partnership marks a rare moment of pragmatism from Apple, which typically maintains a "we-do-everything-in-house" approach to product development [5]. By leveraging Gemini's existing infrastructure and advanced AI models, Apple is accelerating its efforts to overhaul Siri's capabilities and compete with ChatGPT and other generative AI assistants. The success of this collaboration will depend on whether Apple can maintain user trust while delivering the conversational intelligence iPhone and Mac users have been demanding for years.