2 Sources
[1]
Why Apple's iOS 26.4 Siri Upgrade Will Be Bigger Than Originally Promised
In the iOS 26.4 update coming this spring, Apple will introduce a new version of Siri that overhauls how we interact with the personal assistant and what it's able to do. The iOS 26.4 version of Siri won't work like ChatGPT or Claude, but it will rely on large language models (LLMs) and has been rebuilt from the ground up. The next-generation Siri will use advanced LLMs similar to those behind ChatGPT, Claude, and Gemini. Apple isn't implementing full chatbot interactions, but any upgrade is both better than what's available now and long overdue.

Right now, Siri uses machine learning, but it doesn't have the reasoning capabilities that LLMs impart. Siri relies on multiple task-specific models to complete a request, going from one step to another: it has to determine the intent of a request, pull out relevant information (a time, an event, a name, etc.), and then use APIs or apps to complete the request. It's not an all-in-one system. In iOS 26.4, Siri will have an LLM core that everything else is built around. Instead of just translating voice to text and looking for keywords to execute on, Siri will actually understand the specifics of what a user is asking and use reasoning to get it done.

Siri today is usually fine for simple tasks like setting a timer or alarm, sending a text message, toggling a smart home device on or off, answering a simple question, or controlling a device function. But it doesn't understand anything more complicated: it can't complete multi-step tasks, it can't interpret wording that isn't in the structure it wants, it has no personal context, and it doesn't support follow-up questions. An LLM should solve most of those problems because Siri will have something akin to a brain. LLMs can understand the nuance of a request, suss out what someone actually wants, and take the steps to deliver that information or complete the requested action.

We already know some of what LLM Siri will be able to do because Apple described the Apple Intelligence features it wants to implement when iOS 18 debuted. Apple described three specific ways that Siri will improve: personal context, the ability to see what's on the screen to know what the user is talking about, and the capability to do more in and between apps. Siri will understand pronouns and references to content on the screen and in apps, and it will have a short-term memory for follow-up requests.

With personal context, Siri will be able to keep track of emails, messages, files, photos, and more, learning about you to help you complete tasks and keep track of what you've been sent. Onscreen awareness will let Siri see what's on your screen and complete actions involving whatever you're looking at. If someone texts you an address, for example, you can tell Siri to add it to their contact card. Or if you're looking at a photo and want to send it to someone, you can ask Siri to do it for you. Deeper app integration means that Siri will be able to do more in and across apps, performing actions and completing tasks that just aren't possible with the personal assistant right now. We don't have a full picture of what Siri will be capable of, but Apple has provided a few examples of what to expect.

In an all-hands meeting in August 2025, Apple software engineering chief Craig Federighi explained the Siri debacle to employees. Apple had attempted to merge two separate systems, which didn't work out.
There was one system for handling current commands and another based on large language models, and the hybrid approach was not working within the confines of the current Siri architecture. The only way forward was to move to a second-generation architecture built around a large language model. In the August meeting, Federighi said Apple had successfully revamped Siri, and that it would be able to introduce a bigger upgrade than it promised in iOS 18. "The work we've done on this end-to-end revamp of Siri has given us the results we needed," Federighi told employees. "This has put us in a position to not just deliver what we announced, but to deliver a much bigger upgrade than we envisioned."

Part of Apple's problem was that it was relying on AI models built in-house, which were not able to match the capabilities of competitors. Apple started considering a third-party model for Siri and other future AI features shortly after delaying Siri, and in January, Apple announced a multi-year partnership with Google. For the foreseeable future, Apple's AI features, including the more personalized version of Siri, will use a custom model Apple built in collaboration with Google's Gemini team. Apple plans to continue work on its own in-house models, but for now, it will rely on Gemini for many public-facing features.

Siri in iOS 26.4 will be more similar to Google Gemini than to Siri today, though without full chatbot capabilities. Apple plans to continue running some features on-device and to use Private Cloud Compute to maintain privacy. Apple will keep personal data on-device, anonymize requests, and continue to allow AI features to be disabled. Siri is not going to work as a chatbot, so the updated version will not feature long-term memory or back-and-forth conversations, and Apple plans to keep the same voice-based interface with limited typing functionality.

In what became an infamous move, Apple went all-in showing off a smarter, Apple Intelligence-powered version of Siri when it introduced iOS 18 at the 2024 Worldwide Developers Conference. Apple said these features would come in an update to iOS 18, but right around when launch was expected, Apple admitted that Siri wasn't ready and would be delayed until spring 2026. Apple executives went on a press tour after WWDC 2025 to explain the Siri shortcomings, promising bigger and better things for iOS 26 and explaining what went wrong. The Apple Intelligence Siri features shown at WWDC 2024 were actually implemented and weren't faked, but Siri wasn't working as well as expected behind the scenes, and Apple was dealing with quality issues.

Since Apple advertised the new Siri features with the iPhone 16, some people who bought the iPhone because of the new functionality were upset about the delay and sued. Apple was able to quietly settle the case in December 2025, so most of the Siri snafu has been resolved.

The misstep with Siri's debut and the failure of the hybrid architecture led Apple to restructure its entire AI team. Apple AI chief John Giannandrea was removed from the Siri leadership team, with Vision Pro chief Mike Rockwell taking over instead. Apple CEO Tim Cook was no longer confident in Giannandrea's ability to oversee product development, and Giannandrea is set to retire in spring 2026. Rockwell reports to Federighi, and Federighi told employees that the new leadership has "supercharged" Siri development.
Federighi has apparently played an instrumental role in changing Apple's approach to AI, and he is making the decisions that will allow the company to catch up to rivals. Apple has struggled to retain AI employees amid the Siri issues and aggressive recruiting from companies like Meta, which poached several key AI engineers from Apple with pay packages as high as $200 million. At Apple's August all-hands meeting, Cook and Federighi aimed to reassure employees that AI is critically important to the company. "There is no project people are taking more seriously," Federighi said of Siri. Cook said that Apple will "make the investment" to be a leader in AI.

Apple has promised that the new version of Siri is coming in spring 2026, which is when we're expecting iOS 26.4. Testing on iOS 26.4 should begin in late February or early March, with a launch to follow around April. The new version of Siri will presumably run on all devices that support Apple Intelligence, though Apple hasn't explicitly provided details, and some new Siri capabilities may come to older devices as well.

Apple plans to upgrade Siri even further in the iOS 27 update, turning Siri into a chatbot. Siri will work like Claude or ChatGPT, able to understand and engage in back-and-forth conversation. Details about the Siri interface and how a chatbot version of Siri will work are still in short supply, but iOS 26.4 will be a step on the path to a version of Siri that can actually function like products from Anthropic and OpenAI.
[2]
Siri Finally Gets a Brain: How iOS 26.4 Brings Google Gemini AI to Your iPhone
Apple is preparing to redefine the capabilities of Siri with the integration of Google's Gemini AI, a significant advancement in conversational artificial intelligence. Set to debut with iOS 26.4 in early 2026, this update is expected to make Siri smarter, more intuitive, and capable of managing complex tasks. By combining advanced natural language processing with Apple's steadfast commitment to privacy, this development could transform how you interact with your devices on a daily basis. The video below from Matt Talks Tech gives us more details about iOS 26.4 and the new Siri powered by Gemini.

At the heart of this upgrade lies Gemini AI, Google's advanced language model designed to excel at understanding nuanced queries, maintaining context throughout extended interactions, and executing multi-step tasks. Unlike traditional AI systems, Gemini AI brings a level of conversational depth and functionality that has the potential to elevate Siri from a basic voice assistant to a dynamic, human-like digital companion. For instance, instead of issuing a simple command like "set a reminder," you could say, "Remind me to call Sarah about the budget report after my 3 PM meeting." Powered by Gemini AI, Siri would not only understand the context but also cross-reference your schedule and set the reminder accordingly. This enhanced functionality aims to make your interactions with Siri more seamless, intuitive, and productive.

While the collaboration between Apple and Google may raise questions, Apple remains unwavering in its commitment to user privacy. The integration of Gemini AI into Siri is designed to enhance intelligence without compromising security. Apple ensures that most tasks are processed directly on your device, reducing reliance on cloud-based systems. For scenarios requiring cloud interactions, such as syncing across devices, all data is encrypted to safeguard your privacy. This approach aligns with Apple's broader philosophy of delivering advanced technology while maintaining user trust. By prioritizing on-device processing and encryption, Apple minimizes risks such as data breaches or unauthorized access, so you can embrace these new capabilities with confidence.

The release of iOS 26.4, scheduled for April 2026, will mark the first phase of Siri's transformation. This update focuses on making Siri more context-aware and capable of understanding your intent based on the content on your screen. These enhancements signify a shift from basic voice commands to a more dynamic, context-driven assistant, making Siri an integral part of your daily workflow.

The evolution of Siri will continue with iOS 27, expected to launch in late 2026. This update will introduce even more advanced features, focusing on back-and-forth dialogue and complex, multi-step tasks. For example, you could say, "Find a time for a team meeting next week, summarize the agenda, and send invites." Siri, using Gemini AI, would not only understand the request but also execute it across multiple apps, saving you time and effort. This progression positions Siri as a proactive assistant capable of streamlining your daily activities and enhancing productivity.

As artificial intelligence becomes increasingly integrated into your life, privacy and security remain paramount. Apple's privacy-first approach ensures that the benefits of advanced AI do not come at the expense of your trust.
This dual focus on innovation and security means you can enjoy the benefits of a smarter Siri without compromising your personal information. Apple's rollout of these updates will follow a structured timeline to ensure a smooth transition and optimal user experience. This phased approach allows Apple to introduce new features while maintaining stability and reliability, making sure that users can adapt to these advancements seamlessly.

For you, these updates mean a more capable and intuitive assistant that adapts to your needs. Whether you're managing tasks, communicating with others, or seeking information, Siri's enhanced functionality will make your interactions more efficient and enjoyable. This transformation addresses long-standing limitations, positioning Siri as a leader in the competitive landscape of AI-driven assistants. By evolving from a basic tool to a proactive, human-like assistant, Siri has the potential to redefine how you interact with technology and streamline your daily life.

The integration of Gemini AI into Siri represents a significant leap forward in conversational AI. With a focus on natural language understanding, privacy, and seamless app integration, these updates promise to make Apple devices smarter, more intuitive, and user-friendly. As iOS 26.4 and iOS 27 roll out, you can look forward to a Siri that not only understands you better but also helps you achieve more with less effort. This marks the beginning of a new era for digital assistants, where technology works seamlessly to enhance your productivity and simplify your life.
Apple is set to launch a completely rebuilt Siri in iOS 26.4 this spring, powered by Google Gemini AI through a multi-year partnership announced in January. The upgraded Siri powered by large language models will understand context, handle multi-step tasks, and deliver deeper app integration—capabilities Apple's software chief Craig Federighi says exceed original promises made in iOS 18.
Apple is preparing to introduce a fundamentally redesigned Siri with iOS 26.4 this spring, marking one of the most significant updates to the voice assistant since its debut. The upgraded, LLM-powered Siri represents a complete architectural overhaul, built around an LLM core rather than the fragmented, task-specific models that have limited Siri's functionality for years [1]. At the heart of this transformation lies an unexpected Apple and Google partnership announced in January, which will see Apple's AI features rely on a custom model built in collaboration with Google's Gemini team for the foreseeable future [1].
The decision to partner with Google Gemini came after Apple struggled with its own in-house AI models, which couldn't match competitor capabilities. During an all-hands meeting in August 2025, Apple software engineering chief Craig Federighi explained that Apple had attempted to merge two separate systems, one for handling current commands and another based on LLMs, but the hybrid approach failed within the confines of the current Siri architecture [1]. Federighi told employees that Apple had successfully revamped Siri and would "deliver a much bigger upgrade than we envisioned" beyond what was announced with iOS 18 [1].

The current version of Siri relies on machine learning but lacks the reasoning capabilities that LLMs provide. It uses multiple task-specific models to complete requests, moving from one step to another: determining intent, extracting relevant information like times or names, then using APIs or apps to execute [1]. While adequate for simple tasks like setting timers or toggling smart home devices, Siri struggles with anything more complex: it can't handle multi-step tasks, doesn't understand varied wording, lacks personal context, and doesn't support follow-up questions [1].
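To make that multi-stage architecture concrete, here is a minimal sketch of the kind of pipeline described above. Apple hasn't published Siri's internals, so every type and handler name here is hypothetical; the point is only to show why a chain of separate classifiers dead-ends on unexpected phrasing.

```swift
import Foundation

// Illustrative sketch of a classic assistant pipeline. None of these types
// are real Siri APIs; they stand in for the separate task-specific models
// the old architecture chains together.

enum Intent {
    case setTimer, sendMessage, toggleDevice, unknown
}

struct PipelineAssistant {
    // Stage 1: a dedicated classifier guesses the intent from keywords.
    func classify(_ transcript: String) -> Intent {
        if transcript.contains("timer") { return .setTimer }
        if transcript.contains("text") || transcript.contains("message") { return .sendMessage }
        if transcript.contains("turn on") || transcript.contains("turn off") { return .toggleDevice }
        return .unknown // Phrasing outside the expected structure falls through.
    }

    // Stage 2: a separate extractor pulls out slots (a time, a name, etc.).
    func extractSlots(_ transcript: String, for intent: Intent) -> [String: String] {
        [:] // In the real system this is its own model; here it's a placeholder.
    }

    // Stage 3: a hard-coded handler dispatches to the matching API.
    func handle(_ transcript: String) {
        let intent = classify(transcript)
        let slots = extractSlots(transcript, for: intent)
        switch intent {
        case .setTimer:     print("start timer for \(slots["duration"] ?? "?")")
        case .sendMessage:  print("send message to \(slots["name"] ?? "?")")
        case .toggleDevice: print("toggle \(slots["device"] ?? "?")")
        case .unknown:      print("Sorry, I didn't get that.") // No reasoning to fall back on.
        }
    }
}
```

Each stage is a separate model with no shared understanding, which is why a request phrased differently than expected ends up at `.unknown`. An LLM core replaces the classifier and extractor with a single model that reasons about the whole request.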
In iOS 26.4, Siri will function as a context-aware digital assistant with an LLM core that everything else builds around. Instead of translating voice to text and searching for keywords, Siri will understand the specifics of what users ask and use reasoning to accomplish tasks [1]. Google Gemini AI brings advanced natural language processing designed to excel at understanding nuanced queries, maintaining context throughout extended interactions, and executing multi-step tasks [2]. For instance, instead of a simple command, users could say "Remind me to call Sarah about the budget report after my 3 PM meeting," and Siri would understand the context, cross-reference the schedule, and set the reminder accordingly [2].
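A request like that plausibly decomposes into one reasoning step plus calls into existing system frameworks. The sketch below uses Apple's real EventKit framework for the calendar lookup and reminder creation, but the surrounding orchestration is an assumption about how such a flow could work, not Apple's actual implementation.

```swift
import EventKit

// Hypothetical handling of "Remind me to call Sarah about the budget report
// after my 3 PM meeting." EventKit is a real framework; treating this as the
// LLM core's plan is an illustrative assumption, not Siri's internals.
func scheduleFollowUpReminder(store: EKEventStore) async throws {
    // The reasoning step: "after my 3 PM meeting" means find today's event
    // starting at 15:00, then anchor the reminder to its end time.
    _ = try await store.requestFullAccessToEvents()
    _ = try await store.requestFullAccessToReminders()

    let calendar = Calendar.current
    let startOfDay = calendar.startOfDay(for: .now)
    let endOfDay = calendar.date(byAdding: .day, value: 1, to: startOfDay)!
    let predicate = store.predicateForEvents(withStart: startOfDay,
                                             end: endOfDay,
                                             calendars: nil)

    // Cross-reference the schedule: locate the meeting that starts at 3 PM.
    guard let meeting = store.events(matching: predicate).first(where: {
        calendar.component(.hour, from: $0.startDate) == 15
    }) else { return }

    // Set the reminder for when that meeting ends.
    let reminder = EKReminder(eventStore: store)
    reminder.title = "Call Sarah about the budget report"
    reminder.calendar = store.defaultCalendarForNewReminders()
    reminder.dueDateComponents = calendar.dateComponents(
        [.year, .month, .day, .hour, .minute], from: meeting.endDate)
    try store.save(reminder, commit: true)
}
```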
Apple outlined three specific improvements for Siri AI integration when iOS 18 debuted: personal context, onscreen awareness, and deeper app integration [1]. With personal context, Siri will track emails, messages, files, photos, and more, learning about users to help complete tasks and keep track of what they've been sent. Onscreen awareness enables Siri to see what's on the iPhone screen and complete actions involving whatever users are viewing: if someone texts an address, users can tell Siri to add it to their contact card, or ask Siri to send a photo they're currently viewing [1].

Deeper app integration means Siri will perform actions and complete tasks across apps that aren't possible with the current digital assistant. Siri will understand pronouns and references to content on screen and in apps, and maintain short-term memory for follow-up requests [1]. The iOS 26.4 release scheduled for April 2026 will mark the first phase of this transformation, focusing on making Siri more context-aware and capable of understanding user intent based on screen content [2].
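For the deeper app integration piece, the most likely surface is Apple's existing App Intents framework, which already exposes app actions to Siri and Shortcuts. Neither source confirms that LLM Siri drives apps through it, so treat the sketch below as a plausible assumption; the intent itself is modeled on the add-an-address-to-a-contact example above.

```swift
import AppIntents

// A minimal App Intent of the kind Siri could invoke for cross-app actions.
// App Intents is a real framework; whether LLM Siri routes requests through
// it is an assumption based on how Siri and Shortcuts integrate today.
struct AddAddressToContact: AppIntent {
    static var title: LocalizedStringResource = "Add Address to Contact"

    @Parameter(title: "Contact Name")
    var contactName: String

    // With onscreen awareness, Siri could fill this parameter from a
    // just-received message instead of asking the user to dictate it.
    @Parameter(title: "Address")
    var address: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // An app would update its own contact record here.
        .result(dialog: "Added \(address) to \(contactName)'s contact card.")
    }
}
```

An app that declares intents like this lets the assistant discover and chain its actions, which is what the cross-app task execution described above would build on.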
While the collaboration between Apple and Google may raise questions about user privacy, Apple maintains its commitment to security through on-device processing and encryption [2]. Most tasks are processed directly on the iPhone, reducing reliance on cloud-based systems, and for scenarios requiring cloud interactions, such as syncing across devices, all data is encrypted to safeguard privacy [2]. This approach aligns with Apple's broader philosophy of delivering advanced technology while maintaining user trust, minimizing risks like data breaches or unauthorized access [2].
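The sources describe a two-tier setup: keep personal data and lighter requests on-device, and hand heavier work to Private Cloud Compute in anonymized, encrypted form, with the option to disable AI features entirely. There is no public API for this routing, so the sketch below is purely a conceptual model of that policy; every type in it is invented for illustration.

```swift
// Conceptual model of the on-device-first routing the sources describe.
// All types here are invented; Private Cloud Compute has no public API,
// and this is not Apple's implementation.

enum ExecutionTier {
    case onDevice            // personal data never leaves the phone
    case privateCloudCompute // anonymized, encrypted request to Apple's cloud
}

struct AssistantRequest {
    let text: String
    let touchesPersonalContext: Bool // emails, messages, photos, etc.
    let estimatedComplexity: Int     // 0...10, hypothetical heuristic
}

func route(_ request: AssistantRequest, aiEnabled: Bool) -> ExecutionTier? {
    // The sources note AI features can be disabled entirely.
    guard aiEnabled else { return nil }

    // Personal-context lookups stay local; only heavy, non-personal
    // reasoning is anonymized and sent to Private Cloud Compute.
    if request.touchesPersonalContext || request.estimatedComplexity <= 3 {
        return .onDevice
    }
    return .privateCloudCompute
}
```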
The evolution won't stop with iOS 26.4. iOS 27, expected in late 2026, will introduce even more advanced features focusing on back-and-forth dialogue and complex, multi-step tasks [2]. Users could request "Find a time for a team meeting next week, summarize the agenda, and send invites," and Siri would understand and execute the request across multiple apps [2]. While Apple isn't implementing full chatbot interactions like ChatGPT or Claude in iOS 26.4, the new version will be more similar to Google Gemini than to current Siri, using AI to deliver a more capable and intuitive assistant that adapts to user needs [1][2]. Apple plans to continue work on its own in-house models while relying on Gemini for many public-facing features, addressing long-standing limitations and positioning Siri as a leader in the competitive landscape of AI-driven assistants [1].
Summarized by Navi
21 Jan 2026 • Technology