50 Sources
[1]
Apple will reportedly unveil its Gemini-powered Siri assistant in February
We're about to get our first real look at the results of the recently announced AI partnership between Apple and Google, according to Bloomberg's Mark Gurman. Gurman reports that Apple is planning to announce a new version of Siri in the second half of February. Using Google's Gemini AI models, this Siri update will reportedly be the first to live up to the promises Apple made in June 2024, with the ability to complete tasks by accessing users' personal data and on-screen content. And that's ahead of an even bigger upgrade that Apple plans to announce in June, at its Worldwide Developers Conference, Gurman says. This version of Siri is supposed to be more conversational, in the style of other chatbots like ChatGPT, and it could run directly on Google's cloud infrastructure. Earlier reports suggested that Apple had been struggling to get its AI strategy back on track. In fact, Gurman says Apple's Mike Rockwell told foundation team members over the summer that one of Gurman's earlier reports was "bulls-t." But with the Google partnership, as well as the recent departure of Apple's AI chief John Giannandrea, it seems the company has indeed found a new direction.
[2]
Apple plans to make Siri an AI chatbot, report says
Apple's long-awaited Siri revamp could turn the smart assistant into a chatbot more akin to ChatGPT, according to a report from Bloomberg's Mark Gurman. His sources say that the Siri chatbot, which would be integrated into iOS 27, could be the focal point of Apple's WWDC presentation in June. The Siri chatbot, internally codenamed "Campos," will work with both voice and text inputs. Apple senior vice president Craig Federighi had previously stated that he didn't want Siri to be a chatbot, but rather that he wanted Apple's AI options to be "integrated so it's there within reach whenever you need it." But it appears that the game plan has changed amid increased pressure from the success of other companies' AI chatbots. Apple may also sense a potential threat because OpenAI is planning to enter the hardware space, led by none other than former Apple design head Jony Ive. It's no secret that Apple has been lagging behind in the AI race. The company delayed its rollout of a "more personalized Siri" multiple times and spent last year shopping around for an AI partner, testing out the technology of competitors like OpenAI and Anthropic for a potential deal. Ultimately, Apple chose Google's Gemini as its AI partner, which the two tech giants confirmed earlier this month.
[3]
Siri Reinvented as a ChatGPT Rival? The Rumors Are Getting a Lot Louder
Apple is reportedly preparing its most dramatic overhaul of Siri to date, with plans to turn the voice assistant into a full-fledged AI chatbot as early as this coming fall. According to reporting from Bloomberg's Mark Gurman, Apple is working on a revamped version of Siri for iOS 27 and MacOS 27 that would allow the assistant to engage in more conversational, open-ended interactions -- similar to OpenAI's ChatGPT. While speculation around a more capable, chat-style Siri has circulated for a while now, Gurman's reporting adds new weight to the idea that Apple is finally ready to make the leap. The changes would mark a major shift for Apple, which has been criticized for lagging behind competitors in generative AI capabilities. Unlike today's Siri -- which focuses on commands, quick answers and system controls -- the new version is rumored to behave more like a true chatbot that's capable of sustained conversations, complex queries and deeper contextual understanding. Last week, Apple revealed that it plans to use Google's Gemini AI models to help power Siri, a move that signaled a more pragmatic approach to competing in the fast-moving AI landscape. By leaning on Gemini's prowess while reshaping Siri's interface and behavior, Apple appears to be positioning itself to better compete with ChatGPT and other AI-first assistants without fully building everything from scratch. For now, Apple hasn't confirmed the latest on its chatbot plans. Still, if the report proves accurate, it would represent Apple's most aggressive push yet into generative AI and a tacit admission that incremental Siri updates are no longer enough in an era defined by conversational AI. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
[4]
Apple is turning Siri into an AI bot that's more like ChatGPT
Apple is planning a big Siri overhaul that will transform the voice assistant into an AI chatbot built directly into its iPhone and Mac, according to Bloomberg reporter Mark Gurman. The update is reportedly coming later this year and will replace the existing Siri interface, allowing users to interact with the assistant by both typing and talking, similar to the chatbots available from Google, OpenAI, Anthropic, and others. As noted by Bloomberg, this change is separate from the AI-powered personalization features coming to Siri in the next few months. Like that updated version of Siri, the chatbot will reportedly use a custom Google Gemini AI model as part of the multi-year partnership both companies announced earlier this year, though this version will have capabilities that "significantly surpass" its AI personalization features, Bloomberg reports. Apple reportedly plans to reveal Siri's AI update, codenamed Campos, during its Worldwide Developers Conference in June before launching it in September. It will be the "primary new addition" to iOS 27, iPadOS 27, and macOS 27, with other updates focused mostly on stability, writes Gurman.
[5]
This rumored Siri upgrade could finally change how we use iPhones - but Apple needs to deliver
ZDNET key takeaways * Apple could transform Siri into an AI-powered bot that's baked into its operating systems. * Unlike ChatGPT and Gemini, it won't be an app-based conversational AI. * Apple plans to announce it at the yearly WWDC in June. Apple is planning to turn Siri into an AI chatbot like Google Gemini and ChatGPT. However, unlike its more popular rivals, Apple's version will be built directly into its iPhone, iPad, and Mac. This change is separate from the long-overdue revamped Siri with personalization features, which was scheduled to roll out with iOS 26 but got delayed to 2026. According to a Bloomberg report from Mark Gurman, the iPhone maker is going to replace the existing Siri interface with an AI chatbot, codenamed Campos. It is said to have capabilities that "significantly surpass" the AI personalization features. The upcoming Siri AI bot will be embedded deeply across Apple operating systems and will allow you to search the web for information, summarize it, generate images, create content, and analyze uploaded files. It is also reported to draw on your personal data to complete tasks, locating specific files, songs, calendar events, and text messages. OnePlus, Oppo, Vivo, and other Chinese smartphone makers offer similar universal search functionality, while Google Gemini and OpenAI's ChatGPT can help you with the other stuff - but like everything AI, they aren't always reliable. As per the report, the upcoming Siri AI chatbot is designed to control device features and settings, as well as analyze open windows and on-screen content in order to take actions and suggest commands. Unlike third-party chatbots running on Apple devices, this version of Siri AI will be integrated into the company's core apps, such as Photos, Mail, Apple Music, Podcasts, TV, and Xcode. This is aimed at enabling voice-first user interaction. As Gurman explains, you could "ask Siri to find a photo based on a description of its contents and edit it with specific preferences -- like cropping and color changes... or write a message to a friend about upcoming calendar plans." It'll use a custom Google Gemini AI model as part of the recent multi-year partnership between Apple and Google. However, the Cupertino-based company is facing one issue regarding how much its chatbot can remember about its users. Apple is said to sharply limit this capability in the interest of privacy. For context, ChatGPT and other rivals can retain extensive memory of past interactions and use that context in future conversations to fulfill requests. Apple is reportedly set to announce a new Siri AI bot update during its Worldwide Developers Conference in June, before launching it in September. Gurman says it'll be the "primary new addition" to iOS 27, iPadOS 27, and macOS 27, with Apple focusing primarily on stability and bug fixes.
[6]
Gemini-Powered Siri Reportedly Set to Arrive on iPhones Next Month
Apple announced an AI partnership with Google earlier this month, and the first results of the collaboration will become available through an updated Siri next month, Bloomberg's Mark Gurman reports. Apple has had a tough time with Siri over the past few years. The personalized Siri it promised at WWDC 2024, giving the assistant the ability to tap into a user's personal data or take actions across apps autonomously, has been long overdue. Following delays, the company vowed to release those new features this year. To live up to that promise, Apple desperately needed Google's Gemini models, Gurman says. The company's software leadership had admitted last year that the delays were due to struggles with Siri's underlying architecture. All of that is in the past now. We'll get to see demos of Gemini-powered Siri as early as the second half of February, Gurman says. Apple may not hold a special event for it, but instead show off the new Siri capabilities at a controlled media briefing in New York. Interestingly, shortly after the AI partnership announcement, Google launched Personal Intelligence for the Gemini app. It gives the chatbot the ability to tap into a user's Gmail, YouTube, Search, and Photos history and deliver more personalized responses. A slightly more limited version of Personal Intelligence, able to draw only from Gmail and Photos history, rolled out to AI Mode in Search last week. It would not be surprising if Apple demos a version of Gemini's Personal Intelligence on Siri in its rumored February announcement. Meanwhile, the timeline aligns with a previous Bloomberg report that suggested Apple was targeting iOS 26.4 for the launch of Siri's delayed features. Gurman now says the beta version of the software is expected to come out in late February, while the stable version is expected to arrive in March or April. After the February announcement, Apple is expected to unveil an overhauled Siri for iOS, iPadOS, and macOS at WWDC 2026 in June. This refresh will turn Siri into a chatbot, enabling users to have ChatGPT-like back-and-forth conversations with it. Siri will also be able to fetch answers from the web, draft content, generate images, and analyze uploaded files, Gurman reported last week. If all of these rumors hold, Apple users will finally have something to cheer about with Siri after complaining about it for years.
[7]
Apple to Revamp Siri as a Built-In iPhone, Mac Chatbot to Fend Off OpenAI
Apple Inc. plans to revamp Siri later this year by turning the digital assistant into the company's first artificial intelligence chatbot, thrusting the iPhone maker into a generative AI race dominated by OpenAI and Google. The chatbot -- code-named Campos -- will be embedded deeply into the iPhone, iPad and Mac operating systems and replace the current Siri interface, according to people familiar with the plan. Users will be able to summon the new service the same way they open Siri now, by speaking the "Siri" command or holding down the side button on their iPhone or iPad. The new approach will go well beyond the abilities of the current Siri -- or even a long-promised update that's coming earlier in 2026. Today's Siri lacks a chat-like feel and the back-and-forth conversational abilities of OpenAI's ChatGPT or Google's Gemini. The feature is a central piece of Apple's turnaround plan for the AI market, where it has lagged behind Silicon Valley peers. The Apple Intelligence platform had a rocky rollout in 2024, with features that were underwhelming or slow to arrive. Shares of Apple gained on the chatbot news, climbing as much as 1.7% to a session high of $250.83. Google parent Alphabet Inc., which is supplying the underlying technology for the project, was up 2.6% to $330.32 as of 2:54 p.m. in New York. The previously promised, non-chatbot update to Siri -- retaining the current interface -- is planned for iOS 26.4, due in the coming months. The idea behind that upgrade is to add features unveiled in 2024, including the ability to analyze on-screen content and tap into personal data. It also will be better at searching the web. The chatbot capabilities will come later in the year, according to the people, who asked not to be identified because the plans are private. The company aims to unveil that technology in June at its Worldwide Developers Conference and release it in September. Campos, which will have both voice- and typing-based modes, will be the primary new addition to Apple's upcoming operating systems. The company is integrating it into iOS 27 and iPadOS 27, both code-named Rave, as well as macOS 27, internally known as Fizz. Other than the chatbot interface, the operating systems aren't getting big changes this year. Apple is more focused on improving performance and fixing bugs. Last year, it rolled out a major design overhaul, unifying the look and feel of its operating systems. Internally, Apple is testing the chatbot technology as a standalone Siri app, similar to the ChatGPT and Gemini options available in the App Store. The company doesn't plan to offer that version to customers, though. Instead, it will integrate the software across its operating systems, like the Siri of today. A spokesperson for Cupertino, California-based Apple declined to comment. Embracing the chatbot approach represents a strategic shift for Apple, which has long downplayed the conversational AI tools popularized by OpenAI, Google and Microsoft Corp. Executives have argued that users prefer having AI woven directly into features -- something Apple has done with its writing tools, Genmoji emoji generator and notification summaries -- rather than standalone chat experiences. Craig Federighi, senior vice president of software engineering, said in a June interview with Tom's Guide that releasing a chatbot was never the company's goal. Apple didn't want to send users "off into some chat experience in order to get things done," he said. 
But Apple risked falling further behind rivals without its own chatbot. Samsung Electronics Co., Google and several Chinese smartphone makers have already embedded conversational AI deeply into their operating systems. Such tools have become increasingly essential, with ChatGPT surpassing 800 million weekly active users in October. OpenAI is poised to become more of an Apple competitor, adding further pressure. The maker of ChatGPT is looking to evolve its software into an AI operating system. It's also working on new devices under the direction of former Apple design chief Jony Ive. The AI company has poached several dozen Apple engineers in recent months, a move that rankled the iPhone maker's executives and stoked concerns about OpenAI becoming a threat to its underlying business. Like ChatGPT and Google Gemini, Apple's chatbot will allow users to search the web for information, create content, generate images, summarize information and analyze uploaded files. It also will draw on personal data to complete tasks, being able to more easily locate specific files, songs, calendar events and text messages. Unlike third-party chatbots running on Apple devices, the planned offering is designed to analyze open windows and on-screen content in order to take actions and suggest commands. It will also be able to control device features and settings, allowing it to make phone calls, set timers and launch the camera. More significantly, Siri will be integrated into all of the company's core apps, including ones for mail, music, podcasts, TV, Xcode programming software and photos. That will allow users to do much more with just their voice. For instance, they could ask Siri to find a photo based on a description of its contents and edit it with specific preferences -- like cropping and color changes. Or a user could ask Siri within the email app to write a message to a friend about upcoming calendar plans. Campos may let Apple jettison its Spotlight function as well. That feature lets users search for content on their devices and look up a limited array of information, like sports scores and weather details. One issue under discussion is how much the chatbot will be allowed to remember about its users. ChatGPT and other conversational AI tools can retain an extensive memory of past interactions, allowing them to draw on conversations and personal details when fulfilling requests. Apple is considering sharply limiting this capability in the interest of privacy. The chatbot will feature an Apple-designed user interface but rely heavily on a custom AI model developed by the Google Gemini team -- an arrangement first reported by Bloomberg News last year. The iOS 26.4 update of Siri, the one before the true chatbot, will rely on a Google-developed system internally known as Apple Foundation Models version 10. That software will operate at 1.2 trillion parameters, a measure of AI complexity. Campos, however, will significantly surpass those capabilities.
The chatbot will run a higher-end version of the custom Google model, comparable to Gemini 3, that's known internally as Apple Foundation Models version 11. In a potential policy shift for Apple, the two partners are discussing hosting the chatbot directly on Google servers running powerful chips known as TPUs, or tensor processing units. The more immediate Siri update, in contrast, will operate on Apple's own Private Cloud Compute servers, which rely on high-end Mac chips for processing. Apple is paying Google roughly $1 billion annually for access to the models. The company also may turn to Google technology to enhance existing Apple Intelligence features. Bloomberg first reported last June that Apple was mulling the use of outside models to fix its AI woes. Apple is designing Campos so that its underlying models can be swapped out over time. That means the company will have the flexibility to move away from Google-powered systems in the future if it so chooses. Apple has also tested the chatbot with Chinese AI models, signaling plans to eventually deploy the feature in that country, where Apple Intelligence isn't yet available. The next Siri upgrade and Campos will both include a feature called World Knowledge Answers, first reported by Bloomberg in September. It will provide web-summarized responses -- similar to Perplexity and ChatGPT -- along with citations. Hints of Apple's shift toward chatbots surfaced in recent months. Last year, the company internally developed an app dubbed Veritas that turned the new Siri engine into a text-based chatbot interface. The app was strictly for testing and isn't planned for public release. The strategic pivot follows Apple leadership changes. Longtime AI chief John Giannandrea was relieved of his role in December, with Federighi consolidating control over Apple's AI efforts. The company has also hired Amar Subramanya as a vice president of AI reporting to Federighi. He previously helped lead engineering for Gemini at Google.
[8]
Apple reportedly plans to reveal its Gemini-powered Siri in February
A new and improved Siri may finally make an appearance, but this time, it could be with a Google Gemini glow-up. According to Bloomberg's Mark Gurman, Apple wants to announce a new Siri in "the second half of February" that will show off the results of its recently announced partnership with Google and offer demonstrations of the Gemini-powered capabilities. After this reveal, Gurman reported that the new Siri will make its way to iOS 26.4, which is also slated to enter beta testing in February before its public release in March or early April. Apple has been meaning to launch its next-gen Siri ever since its announcement at WWDC 2024, but now we know that this Gemini-powered Siri will behave more like an AI chatbot, similar to OpenAI's ChatGPT, thanks to another Bloomberg report from last week. Following the reported demo that's scheduled for late February, Gurman said Apple will have a grand reveal of the new Siri, which is currently codenamed Campos, at its annual developer conference in the summer. After that, the latest Siri and the accompanying Gemini-powered Apple Intelligence features are expected to arrive with iOS 27, iPadOS 27 and macOS 27, which should be available as beta releases in the summer.
[9]
Siri is getting Gemini superpowers, and that could be awful for Android
At this point, it should probably be accepted as a universal fact that Siri is one of the weakest voice assistants in the world. That was true before its redesign in iOS 18, and it's still true even after Apple added ChatGPT. The new smarts may have given Siri a new brain, but Siri still hasn't figured out how to use that brain properly. We've collectively bullied Siri for years (rightfully so), to the point that Apple has now decided to partner with Google and base Siri on none other than its biggest -- and far better -- competitor, Gemini. There's something deeply funny about a company using its rival's underlying tech because it couldn't quite build its own or find a better alternative. But if you look past the humor, the Gemini-fication of Siri undercuts one of the key reasons people still pick Android smartphones. That long-standing, lopsided advantage in Android's favor may finally be slipping away -- and this time, possibly for good. Siri is deeply integrated into Apple's broader ecosystem, and to be fair, it does some things really well. For instance, you can ask Siri how something is done on the iPhone, and it will often pull up information straight from official Apple support pages with precise, context-aware answers. I genuinely like Siri for this. And yet -- funnily enough -- I still find myself firing up Gemini on my iPhone via an on-screen widget rather than long-pressing the power button to wake Siri. Gemini isn't deeply integrated at the system level on iOS the way Siri is. Even so, in its current limited form, Gemini is miles ahead of Siri for anything that requires actual thinking. It's my default tool for morning news rundowns. It's where I go for everyday life hacks. Now, I've even started using it as a learning partner. And it does all of this without the friction and unpredictability that still define Siri. Unlike Siri, Gemini can't read the iPhone's system state and can't open apps or take actions across them. It feels like a layer sitting on top of iOS, rather than something baked in, unlike Gemini on Android or Siri on the iPhone. And despite all of these limitations, I still reach for Gemini more than Siri. That alone says a lot. I don't have a lot of confidence in Apple here, largely because we've already seen it fumble Siri even with ChatGPT in the mix. But if Apple manages to pull this off properly, a Gemini-powered Siri wouldn't just be a better Siri -- it would be a structural shift in how compelling the iPhone feels. Historically, Siri has been the weakest link in Apple's ecosystem. Since the early days of the Google Assistant, Android users have enjoyed a better voice assistant without constantly worrying that Apple would suddenly leapfrog them. Gemini only carried that legacy forward -- it's more conversational, understands context better, has strong reasoning abilities, and, crucially, is deeply tied into Android. Now, imagine that same intelligence arriving on the iPhone, paired with Apple's famously tight system-level integration. Think of it as a replacement for Siri that can take actions across iOS, understand device context, and hook directly into Apple's services. That's where the equation flips on its head. When we talk about Android versus iOS, the conversation usually circles around customization, flexibility, and choice -- all valid points in Android's favor. But Android's real soft advantage has always been Google's AI lead. Google never needed to loudly market that its assistant was better. You just knew it was.
Anyone who has used Siri long enough knows how often it falls flat, and how instinctively you end up turning to Gemini instead. For a long time, that was enough to keep people on Android -- or at least make them actively dismiss iPhones for lacking a serious, reliable voice assistant. If Apple adopts Gemini in a meaningful way, that advantage evaporates. And there's a very real chance of that happening. Apple has a long track record of taking borrowed technology and turning it into polished, mainstream experiences. Touch ID, for instance, came from an acquired fingerprint sensor company and is still preferred by many over Face ID. Apple doesn't always invent first, but it refines, integrates, and scales better than almost anyone, Google included. So, if Gemini indeed gets more deeply embedded into iOS than it is across most Android devices, Apple could genuinely beat Google at its own AI game. And as AI increasingly becomes the thing that defines platform lock-in, that's a dangerous position for Android to be in. There's an important distinction to be made here: Siri using Gemini as an underlying language model is very different from Apple handing over the keys to its assistant experience. What's more likely is that Apple treats Gemini as a backend brain while keeping Siri's surface-level behavior largely intact. In that scenario, Siri would get smarter without becoming a full-fledged Gemini assistant. It'd be enough to satisfy longtime critics without directly threatening how well Gemini works as an assistant on Android. That would mean it's not Gemini coming to the iPhone -- it's Gemini powering Siri. My educated guess is that Siri will see meaningful improvements, with Gemini doing most of the heavy lifting behind the scenes. From there, the real question becomes strategic rather than technical. Google isn't dumb. It's a multi-trillion-dollar company, and it wouldn't casually hand one of its biggest competitive moats to a rival this large. Google has almost certainly gamed this out far more deeply than any of us can see from the outside. Making Gemini the default AI layer across billions of additional devices -- even ones Google doesn't directly control -- may outweigh the risk of weakening Android's differentiation. It's also possible Google believes model access alone is no longer the moat, and that the real advantage lies in how Gemini is woven into its services, data, and workflows on Android. Right now, though, it's still a muddy space with very little clarity. Whether Apple ends up trumping Google at its own game, or Google proves to be the more shrewd long-term strategist, is something we'll only know once Gemini-powered Siri actually shows up in public. I, for one, can't wait to put them side by side and see who wins.
[10]
New, Smarter Siri Is Reportedly Weeks from Arriving. It Had Better Be Amazing
Just after the start of 2026, Google parent Alphabet became more valuable than Apple for the first time since 2019, a technically meaningless milestone, but a symbolically powerful one. Apple remains the less valuable company, and Google's AI partnership with Apple is perceived as a big part of why Alphabet pulled ahead. Now, according to Bloomberg's machine gun of Apple scoops Mark Gurman, Apple is weeks away from demoing the product of that partnership: its revamped version of Siri. For Apple's sake, it had better not suck. Next month we should expect "demonstrations of the functionality" of Siri at some sort of Apple event, possibly a small one, Gurman says. This new Siri will be powered by a Google-built AI model, but Apple won't tip users off about that while they're using it. In fact, even internally it's called "Apple Foundation Models version 10," Gurman notes. This new Siri will, if all goes according to plan, just work a lot better than what iPhones and other Apple devices are currently armed with. Siri is perhaps best understood as the organizing "personality" of the Apple Home software and hardware ecosystem, and it's sorta... fine as a smart home assistant. It's comparable to similar products from Amazon and Google, with a few more tendencies that chafe slightly, like how it may respond to basic informational questions with info-dumps that start with phrases like "Here are two options!" Or it will just glitch out and say something like "Uh-oh! There's a problem." When used on an iPhone, Siri feels a little like having a smart home assistant in your pocket, which, why? If your phone is in your hand and you want to set a timer, you're looking right at the Clock app icon and you'll probably just use that. If you want something that can answer questions conversationally, you can just use a product like Claude, or ChatGPT, or Gemini, or, hell, Microsoft Copilot. With all that in mind, Gurman subtly describes the new version of Siri as a productivity beast. The new version "should be able to tap into personal data and on-screen content to fulfill tasks." That sounds nothing like the current iteration of Siri, which feels like a naive being called into existence in the moment with no context about what's going on. It would indeed be powerful to have a Siri that can respond nimbly to what the user is doing, and incorporate the data already in their phone to provide actual help. Based on descriptions like this, I can imagine looking at an event website, for instance, and saying "Hey Siri, do I have time for this?" and then getting a decent answer. And Gurman says this Siri will "be conversational, aware of relevant context and capable of sustained back-and-forth dialogue," which also means it's meant to take a real bite out of the chatbot market. Where Siri once relied on ChatGPT (arguably too much), it will now, in theory, compete with it. But the Apple-Google partnership driving the new Siri is interesting trivia at best, and most people won't care or notice, because, as Gurman notes, "Apple is a product company," and "the provenance of the technology is mostly irrelevant." That means if the takeaway by March is "Lol, Siri still sucks," Apple is going to pay the price in terms of public perception, not Google. And Google gets its $1 billion either way.
[11]
Apple is reportedly overhauling Siri to be an AI chatbot
Apple has been spinning its wheels for many months over its approach to artificial intelligence, but a strategy finally appears to be emerging for the company. Bloomberg's Mark Gurman reported today that Apple's long-awaited Siri overhaul will allegedly involve transforming the voice assistant into an AI chatbot, internally called Campos. Sources have reportedly told Gurman that Apple's chatbot will completely replace the current Siri interface in favor of a more interactive model similar to those used by OpenAI's ChatGPT and Google's Gemini. He also cited sources who claimed that while Apple has been testing a standalone Campos app, the company doesn't plan to release it for customers. Instead, the new chatbot will emphasize deep software integrations when it rolls out, reportedly as part of the iOS 27, iPadOS 27 and macOS 27 wave late next year. However, there will reportedly be new features for the current iteration of Siri coming in iOS 26.4. Those additions will include the much-delayed updates Apple first promised for the platform back in 2024. Pivoting to a chatbot gives some additional context to Apple's recent move to collaborate with frequent rival Google; the companies announced earlier in January that Gemini models will be used to power the upcoming versions of Siri. Gemini has become ubiquitous in the Google ecosystem, and it makes sense for Apple to leverage outside help in this segment where it has already been trailing its competitors. Although Apple may not have a standalone app for its Siri chatbot, the company does appear to be considering new places to host its AI resource. Additional reports today claimed that 2027 could also see the release of a wearable AI pin.
[12]
Report: Apple planning to 'deeply' integrate Gemini-backed Siri into multiple core apps
Since the launch of Apple Intelligence over a year ago, Apple has added very few new AI features to its apps. Some recent additions are Apple Music's AutoMix and Apple Watch's Workout Buddy. However, that is set to change this year. According to Mark Gurman in today's Power On newsletter, Apple is planning to integrate the upcoming Gemini-backed version of Siri "deeply across its core apps". Moving to Gemini Previously, Apple's plans had been much more ambitious. For a while now, Apple has been working on a "World Knowledge Answers" project, which aimed to compete with the likes of ChatGPT and Perplexity. However, this project has been scaled back, as Apple works to instead replace existing technologies with Google's Gemini. The World Knowledge Answers project is not the only project to be scaled back. According to Gurman, Apple had been working on an AI overhaul of Safari: One major 2026 priority had been a fully revamped Safari browser built for the AI era, designed to counter new offerings from Perplexity and OpenAI. Planned features included assessing the trustworthiness of documents and data, and cross-referencing information across multiple sources. These plans are also now paused, though Gurman says there is a chance for the work to resume prior to WWDC26. Apple has also "returned to the drawing board" for AI Health features, Gurman reports. Integrating Siri across core apps According to today's Power On, Apple had "envisioned standalone chatbot-style experiences embedded in apps such as Safari, TV, Health, Music and Podcasts". But now, following the departure of SVP of Machine Learning and AI Strategy John Giannandrea, Apple is planning a more cohesive experience. Rather than having separate chatbots across multiple apps, it will instead "deeply" integrate the upcoming Gemini-based Siri across its core apps. Unfortunately, we don't yet know what this will look like, but we may not have to wait long to find out. Gurman also reports that the new Siri will be announced as soon as next month. In the past, Apple has also shown off how Siri might integrate with apps through App Intents; a rough developer-side sketch of that mechanism follows at the end of this entry. An exciting year ahead Though it has certainly been a long wait, 2026 is shaping up to be an exciting year for Apple Intelligence. Apple is expected to finally launch an overhauled Siri, first shown off at WWDC24. What features are you most looking forward to? Let us know in the comments!
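On that App Intents point: here is a minimal, hypothetical sketch of how an app exposes an action that Siri and Shortcuts can already invoke today. The intent name, parameter, and placeholder result are illustrative assumptions, not anything Apple has announced for the Gemini-backed Siri.

```swift
import AppIntents

// Hypothetical intent: lets Siri/Shortcuts ask the app to find a photo by description.
struct FindPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Find Photo"
    static var description = IntentDescription("Finds a photo matching a text description.")

    // The phrase the user dictates or types, e.g. "sunset at the beach".
    @Parameter(title: "Description")
    var query: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would search its own photo index here; this is a stand-in result.
        let match = "IMG_0042.jpg"
        return .result(dialog: "Found \(match) matching \"\(query)\".")
    }
}
```

How, or whether, the Gemini-backed Siri will surface intents like this remains unconfirmed; the sketch only shows the existing plumbing apps use today.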
[13]
Apple just weeks from unveiling Google-powered Siri makeover, report claims
The upgrade promises conversational abilities and contextual data usage, though running on Google's cloud infrastructure raises potential privacy concerns. Earlier this month, Apple and Google issued a joint statement announcing a "multi-year collaboration" to base Siri and other Apple AI products on Google Gemini technology. But more details of the partnership, including the likely timeframe, have now been revealed by a new report. In the latest edition of his Power On newsletter, Bloomberg reporter Mark Gurman claims Apple "appears to be less than a month away from unveiling the results of this partnership." He isn't sure if this announcement will take the form of a full press event, acknowledging that the company may choose instead to brief journalists individually, but either way, the information should be out there in a matter of weeks. According to Gurman, the Siri announcement will mostly consist of what we've heard before. Apple will tell us that, finally fulfilling a promise made at WWDC 2024 and later repeatedly delayed, Siri will soon be able to use contextual data -- other information on the screen, and everything it knows about the user -- to more accurately execute spoken commands. But that's not all. As we previously reported, the Gemini partnership should bring some new features, including the ability to answer questions conversationally and tell stories, to Apple's AI products as well. There will be a gap between the announcement and the updated Siri actually rolling out to the public, but it shouldn't be anything like the delays we've seen so far. Gurman expects the new features to be bundled with iOS 26.4, which is currently slated to hit beta testing in February and launch in March or early April. Beyond April, we can look forward to bigger changes later in the year. Gurman insists that a "fully reimagined" version of Siri, internally codenamed Campos, will be announced at WWDC 2026 and form part of the iOS 27, iPadOS 27, and macOS 27 updates which roll out to the public in the fall. "The new system," Gurman writes, "is a fresh architecture and interface designed from the ground up for the chatbot era." It will be more conversational than current Siri or even the Siri in iOS 26.4, and capable of "sustained back-and-forth dialogue." This too will be based on Gemini technology. But there's a sting in the tail. While Apple rushed to emphasise that the upcoming Gemini-enhanced version of Siri will run on-device or on Apple's own servers, implying that this approach was chosen for privacy reasons, Campos may not. According to Gurman's sources, Apple and Google are currently discussing a plan to have it run on Google's cloud infrastructure to improve responsiveness. It's unclear how Apple would square this with its "industry-leading privacy standards," but it will have plenty of opportunities to make that argument in its various press engagements this year.
[14]
Siri 2.0 could finally reach your iPhone next month, with the Gemini-powered assistant due to gain even more abilities at WWDC this summer
It's been well over a year since these features were due to arrive. The smarter Siri 2.0 we've been waiting to try on our iPhones for almost two years may finally appear next month, says Mark Gurman's latest Power On newsletter at Bloomberg. But even greater changes could be in store for Apple's summer developer conference. Gurman was writing about the Apple and Google AI deal months ahead of the official announcement. So when he says "[Apple] has been planning an announcement of the new Siri in the second half of February, when it will give demonstrations of the functionality," we have little reason to doubt him. Specifically, Gurman says the new Siri will arrive in iOS 26.4, which should arrive as a beta in February and then launch formally by early April at the latest. It will be this version of the iPhone's software that will finally give us the AI features first promised at WWDC in 2024 when Apple Intelligence was introduced to the world. This includes powers like Personal Context to keep track of your previous conversations, on-screen awareness and the ability to take action in apps on your behalf. All of these were demoed two summers ago, but have yet to make it to any beta version of iOS, let alone a stable version. Siri 3.0 could come a lot quicker than Siri 2.0 Gurman's account is a fun narrative of corporate negotiations and why Google (initially set aside as a potential partner) ended up being Apple's ultimate choice instead of the more 'obvious' choices of OpenAI or Anthropic. But it also tells us about what to expect from the next step forward for Siri, which should be coming this summer. We can apparently expect another overhaul to Apple's digital assistant to come with iOS 27, macOS 27 and the rest of Apple's next generation of operating systems. As well as being smarter than ever, this Siri will allegedly take cues from ChatGPT and Gemini to offer chatbot-style interaction and abilities, like answering questions in a conversational fashion or proactively suggesting actions to take in linked apps, according to one rumor. These new capabilities will come as a result of Apple moving the underlying processes to Google's infrastructure, to help with the speed and precision of responses. This makes sense, but the big question would be what impact this shift would have on Apple's Private Cloud Compute, its guarantee that your AI-powered prompts and results aren't visible to the company. We don't know if Apple will make a big deal out of the iOS 26.4 update with a formal launch, or if the new Siri will debut with barely a whisper. But it seems we don't have long to find out if these rumors are true, and if the new features are worth the wait.
[15]
Apple's Siri Chatbot in iOS 27: Everything We Know
Apple is planning to upgrade Siri twice in the coming year, adding personalization features in iOS 26.4 before turning the personal assistant into a full chatbot in iOS 27. As long as timelines don't change, we'll see the Siri chatbot as soon as June 2026. Here's everything we know so far. SiriBot With iOS 27, Apple will change the way that Siri works. Right now, Siri can answer basic questions and complete simple tasks, but you can't engage it in a back and forth conversation, get help with multi-step tasks, or ask complicated questions. Based on the current Siri chatbot rumors, Siri will be able to do all of that and more with the upcoming upgrade, and it will work like competing chatbots. Apple wasn't initially planning to introduce a full chatbot that users can interact with similarly to Claude or ChatGPT, but chatbots have become too popular for Apple to ignore. Simply adding AI capabilities to apps and features isn't enough for Apple to stay competitive with the way people have embraced chatbots for everything from web searches to coding help. Google has already integrated Gemini into a range of Android devices, and chatbots like ChatGPT have hundreds of millions of weekly active users. Siri Capabilities According to Bloomberg's Mark Gurman, Siri's chatbot capabilities will be "embedded deeply" into Apple's products at the system level. Siri won't be an app, but will instead be integrated into iOS, iPadOS, and macOS like Siri is now. Siri Activation and Interface Users will activate Siri in the same way they do today, speaking the Siri wake word or pressing on the side button of a Siri-enabled device. Siri will be able to respond to both voice and text-based requests. We don't yet know what the new Siri interface will look like. Apple will need to make big changes to the way that Siri looks and feels if it wants to match functionality offered by companies like OpenAI, Anthropic, and Google. People are used to opening up an app and having a full text interface that includes conversation history, and it's not clear how Apple will provide that if there's no dedicated Siri chatbot app. People will want to be able to access their past conversations and have tools for uploading files and images. It's possible activating Siri could lead to an app-like interface that takes over the iPhone, iPad, or Mac's display, but that will be a departure from Siri's current minimalistic design. Apple could alternatively log conversations in a place like the Notes app, or in the clipboard on the Mac. Gurman says that Siri won't be an app, but that might mean that it won't only be an app. There could be some kind of dedicated chatbot app that people can use, with Siri also able to be activated and used on a system level and in and across apps. What Siri Chatbot Can Do It sounds like the Siri chatbot will be able to do everything that current chatbots can do, and more. * Search the web for information * Generate images * Generate content * Summarize information * Analyze uploaded files * Use personal data to complete tasks * Ingest information from emails, messages, files and more * Analyze open windows and on-screen content to take action * Control device features and settings * Search for on-device content, replacing Spotlight Siri will also be integrated into Apple's core apps, including Mail, Messages, Apple TV, Xcode, and Photos. Siri will be able to search for specific images, edit photos, help with coding, make suggestions for TV shows and movies, and send emails. iOS 26.4 "LLM Siri" vs. 
Chatbot Siri In iOS 26.4, Apple plans to introduce a new, updated version of Siri that relies on large language models, or LLMs. Apple has been working on this version of Siri since Apple Intelligence features were added to iOS 18, but it was delayed because Siri's underlying architecture needed an overhaul to run LLMs. Starting in iOS 26.4, Siri will be able to hold continuous conversations and provide human-like responses to questions, plus Siri will have new personalization features that will let it do more than before. What Siri won't have, though, is full chatbot capabilities. Here's what we're expecting: Personal Context With personal context, Siri will be able to keep track of emails, messages, files, photos, and more, learning more about you to help you complete tasks and stay on top of what you've been sent. * Show me the files Eric sent me last week. * Find the email where Eric mentioned ice skating. * Find the books that Eric recommended to me. * Where's the recipe that Eric sent me? * What's my passport number? Onscreen Awareness Onscreen awareness will let Siri see what's on your screen and complete actions involving whatever you're looking at. If someone texts you an address, for example, you can tell Siri to add it to their contact card. Or if you're looking at a photo and want to send it to someone, you can ask Siri to do it for you. Deeper App Integration Deeper app integration means that Siri will be able to do more in and across apps, performing actions and completing tasks that are just not possible with the personal assistant right now. We don't have a full picture of what Siri will be capable of, but Apple has provided a few examples of what to expect. * Moving files from one app to another. * Editing a photo and then sending it to someone. * Get directions home and share the ETA with Eric. * Send the email I drafted to Eric. You're not going to have a chat-like interface for back-and-forth conversations with Siri when iOS 26.4 launches, but the personal assistant should be very different than it is now. Apple software engineering chief Craig Federighi told employees last summer that the Siri revamp was successful. "This has put us in a position to not just deliver what we announced, but to deliver a much bigger upgrade than we envisioned," he said. Siri Redesign With all of the new functionality coming to Siri, Apple is planning to make visual design changes. It's not quite clear what that will entail, but for the upcoming table-top robot that's in the works, Apple has tested an animated version of Siri that looks similar to the Mac's Finder logo. Apple could start rolling out that new, more personalized design when Siri gets the major iOS 27 revamp. Memory Claude, ChatGPT, and Gemini can remember past conversations and interactions, retaining a memory of the user. Apple is said to be discussing how much the Siri chatbot will be able to remember. Apple may limit conversational memory to protect user privacy. Naming Siri is getting a major overhaul, but Apple will probably continue to refer to it as Siri. It'll just be a much smarter version of Siri. Underlying Architecture and Servers Apple has inked a deal with Google that will see Gemini powering upcoming versions of Siri. Apple plans to use Gemini for the iOS 26.4 updates that it is introducing, and Google's technology will also power the Siri chatbot.
"Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology," the two companies said in a statement in January. The Siri chatbot specifically will rely on a custom AI model developed by the Google Gemini team. Gurman claims that the custom model is comparable to Gemini 3, and that it will be much more powerful than the model behind Apple's upcoming iOS 26.4 features. Apple and Google are also discussing running the Siri chatbot on Google's servers powered by Tensor Processing Units, probably because Apple doesn't yet have the infrastructure to handle chatbot queries from billions of active devices per day. In the future, Apple will be able to transition Siri to a different underlying model, so when the company does have in-house LLMs powerful enough to compete with ChatGPT or Gemini, it can move away from Google. Apple will also potentially be able to offer chatbot capabilities in China by partnering with a Chinese AI company. China restricts foreign companies from offering AI features in the country. Platforms Siri's chatbot functionality will be the key new feature in iOS 27, iPadOS 27, and macOS 27, and Siri's capabilities will be integrated into the iPhone, iPad, and Mac. Siri chatbot features could also come to other platforms like visionOS and tvOS. Cost There is no word yet on whether there will be some kind of fee associated with the Siri chatbot. The Siri chatbot won't be able to run entirely on device, and Apple is going to need major cloud processing power. Without taking into account any development or hosting costs, Apple is paying Google approximately $1 billion per year for access to Google's models. Companies like Google and OpenAI spend billions on infrastructure and compute costs each year, and no AI service is entirely free. Apple will likely need to charge something, but it could do what Google has done with Gemini. Google offers a free version of Gemini on Pixel smartphones and other Android devices that have integrated AI. The basic version of Gemini is able to answer questions, summarize text, write emails, and control apps and smartphone features. Android users can pay $20 per month for Gemini Advanced to get access to the more advanced version of Gemini that offers better reasoning, longer context for analyzing bigger documents, and improved coding. Launch Date Apple is planning to introduce Siri's chatbot capabilities when it announces iOS 27, iPadOS 27, and macOS 27 at the June Worldwide Developers Conference. If the chatbot features aren't ready to go, Apple will likely hold off on showing off the new functionality because of the major mistake it made with iOS 18 and Apple Intelligence. The Siri chatbot is expected to be introduced in the new updates in September after several months of beta testing.
[16]
Gemini-powered Siri could be days away from big reveal
Apple's revamped Siri, powered by Google's Gemini model, could be revealed soon. Bloomberg's Mark Gurman, a frequent Apple newsbreaker, reported that the company planned to reveal the Gemini-powered Siri in February. The announcement is highly anticipated, as Apple is turning to Google's technology to help deliver on its AI ambitions. Wrote Gurman in his Power On newsletter: "Apple appears to be less than a month away from unveiling the results of this partnership. The company has been planning an announcement of the new Siri in the second half of February, when it will give demonstrations of the functionality. Whether that takes the form of a major event or a smaller, tightly controlled briefing -- perhaps at Apple's New York media loft -- remains unclear. Either way, Apple is just weeks away from finally delivering on the Siri promises made at its Worldwide Developers Conference back in June 2024. At long last, the assistant should be able to tap into personal data and on-screen content to fulfill tasks." It's long been expected that Siri would receive a Gemini-powered, chatbot-style makeover. Now, that shift appears likely to arrive this year, potentially alongside iOS 27 in the fall. By then, using Siri could feel much like interacting with today's popular AI chatbots. Gurman reported, however, that some Gemini-powered features are expected to arrive earlier, with an iOS 26 update in the spring. The move surprised some earlier this month, when Apple announced it had struck a deal with Google to help power Siri. The two tech giants are, after all, nominal competitors. But Apple's AI efforts have lagged behind, and the companies reportedly reached an agreement that benefits both sides. Google framed the partnership this way: "After careful evaluation, Apple determined that Google's AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users."
[17]
'After years of waiting, iPhone users deserve nothing less': 5 things Gemini-powered Siri needs to do to save Apple from AI irrelevance
For years now, Siri has felt like the weak link in Apple's otherwise slick ecosystem. While ChatGPT, Gemini, and even Alexa have surged ahead, Apple's voice assistant has mostly stood still, promising big things but never actually delivering on them. That's finally starting to change, however, as Apple and Google have entered into a long-term partnership where Siri will be powered by Gemini. According to Bloomberg's Mark Gurman, Apple could reveal its long-awaited Gemini-powered Siri upgrade as soon as February 2026 (next month), and that could mean huge things for the iPhone in a year when the Cupertino-based company looks to revolutionize its smartphone lineup with more powerful products and maybe even a foldable screen. As someone who's used Gemini on Android for months now, this is genuinely exciting. But confirmation alone isn't enough. If Apple wants this to be the Siri reset iPhone users have been waiting for, here are five things I really want to see when it finally arrives. 1. A Siri that can actually hold a conversation Right now, Siri still feels transactional. You ask a question, you get an answer, and the conversation ends. Ask a follow-up, and it's a coin toss whether Siri remembers what you were talking about in the first place. If Apple is plugging Gemini into Siri, that has to change. Gemini already handles conversational context well. It understands follow-ups, clarifications, and vague human language without needing you to repeat yourself like a robot. If I ask for restaurant recommendations and then say "book the second one," Siri should just get it. In 2026, with AI assistants capable of booking flights now a reality, this should be the bare minimum for a Gemini-powered Siri. 2. Real help with real tasks Siri has always been fine at trivia and timers, and while that was enough in 2011, it's not anymore. A Gemini-powered Siri should help you do things, not just answer questions you could simply use Google Search for. Planning trips, summarizing emails, organizing your day, and making sense of information across apps should all be on the table, and I genuinely think that's the absolute minimum. Gemini already does this in apps like Gmail, which is why my expectations are high. If Siri can't help me plan a weekend away using my calendar, messages, and location data, then something's gone wrong. This is Apple's chance to turn Siri into a genuine personal assistant, and I'm hoping once Gemini enters the fray, it'll just be the beginning of a Google AI-powered iPhone capable of streamlining my life. 3. Intelligence without ignoring privacy Apple leaning on Google for AI will understandably make some people nervous. Privacy is still Apple's biggest selling point, and it cannot afford to fumble that trust now, especially when it's the main reason the company has gotten away with being behind in the AI race for so long. The good news is Apple has already laid the groundwork with on-device processing and Private Cloud Compute. The Gemini-powered Siri needs to feel just as safe with clear explanations of what data is used, when it leaves your device, and how it's protected. The experience should feel private by default, not something you need to opt out of, and if Apple gets this right, it could end up being the most privacy-conscious AI assistant on the market. 4.
Memory, memory, memory In my opinion, memory is one of the most important features of any AI chatbot, and if you live in the UK like me, you'll know Google still hasn't fully launched Gemini's memory functionality across the pond. I'm hoping Apple's Gemini-powered Siri can remember everything I do on my device while still operating within Apple's industry-leading privacy bubble. I know I've mentioned memory across multiple sections in this article, but I truly believe Apple could win the hearts of every AI-sceptic if it were able to create the ultimate personal assistant in your pocket - one that's capable of remembering where you had breakfast, when your next meeting is, and what the last movie you saw at the cinema was. Apple can play into its privacy-first approach to make AI memory impressive rather than creepy, and if it does, I'll be convinced about Apple Intelligence again. 5. Siri that finally understands the Apple ecosystem Siri never actually feels connected to your experience on an Apple device. For years, it has just been an extra on top of iOS or iPadOS, which often feels like a gimmick more than a useful tool. A Gemini-powered Siri should understand that your iPhone, Mac, iPad, Watch, and Apple TV are all part of the same life. Context should carry over between devices, and everything should sync seamlessly. If I start a conversation on my iPhone, I should be able to continue it on my Mac without starting from scratch. If Siri knows what I'm doing on one device, it should use that knowledge on another. This is where Apple can really shine. No one else controls hardware, software, and services the way Apple does. A smarter Siri should finally bring all of that together. The Siri we've been waiting for is just around the corner Apple confirming a Gemini-powered Siri is a big deal, but it's also an admission that the company needs to do better. Apple knows it fell behind, but 2026 could be the year it rectifies the wrongs. If Apple nails these five things, Siri could go from being a punchline to being one of the best AI assistants you can use. Not because it's the flashiest, but because it's the most useful in everyday life. After years of waiting, iPhone users deserve nothing less.
[18]
Apple plans to turn Siri into a full AI chatbot to take on ChatGPT and Gemini
Apple plans to turn Siri into a full-fledged AI chatbot later this year, as it scrambles to catch up with rivals like OpenAI and Google after years of falling behind in generative AI. The move follows reports that Apple plans to power Siri's AI overhaul using Google's Gemini technology. The overhaul would fundamentally change how Siri works. Internally codenamed Campos, the new Siri is expected to replace the current interface entirely and behave more like a conversational chatbot. Users would still summon it the same way, by saying "Hey Siri" or holding the side button, but the experience would feel closer to ChatGPT or Gemini, with the ability to type or talk back and forth more naturally. A more capable Siri is finally taking shape Bloomberg reports that the revamped Siri will be deeply embedded into iOS 27, iPadOS 27, and macOS 27, rather than living as a standalone app. It is designed to go well beyond today's Siri, with features like web searching, content creation, image generation, summarising information, and analysing uploaded files. The new Siri will also be tightly integrated with Apple's core apps, allowing users to do things like edit photos by voice, draft emails using calendar context, or find specific files, messages, and events more easily. The chatbot version of Siri is expected to be unveiled at WWDC in June as a key feature of iOS 27 and macOS 27, with a broader rollout likely in September. Before that, Apple plans a more limited Siri and Apple Intelligence update with iOS 26.4 this spring, which keeps the current interface but upgrades the intelligence using Google's Gemini models. That makes iOS 26 more of a stepping stone toward a bigger Siri reset later this year. Apple has not commented publicly on the report, but the scale of the overhaul points to growing pressure as rivals continue to bake conversational AI deeply into their platforms, leaving Apple with little room to keep Siri as it is.
[19]
Apple to 'unveil' results of Google Gemini partnership as soon as next month: report
Earlier this month, Apple and Google officially announced that they'd be partnering together. Apple has long struggled with its own model development, so now, Google Gemini models will power future Apple Intelligence features, using Apple's Private Cloud Compute servers. Today, Bloomberg's Mark Gurman reports that this partnership is on track to debut in iOS 26.4 beta as soon as next month, and Apple plans to demonstrate the features to the public in some capacity. He also reports some interesting new details on how this partnership came to be. New Siri features coming soon It feels like we've been saying new Siri is 'coming soon' forever now, but things should finally be coming to fruition very soon - now that Apple's models are out of the picture. Bloomberg's Mark Gurman reports: Today, Apple appears to be less than a month away from unveiling the results of this partnership. The company has been planning an announcement of the new Siri in the second half of February, when it will give demonstrations of the functionality. Whether that takes the form of a major event or a smaller, tightly controlled briefing -- perhaps at Apple's New York media loft -- remains unclear. Either way, Apple is just weeks away from finally delivering on the Siri promises made at its Worldwide Developers Conference back in June 2024. These new Siri features in iOS 26.4 will make Siri more aware of what's on your screen, it'll know more about you, and it'll be able to take actions in apps for you. The Gemini models powering these features on Private Cloud Compute are internally known as Apple Foundation Models v10, and use a 1.2 trillion parameter model. That's just the beginning, though - and Siri will be getting even more advanced in iOS 27. Last week, Gurman also reported that Apple will be implementing chatbot-style features throughout iOS 27 and macOS 27. These features will be powered by what's known as Apple Foundation Models v11, and these models should be close in quality to Gemini 3. He describes it as "significantly more capable than one supporting the iOS 26.4 Siri." These features might rely on Google's infrastructure, however, due to their much more advanced nature. Talks for iOS 27 features are still ongoing, though. Apple almost acquired another AI lab Leading up to the announcement of the Google Gemini partnership earlier this month, Apple was also exploring a few other pathways. Last June, when Bloomberg reported that Apple was exploring using a third-party model provider over its own models, leadership pushed back internally. Siri executive Mike Rockwell allegedly called the reporting BS in an emergency meeting at the time. Despite his words, Apple was also 'already in discussions' with both Anthropic and OpenAI to supply models for Apple Intelligence Siri. However, talks with Anthropic stalled around August, when the company wanted "several billion dollars annually over multiple years." Making a deal with OpenAI also wasn't favorable, as the company was actively poaching Apple employees and pursuing hardware with Jony Ive. After a court ruled that Apple's partnership with Google over Google Search being the default on the iPhone wasn't illegal, a deal with Google for Gemini became much more favorable. Wrap up All in all, we should be seeing our first Apple Intelligence Siri features very soon. It's hard to feel super excited right now, since it's been nearly two years since we first heard about the features, but I feel that'll probably change once I get to try things out for myself.
Are you looking forward to the debut of the first major Apple Intelligence Siri features? Let us know in the comments.
[20]
Siri will reportedly evolve into a full-fledged chatbot this fall
The upgraded Siri will feature a new interface, modular design for future updates, and enhanced privacy through Apple's Private Cloud Compute servers. A new report from Bloomberg's Mark Gurman details changes coming to Siri this fall, with the introduction of iOS 27, macOS 27, and the other annual updates. Siri, currently just a simple digital assistant, will undergo a transformation into a complete AI chatbot, with a new interface on iOS, iPadOS, and macOS. With the OS 26.4 update (likely to arrive within the next 2-3 months), Siri is still expected to undergo a massive overhaul. While the interface will remain the same, the "brains" of Siri will be swapped out for what is known internally as "Apple Foundation Models version 10." It is based on Google's Gemini technology and operates with 1.2 trillion parameters. This version of Siri will be smarter and more capable, including the ability to get information from the web. It will also bring the capabilities Apple promised back in summer 2024 -- the ability to read and understand the user's screen, take actions within apps, and build a profile of the user based on their interactions so it can operate with "personal context" that is relevant to the individual. The next big update comes a couple of months later in the OS 27 updates with a version of Siri code-named Campos. This will be built on an even more capable foundation model, Apple Foundation Models 11, which Gurman claims is comparable to Gemini 3. In the coming OS 27 update, users will summon Siri the same way they do now, by saying the "Siri" codeword or holding the side button. But the interface will change from a simple listening assistant to a full-fledged chatbot interface. It will do what chatbots today do -- answer questions, search the web, create text and images, and analyze uploaded files. But it will also be able to recognize open windows on a Mac or screen contents on iOS or iPadOS in order to take actions or suggest commands. It will of course still be able to do things third-party chatbot services can't today, like control smart home devices, set timers, and control device options and features (like enabling airplane mode or turning on the flashlight on your iPhone). Siri will also be more deeply integrated into Apple's built-in applications like Mail, Photos, Music, Podcasts, TV, and Xcode. So users will be able to issue commands like finding a photo based on specific parameters and editing it in a specific way. "Take the photo I took at the park yesterday and crop it to show only my face" is the sort of thing it seems Apple is aiming to enable. The report says Apple is currently discussing how much about the user the chatbot will be allowed to remember. Chatbots such as ChatGPT and Gemini can remember past interactions and conversations to better fulfill requests according to the user's past preferences and details. But Apple may severely limit this ability in the interest of protecting user privacy. This could be particularly important as Apple is currently discussing whether to perform cloud-based AI activity for this future Siri on Google's own Gemini servers, which run the company's own powerful TPUs (Tensor Processing Units). The Siri update coming in OS 26.4 will run on Apple's own Private Cloud Compute servers, powered by Apple Silicon. Finally, Bloomberg reports Apple is designing Campos, the OS 27-era Siri, in a modular way so that the underlying models can be swapped out over time.
This will allow them to move away from Gemini-powered models in the future in favor of their own, or move to another company's model. It will also make it easier to change this feature to work in restricted regions where it may be required to operate with a foundation model from a specific company. This report neatly builds on Enchanté, the Apple internal chatbot for testing and productivity we exclusively reported earlier today.
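On the modular design point, the pattern being described is a familiar one in software: hide the model behind a narrow interface so the backend can be swapped without touching anything that calls it. The Swift sketch below is purely illustrative — the protocol, type names, and canned replies are hypothetical, not anything from Bloomberg's report — but it shows the general shape of a swappable foundation-model backend.

```swift
import Foundation

// A minimal interface that any foundation-model backend could sit behind.
protocol FoundationModelBackend {
    var name: String { get }
    func reply(to prompt: String) -> String
}

// Hypothetical Gemini-based backend (stubbed; a real one would call a hosted model).
struct GeminiBackend: FoundationModelBackend {
    let name = "Gemini-based model"
    func reply(to prompt: String) -> String {
        "[\(name)] canned reply to: \(prompt)"
    }
}

// Hypothetical in-house (or region-specific) backend that could replace it later.
struct InHouseBackend: FoundationModelBackend {
    let name = "In-house model"
    func reply(to prompt: String) -> String {
        "[\(name)] canned reply to: \(prompt)"
    }
}

// The assistant depends only on the protocol, so swapping models doesn't touch callers.
struct Assistant {
    var backend: any FoundationModelBackend
    func ask(_ prompt: String) -> String { backend.reply(to: prompt) }
}

var assistant = Assistant(backend: GeminiBackend())
print(assistant.ask("Summarize my unread email"))

assistant.backend = InHouseBackend() // the swap is one assignment; the rest of the stack is unchanged
print(assistant.ask("Summarize my unread email"))
```

The appeal of this kind of design is that moving from a Gemini-based model to an in-house or region-specific one becomes a configuration change rather than a rewrite of the assistant.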
[21]
Apple reportedly revamping Siri as company's 'first artificial intelligence chatbot' -- and it 'will go well beyond the abilities of the current Siri'
We've been waiting for an overhauled Siri for nearly two years, but Apple may already be developing a Siri 3.0 that could debut in late 2026. According to a new report from Bloomberg's Mark Gurman, Apple is in the process of building a new chatbot -- dubbed Campos -- that will finally give Siri true AI features. This news comes as Apple is supposed to finally release the upgraded Siri 2.0 along with iOS 26.4 this spring. That's when we should get a more AI-capable Siri that has been missing since Apple introduced Apple Intelligence in 2024. Right now, Siri is not considered a chatbot like OpenAI's ChatGPT or Google's Gemini, both of which have undergirded Apple Intelligence, with Gemini running Siri soon. Siri doesn't truly have a chat-like back and forth like those assistants. According to Gurman, Campos will be embedded in iOS, iPadOS, and macOS and be "summoned" in the same manner you do with Siri. Campos is apparently a major part of Apple's plan to right the ship on AI, where it has struggled. The overhauled bot will be able to act like ChatGPT or Gemini. However, unlike those, it will be integrated into core Apple apps like Mail, Music, Podcasts and Photos. It would even be able to control features and settings like making phone calls, setting timers, and opening the camera. What's coming for Siri and Campos When iOS 26.4 finally drops, it should bring AI capabilities that were announced back in 2024 but have yet to actually debut. These include giving Siri the ability to analyze content on your screen and use personal data for recommendations. The chatbot capabilities, likely boosted by Gemini, won't launch until later this year. Gurman claims Apple will announce features during the Worldwide Developers Conference in June. Campos is supposed to have voice and text-based chat features. This new chatbot is supposedly being integrated into iOS 27, iPadOS 27, and macOS 27, all three of which should debut alongside the iPhone 18 series in September. The last couple of years, Apple has been slow-rolling operating systems with major features held back for huge updates later in the year. It's not clear from Gurman's report how much of Campos will be available when iOS 27 releases or if it will be released piecemeal. A change of tune In June, Tom's Guide global editor Mark Spoonauer spoke with Apple's senior vice president of software engineering, Craig Federighi, and Greg Joswiak, the senior VP of worldwide marketing, to discuss Siri and Apple Intelligence. At the time, Federighi said that Apple was not interested in pursuing a chatbot, saying that people prefer AI that is woven into the user experience and not a separate model. He said Apple wanted to "meet people where they are" with AI. "We aren't defining what Apple Intelligence was to be a chatbot," he said. "It was never the goal." However, Apple has lagged behind AI competitors, which can be seen in the way the company has struggled to bring an AI-functional version of Siri to the table. This change also follows AI leadership changes, with Federighi taking over for the beleaguered AI chief John Giannandrea this past December and hiring Amar Subramanya as vice president of AI. Subramanya previously led engineering for Google Gemini. With WWDC 2026 in June and the iPhone launch in September, we'll get our first look at how a Gemini-based Campos AI will look in the coming months. Eventually, Apple believes it can swap out Gemini for its own internally built models.
[22]
Apple's Siri Chatbot May Run on Google Servers
Apple is considering a significant shift in how it operates Siri by potentially running its next-generation chatbot on Google's cloud infrastructure rather than entirely on its own Private Cloud Compute servers, according to Bloomberg's Mark Gurman. In yesterday's report detailing Apple's plans to turn Siri into a chatbot in iOS 27, Gurman said that the company is in discussions with Google about hosting the forthcoming Siri chatbot on Google-owned servers powered by Tensor Processing Units (TPUs), a class of custom chips designed specifically for large-scale artificial intelligence workloads. The arrangement would mark a major departure from Apple's emphasis on processing user requests either directly on-device or through its own tightly controlled Private Cloud Compute infrastructure. In a potential policy shift for Apple, the two partners are discussing hosting the chatbot directly on Google servers running powerful chips known as TPUs, or tensor processing units. The more immediate Siri update, in contrast, will operate on Apple's own Private Cloud Compute servers, which rely on high-end Mac chips for processing. The near-term Siri improvements in iOS 26.4 are still expected to run on Apple's own Private Cloud Compute servers, which the company unveiled in 2024 as a privacy-focused alternative to on-device processing. Private Cloud Compute relies on Apple-designed servers built around high-end Mac chips, and Apple has positioned the system as one where user data is processed temporarily and not retained, not even being accessible to Apple itself. Those claims have been central to Apple's public messaging around Apple Intelligence. The more advanced Siri chatbot planned for the following major operating system update is expected to rely on a newer and more capable large language model developed by Google. This model is internally referred to as Apple Foundation Models version 11 and is comparable in capability to Google's latest Gemini models. Running such a model at scale may exceed the practical capacity of Apple's current Private Cloud Compute infrastructure, prompting the need to use Google's significantly larger, specialized cloud footprint and AI hardware. The possibility of running Siri requests on Google servers does not necessarily mean Google would gain access to user data in a conventional sense. Apple already relies on third-party cloud providers, including Google, for parts of iCloud's infrastructure, while retaining control over encryption keys and data handling policies.
[23]
Apple Might Turn Siri Into an AI Chatbot to Rival ChatGPT
The new interface would be in addition to the personalization upgrades previously rumored for spring. Last week, Apple finally admitted it will need to team up with Google to finally make good on that contextual Siri promise it made two years ago, which would have allowed the virtual assistant to integrate with content like your texts or emails to answer personal questions and take actions for you. Now, according to a new report, the iPhone company might actually go one step further and turn Siri into a full-fledged AI chatbot -- one on par with the likes of ChatGPT, and perhaps even more sophisticated. Currently, Siri has AI implementation, but only technically, and it's certainly underwhelming: You can use it to get tech support on Apple products or shunt questions off to ChatGPT, but otherwise, Siri basically works as it always has. But according to Bloomberg's Mark Gurman, who has reliably reported on insider information at Apple before, the company is finally not only looking to make Siri smarter, but also change the way you interact with it. Currently planned for iOS and macOS 27 under the name "Campos," Siri's new chatbot interface will still be powered by Gemini, but will allow you to both type and talk to Siri, with full continuity between your conversations. This upgrade will be in addition to the overdue features that were already announced. In other words, it'll look something like the chatbot interface from the ChatGPT app or the standalone Gemini app. Yes, you can technically type to Siri right now, but it mostly works like a separate input method, rather than as a full conversation. You can't scroll through your previous questions to Siri or peruse the assistant's previous answers, and if you ask Siri to reference a message you sent it two weeks ago, it'll have no idea what you mean. That's far behind what other AI chatbots offer right now. The update will also apparently further expand Siri's capabilities even beyond the contextual or personalization upgrades that were already revealed. Gurman says that, while the contextual upgrades will be able to pull information from other apps like Messages, the chatbot-style Siri will be "integrated into all of the company's core apps, including ones for mail, music, podcasts, TV, Xcode programming software and photos." Essentially, Siri will have more access to your iPhone than other AI chatbots, and those integrations will go beyond what was previously promised. That could make it more or less appealing to you, depending on your tastes in AI integration. When the new Siri could arrive With the chatbot interface planned for iOS 27, it's likely to come after the contextual upgrades, rather than at the same time. That's because, as Gurman said previously, those upgrades are set for the spring. He predicts we'll learn more about it during this year's WWDC, which, if it follows the standard set by previous years, will take place in June. The move to turn Siri into a chatbot could come across as a much-overdue modernization, as Google has already done the same with Gemini over on Android, but it's also a bit of a surprise, as Apple had previously said it did not intend to turn Siri into a "bolt-on chatbot on the side" for Apple Intelligence. But Apple was likely talking about the quality of the experience rather than expressing any significant anti-chatbot bias among the development team, so the fact that Siri is turning into a chatbot could mean the company is finally happy with the direction it's headed.
But it's also possible that the professed skepticism about turning Siri into a chatbot was meant to appeal to AI skeptics in general. Unfortunately, if you're still skeptical about AI, it currently seems like iOS 27 will be a boring update for you, as Gurman indicated the new Siri chatbot will be the "primary new addition" to the operating system. However you feel about it personally, Siri as a full-fledged AI chatbot could seriously upset ChatGPT's market dominance -- ironic, given its early integration with Apple Intelligence. Currently, OpenAI has reportedly admitted it's in a Code Red situation, as it is losing market share to Google and introducing ads to bolster its bottom line. The new Siri, being powered by Gemini, is unlikely to hurt Google (although it will have more access to your phone than the standalone Gemini app), but its ease-of-access might make it the new go-to for iPhone users, and that could hurt pretty much every AI company Apple isn't in business with directly.
[24]
Apple is reportedly remaking Siri in ChatGPT, Gemini style
The AI upgrade to Siri, Apple's creaky old voice assistant, has been brewing for some time. Now we have more details, courtesy of a report from Apple super-scooper Mark Gurman. As expected, the new version of Siri -- codenamed Campos -- will use a fine-tuned version of Google Gemini for its on-board intelligence. But the bigger claim is how embedded it will be throughout the iPhone experience -- and the features it is adding in common with Gemini and its OpenAI rival, ChatGPT. Here's how Gurman describes what Apple insiders consider the "central feature" of its all-new Siri, apparently set to be unveiled by Tim Cook at WWDC 2026: "A chat-like feel and the back-and-forth conversational abilities of OpenAI's ChatGPT or Google's Gemini." If unveiled in June and released in September as reported, all iPhone, iPad, and Mac users will be able to upgrade to ChatGPT-like chatter, courtesy of pressing a button or saying the magic word "Siri." Everything they could formerly do on the ChatGPT app would be directly accessible anytime they're holding their phone. And oh yes, word to OpenAI (and Google, for that matter) -- Apple almost certainly won't include ads, as the free model of ChatGPT is about to add. Apple's business model has long revolved around selling luxury hardware with the least software friction. What would a ChatGPT-like Siri mean? If this all turns out to be the case, the result could be earthshaking for the AI business. Here's why: The iPhone is in the lead, breaking away from Samsung for the first time as of this year. Increasingly, it's the world's bestselling smartphone. If the entire iOS gets a chatbot upgrade baked in, and if ChatGPT doesn't give you anything you can't get via Siri, including the friendly chatty l'il know-all chatbot experience, the question becomes: How many out of the estimated 67 million daily active users of ChatGPT would still want the friction of opening an app? A mass migration on iOS would come at the worst possible time for Sam Altman, who is reportedly burning through a billion dollars of OpenAI's war chest a month. And Altman's problems may pale in comparison to Nvidia's if another part of Gurman's report pans out. Google and Apple are discussing hosting those millions of iPhone-based Siri chats on Google servers, using the company's specialized TPU chips -- a direct rival to Nvidia's line of GPU chips that have made it, for the moment, the most valuable company in the world. Apple, currently the third most valuable company in the world, in concert with Google, now the second-most valuable, may be about to turn that around -- even before the arrival of a new CEO.
[25]
I've been an Apple user for 15 years - here's why I'm actually excited about the Siri chatbot U-turn
* A new report says Apple is working on an AI-powered Siri chatbot * This could come with new features its rivals can't match * I'm excited for what it might mean for Apple's devices After what seems like an eternity of denials, it finally looks like Apple has caved to the pressure and plans to release its own Siri-branded chatbot to compete with the likes of ChatGPT and Google Gemini. And as a long-time Apple user, I've got to admit that I'm actually pretty excited about that. These days, I find myself spending more and more time turning to ChatGPT rather than Google Search whenever I have a question. I haven't abandoned Google completely, but the back-and-forth nature of a chatbot - where I can refine my question, ask for follow-up information, and get targeted answers - is particularly useful when I have a specific problem that Google just can't seem to resolve. That's exactly the kind of helper I've long felt has been missing from Apple's own suite of apps and operating systems. Sure, the Spotlight search tool got more powerful in macOS 26, and Apple has started weaving artificial intelligence (AI) into its operating systems, but it's no secret that Siri lags well behind its AI rivals in this area. Yet far from just catching up with the rest of the market, it looks like the Siri chatbot could have some significant advantages over its competitors, as reported by Bloomberg's Mark Gurman. The journalist says that "Unlike third-party chatbots running on Apple devices, the planned offering is designed to analyze open windows and on-screen content in order to take actions and suggest commands. It will also be able to control device features and settings, allowing it to make phone calls, set timers and launch the camera." These are the features that were touted at Apple's Worldwide Developers Conference (WWDC) 2024. Combined with Siri's upcoming chatbot functionality, they could transform Apple's virtual assistant into an all-around helper that offers a suite of features that even its strongest rivals can't match. That's a lot to hope for, but if Apple can even come close, we could soon see a significant shake-up in the AI market. Reasons to be hopeful Interestingly, the rumors suggest that Apple will take a similar approach with the Siri chatbot. Instead of a standalone app as you get with ChatGPT, Gemini, and other chatbots, Apple will instead "integrate the software across its operating systems, like the Siri of today," Gurman says. That strikes me as the right approach. Given the amount of power it could potentially have, I want the Siri chatbot to be available in whichever app I'm using. Having it as a separate app risks breaking your flow and interrupting your work. Mixing it into existing apps could be a much more natural way to interact with the AI. And let's not forget all the updates we were promised back in 2024, which I mentioned earlier, including the ability to understand your personal context and work within and across apps. With both of these major revamps planned for 2026, this could be the biggest year in Siri's history - and might finally help Apple catch up in the high-stakes world of AI. There are risks to Apple's about-turn, though. For one thing, Gurman reports that Apple might have to rely on Google's servers in order to quickly scale up Siri's capacity. With much of Google's business model centered on advertising and data collection, user interactions with a chatbot would be a tempting target for the company. 
Yet here, I'm confident that Apple will be able to ensure that your info is protected and sectioned off from any kind of data collection, just as it did when it announced a "multi-year collaboration" to power Siri using Google Gemini models. If Apple is able to empower Siri while protecting user privacy, that's a win-win situation. That all means that 2026 is shaping up to be a make-or-break year for Siri. Based on what we've heard so far - and Apple's willingness to change tack and launch a Siri chatbot - I'm optimistic for the future.
[26]
Siri set to become Apple's first AI chatbot in late 2026?
As Apple continues to play catch-up in the AI landscape, Siri looks set to be revamped into the iPhone maker's first AI chatbot. Having fallen well behind rivals like OpenAI, Google and Microsoft, and having underwhelmed users with its Apple Intelligence product, Apple is betting on a revamped Siri AI chatbot to turn its AI fortunes around, according to Bloomberg. Google will supply the underlying technology. A long-anticipated deal was inked earlier this month and in a joint statement on 12 January, Apple and Google said they had "entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology". "These models will help power future Apple Intelligence features, including a more personalized Siri coming this year," the companies said. "After careful evaluation, Apple determined that Google's AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users." Now sources have told Bloomberg the chatbot is code-named Campos and "will be embedded deeply into the iPhone, iPad and Mac operating systems and replace the current Siri interface". The chatbot is likely to be released as part of iOS 27. The Siri AI chatbot appears to be separate from the long-promised Siri update expected earlier in the year in iOS 26.4. Meanwhile, The Information is reporting that Apple is working on a "small, wearable AI pin equipped with multiple cameras, a speaker, and microphones", which may run the new Siri chatbot that Apple plans to unveil in iOS 27. This all comes at a time when OpenAI looks set to launch their first physical devices later this year, thanks to a longtime collaboration with former Apple design guru Jony Ive. So there must be quite the sense of urgency.
[27]
Despite everything, a Siri chatbot does seem the right way to go
A report yesterday said that we will see a Siri chatbot as part of iOS 27, despite the company previously dismissing this idea. If accurate, this will see the company adopt a two-stage strategy to finally giving the new Siri the smarts it has long lacked ... From science fiction to embarrassment I've said before that when Apple first launched Siri way back in 2011, it seemed like a big step toward something that had previously been science fiction. It was the feature that had me upgrade to the iPhone 4S, and I was very impressed with it in those heady early days. Fast forward to 2026, however, and the so-called intelligent assistant has long been an embarrassment for Apple for all the reasons we've discussed at length previously; I don't need to rehash them here. There is one Siri shortcoming I do need to discuss, but I'll get to that. Stage 1: A Gemini-powered Siri Apple has long promised a much smarter new Siri, which was originally going to be powered by Apple Intelligence. That ... did not go well. Things changed dramatically last week, however, when the company confirmed reports that the new Siri will be powered by Google's Gemini models. While OpenAI's ChatGPT models took a dramatic early lead in generative AI, it's now widely considered that Google's Gemini is at least their equal and in many respects their superior. Google's beta launch of its new Gemini-powered Personal Intelligence feature gives us a good preview of what we can expect from the new Siri. The core benefit is the model's ability to use a complex mix of sources to generate its responses, including personalized information pulled from the Apple apps and services we use. In terms of Siri being able to act as an intelligent agent to get tasks done, this will be a truly revolutionary improvement compared to present-day Siri. Stage 2: A Siri chatbot Apple initially expressed the view that chatbots were not a particularly useful user interface for an onboard intelligent assistant. I suspect the reason for this was that the company was very much focused on agentic AI capabilities - telling Siri what it is we want to achieve and having it use the onboard apps in order to complete the task. For example, we might ask Siri to book a table for dinner at that Thai restaurant we went to a few months ago and really loved. Siri might be able to retrieve the name of the restaurant using information it is able to pull from things like a text message confirmation of the previous reservation, our Apple Maps history, or a photo we took while we were there. It can then use a reservation app to book a table for us and let us know when we receive confirmation. I do personally think that agentic AI is set to be the most useful development in this field, but that doesn't mean that there is no role at all for a chatbot. The role of a chatbot I said there was one Siri failing I do need to discuss here, and that's the fact that it is appallingly bad at context. If you ask Siri one question or give it one instruction, and then immediately follow-up with a related question, it will often act like it has absolutely no idea what you were talking about just three seconds earlier. Q: Hey Siri, who played Batman in the most recent movie? A: in May 2019, Robert Pattinson was cast as Bruce Wayne/Batman Q: What about Robin? A: I don't know who Robyn is This is an area where chatbots shine, getting progressively better at it over time. 
For example, these are the opening responses from a series of questions I just asked ChatGPT, deliberately using the kind of vague language a typical non-techie might use. That's a flow Siri can only dream of at present. ChatGPT can even reference topics we discussed days or weeks ago. (Though it claims it didn't know it was referencing something I wrote, when I asked the question half an hour later.) Of course, all the usual chatbot cautions apply: incorrect answers, outdated information, and sometimes wild hallucinations. (Although the examples I give above are using it like Google, because that's what most people do, I don't personally tend to use it in that way: I mostly use it for brainstorming ideas.) We should expect plenty of glitches from the new Siri. But none of that changes the fact that conversational flows are an extremely useful capability, so I do think Apple has made the right decision to go this route as a supplement to agentic AI features. Do you agree or disagree? Please share your thoughts in the comments.
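Stepping back to the mechanics for a moment: the conversational context that chatbots handle well is usually nothing more exotic than resending the accumulated transcript with every request. The toy Swift sketch below is not how Siri, ChatGPT, or Gemini is actually implemented — the Turn type and answer function are invented for illustration — but it shows why a vague follow-up like "What about Robin?" can be resolved when the earlier turns travel along with the new question.

```swift
import Foundation

// One turn of a conversation. Chatbots keep context by resending the whole history each time.
struct Turn {
    enum Role { case user, assistant }
    let role: Role
    let text: String
}

// Stand-in for a model call; a real implementation would send `history` to a hosted model.
func answer(history: [Turn]) -> String {
    guard let current = history.last(where: { $0.role == .user }) else { return "Ask me something." }
    let earlierQuestions = history.dropLast()
        .filter { $0.role == .user }
        .map { $0.text }
    return earlierQuestions.isEmpty
        ? "Answering: \(current.text)"
        : "Answering: \(current.text) (read in the context of: \(earlierQuestions.joined(separator: " | ")))"
}

var history: [Turn] = []
for question in ["Who played Batman in the most recent movie?", "What about Robin?"] {
    history.append(Turn(role: .user, text: question))
    let reply = answer(history: history)   // the accumulated transcript travels with the request
    history.append(Turn(role: .assistant, text: reply))
    print("Q: \(question)\nA: \(reply)\n")
}
```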
[28]
Apple is about to deliver on the next-gen Siri promises it made in 2024
TL;DR: Apple has partnered with Google to integrate Gemini AI models, addressing delays in upgrading Siri's intelligence. The enhanced Siri, featuring personal data access and on-screen content recognition, will debut in iOS 26.4 beta next month, with a full launch expected at WWDC 2026 alongside iOS 27 and macOS 27. Apple has fallen significantly behind the competition when it comes to creating an in-house AI model capable of sufficiently upgrading Siri to the level the company advertised in 2024, and following reports of internal struggles with Apple Intelligence-related development teams, Apple finally swallowed its pride and signed a deal with its direct competitor, Google, for access to its Gemini AI models. Apple intends to use Gemini to power upcoming Apple Intelligence features, and ultimately give Siri a most welcome upgrade in the intelligence department. Initially, Apple was in discussions with Anthropic for access to its models, but following drawn-out conversations and an unfavourable payment structure for Apple, Anthropic was no longer an option. OpenAI was another possible contender, especially since the two companies have already inked a deal to integrate ChatGPT into Siri, which is triggered only when the user's prompt can't be fulfilled by Apple's own AI model. However, given that OpenAI is currently working on releasing its first physical device, which is being helmed by former Apple designer Jony Ive, and that device is meant to be a way users can communicate with AI, Apple recognized potential conflicts of interest with going with OpenAI. That left Google, which is now providing Apple with Gemini models for Siri and future Apple Intelligence features. So, when will we see the fruits of this new partnership? According to Bloomberg reporter Mark Gurman, Apple is expected to announce the upgraded Siri during the second half of February, where it will showcase the new and improved functionality. Gurman writes that the voice assistant should be able to tap into personal data and "see" on-screen content. Those new features are expected to be released in iOS 26.4, which is scheduled to enter beta testing next month and be released publicly in March or early April. The fully reimagined Siri has an unveiling planned for Apple's Worldwide Developers Conference 2026, and that next-generation Siri will debut in iOS 27, iPadOS 27, and macOS 27.
[29]
Apple to revamp Siri as system-level AI chatbot in iOS 27
Apple will reportedly transform Siri into a chatbot, integrating it into iOS 27, according to a Bloomberg report. This development could become a central announcement at Apple's Worldwide Developers Conference (WWDC) in June. The Siri chatbot, internally designated "Campos," will support both voice and text inputs. Apple Senior Vice President Craig Federighi had previously indicated a preference against Siri operating as a chatbot, advocating for integrated AI functionality. This shift appears to stem from heightened competitive pressures from other companies' AI chatbot successes. Apple reportedly faces a potential threat as OpenAI plans to enter the hardware market, led by former Apple design head Jony Ive. The company has lagged in the AI sector, delaying a "more personalized Siri" multiple times. Last year, Apple explored partnerships with AI firms, including OpenAI and Anthropic, before confirming earlier this month its selection of Google's Gemini as its AI partner.
[30]
Will Apple Charge for Its Siri Chatbot?
Apple is apparently working on a Siri chatbot that will rival Claude, Gemini, and ChatGPT, and Apple is aiming to debut it in less than six months when iOS 27 is unveiled at WWDC. Bloomberg shared details on the chatbot earlier today, but there was one major question unanswered: what will Apple charge? Anthropic, Google, OpenAI, and other companies that run major chatbots offer a free version, but it's often throttled and a paid subscription is required for full functionality. Apple is reportedly planning to integrate its Siri chatbot deeply into iOS, iPadOS, and macOS instead of offering a standalone app. A Siri chatbot available on billions of devices is going to be expensive to run, but Siri is also so core to Apple products that people aren't going to want to pay for what's always been free. What the Siri Chatbot Can Do Per Bloomberg, the Siri chatbot will be able to "search the web for information, create content, generate images, summarize information and analyze uploaded files." It will also be able to control Apple devices and use personal data and on-screen information for search and to complete tasks. That sounds like just about everything that existing chatbots like ChatGPT can do, plus Apple is integrating the chatbot into all of its apps. On-Device Siri Chatbot? Some of those tasks can be completed on-device using the powerful A-series and M-series chips Apple has been building into its products, but Apple is using a custom AI model developed with the Google Gemini team. According to Bloomberg, the model is roughly comparable to Gemini 3, and the full version of Gemini 3 can't run on a high-end Mac, let alone a mobile device. Apple is going to need servers to run the Siri chatbot, and while it has been building Private Cloud Compute servers for AI features, it's unlikely that it has enough for a Siri chatbot. Bloomberg suggests that Apple is actually discussing running its chatbot on Google servers, and Google isn't going to do that for free. Compute Costs and Infrastructure Whether Apple is using its own private cloud compute servers or Google's Tensor servers, it needs serious compute power. Every question Siri is asked and every image Siri generates will cost Apple. OpenAI is not profitable, and it spends billions on inference each year. OpenAI has committed to spending $1.4 trillion on infrastructure to keep up with demand, an amount of money that it doesn't have yet. Google spent $85 billion on infrastructure to meet AI demand in 2025. In August, Google said that the median Gemini Apps text prompt uses 0.24 watt-hours of energy. At scale, across all Google devices and all Google products, that's hundreds of millions of dollars per year just in electricity costs. How Gemini is Priced Google has already integrated Gemini into its Pixel smartphones and other Android devices. It has a split tier system that Apple might adopt. Android users have access to a free version of Gemini that costs Google less to run. It can answer questions, summarize text, write emails, and control apps and smartphone features. Android users have to pay $20 per month for Gemini Advanced to get access to the more advanced version of Gemini that offers better reasoning, longer context for analyzing bigger documents, and improved coding. Apple could do something similar, offering a basic version of Siri that's accessible to everyone, with more advanced models available with a subscription. iCloud already provides a model for a free/paid product split. 
Apple offers all Apple users 5GB of cloud storage for free, but anything more will cost you. Temporarily Free? Apple could make its Siri chatbot free to use to begin with, which would lure users who are paying for other services like ChatGPT. ChatGPT, Claude, and Gemini are all around $20 per month, so Apple eating Siri chatbot costs for a year or two would be hard to compete with. Even undercutting current prices would likely lure customers and make Apple an immediate key player in the AI market. Right now, Apple Intelligence is entirely free to use even for images generated with Image Playground, but the capabilities are limited and some functionality runs on-device. Possible Cost Apple might not be able to absorb AI costs, and there could be paid options right when the Siri chatbot launches. If that's the case, pricing will likely be competitive with existing chatbots. AI companies have decided entry-level plans should cost $20/month, but it's not clear if that price point is actually sustainable with the growing costs of training new models and supporting more users. * ChatGPT Plus - $20/month * Copilot Pro - $20/month * Gemini Advanced - $19.99/month * Claude Pro - $20/month * Perplexity Pro - $20/month Siri ChatGPT Integration Right now, Apple has a partnership with OpenAI to hand complex requests off to ChatGPT. Apple doesn't pay OpenAI for this feature, but it does put ChatGPT in front of millions of Apple users. When Apple launches its Siri chatbot, ChatGPT integration could be removed. Eliminating the ChatGPT integration might also impact Apple's legal battle with Elon Musk. Musk's xAI company sued Apple and OpenAI for colluding to promote ChatGPT over other AI bots like Grok, arguing that Apple should let other chatbots integrate with Siri. If Apple stops offering ChatGPT through Siri in favor of its own Siri chatbot, it would be no different than Google integrating Gemini into all Android devices, or Meta limiting its smart glasses to Meta AI. Launch Timing We'll probably be hearing more about the Siri chatbot in the coming months. Apple is aiming to unveil the functionality in iOS 27, iPadOS 27, and macOS 27, which will be previewed in June at WWDC.
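Circling back to the compute-cost section above, the only reported number there is Google's figure of 0.24 watt-hours for the median Gemini text prompt; everything else in the back-of-the-envelope sketch below (daily prompt volume, electricity price) is an illustrative assumption, not a reported figure, and the result covers electricity only — no hardware, cooling, networking, or staffing. Even with those caveats, it shows how quickly per-prompt energy scales into a meaningful line item.

```swift
import Foundation

// The only reported figure: Google's median energy per Gemini text prompt.
let wattHoursPerPrompt = 0.24

// Illustrative assumptions (NOT reported figures): total daily prompt volume
// across all products, and an average all-in electricity price per kWh.
let promptsPerDay = 10_000_000_000.0   // assumption: 10 billion prompts per day
let dollarsPerKWh = 0.15               // assumption: $0.15 per kilowatt-hour

let kWhPerYear = wattHoursPerPrompt * promptsPerDay / 1_000.0 * 365.0
let dollarsPerYear = kWhPerYear * dollarsPerKWh

print(String(format: "Energy per year: about %.0f GWh", kWhPerYear / 1_000_000.0))
print(String(format: "Electricity alone: about $%.0f million per year", dollarsPerYear / 1_000_000.0))
// With these assumptions: roughly 876 GWh and about $130 million per year,
// before counting hardware, cooling, networking, or staffing.
```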
[31]
Apple to Show Off New Siri Running on Google Gemini in February
Apple's been working on a Siri overhaul for what feels like forever. The company keeps pushing back release dates. According to Bloomberg's Mark Gurman, Apple is planning a February reveal for its revamped, Gemini-powered Siri, with demonstrations showing what the partnership actually delivers. It's unclear whether Apple will hold a full event or stick to private media briefings. This marks the first public look at Siri running on Google's technology since the companies confirmed their $1 billion partnership earlier this year. The new assistant is codenamed Campos internally. It represents a complete reversal for Apple executives who spent months insisting they didn't want a chatbot-style assistant. What the New Siri Can Actually Do The updated assistant should tap into personal data and on-screen content to complete tasks. Apple first previewed these capabilities at WWDC 2024. The demo showed someone asking Siri about their mother's flight details and lunch plans. The assistant pulled information from Mail and Messages automatically. That kind of cross-app awareness has been missing from Siri for years. Competitors like ChatGPT and Gemini made it standard. However, the February demo is just a preview. According to Gurman, the new Siri will ship with iOS 26.4. That version enters beta testing in February before releasing to the public in March or early April. This means users of the iPhone 15 Pro and newer could actually use the Gemini-powered features within a few months. The update runs on Apple's own Intelligence framework with Google's technology integrated underneath. If you're worried about Google getting access to your data, you don't have to be: no user data gets shared with Google's servers. Then, come iOS 27, a full chatbot version of Siri is expected to be part of the update. It'll allow sustained back-and-forth conversations similar to ChatGPT or Gemini. The assistant will be built directly into iPhone, iPad, and Mac with no separate app required. Gurman says the chatbot will be competitive with Gemini 3. It'll be significantly more capable than the version launching in iOS 26.4. The timing makes sense for Apple. The company needs to deliver something impressive after delaying Siri's AI features multiple times and facing internal skepticism about whether the technology actually works reliably.
[32]
Apple's smarter, long-promised AI-powered Siri might be arriving in a few months, according to a new report
* A new report says the more personal Siri will launch with iOS 26.4 * This upgraded Siri will use Foundation Models built on Gemini * It comes just days after Apple confirmed a partnership with Google Just nine days after Apple partnered with Google to use Google's Gemini models and cloud technology to base its next generation of Apple Foundation Models, it appears that the AI-powered Siri that the Cupertino-based giant first promised to us all back in 2024 might be coming way, way sooner than any of us thought. According to a new report from Bloomberg's Mark Gurman, the rumor mill is doubling down on the rollout of the AI-powered Siri being released in Spring 2026 - now writing that it's "planned for iOS 26.4," which is likely due in the coming weeks or months. Right now, Apple is seeding iOS 26.3 to registered developers and folks enrolled in the public beta program. The report notes that iOS 26.4 will deliver on the promised features Apple first teased at WWDC 2024 as part of Apple Intelligence - things like Siri having a deeper personal understanding of you, based on what it can see on your screen, and being way smarter, meaning it could also pull up topical information in a jiffy, making it a much more impactful and meaningful user experience. More important, though, is the timing here - this report, which comes alongside a rumor that Apple will launch an AI chatbot with iOS 27, one that competes directly with Google Gemini and ChatGPT, notes that this updated Siri wouldn't likely be possible without the partnership with Google for that foundational base. The iOS 26.4 version of the AI-powered Siri "will rely on a Google-developed system internally known as Apple Foundation Models version 10," the report reads, before noting it operates at 1.2 trillion parameters. Clearly, access to Gemini models to build and stand Apple's own Foundation Models on top has sped things up and might mean that Apple will be able to deliver on the smarter, more personal Siri it's been promising for a long time, much sooner rather than later. As for when iOS 26.4 will arrive, it's likely in the spring, after the start of March and before the end of May. That means it would likely arrive before WWDC 2026, which is expected in June. Of course, Apple could save this news for a bit of consumer buzz at WWDC and ship 26.4 to all users with an Apple Intelligence-capable device, and thus access to the new Siri, right after the keynote wraps. It would be a surprise-and-delight moment, but would also push aside the usual public beta and developer testing periods. Either way, though, with Google Gemini, ChatGPT, and Claude all pushing what we've come to expect from AI tools and many of them becoming more personal, we'll have to wait and see if Apple's actually smarter Siri is worth the wait, and how it prepares us for the other big rumor: a Siri chatbot inside iOS 27. Time will tell, but Siri's major upgrade is still a long time coming.
[33]
Apple Spent Years Downplaying AI Chatbots. Now Siri Is Becoming One
Its position on what it thought customers wanted from artificial intelligence wasn't exactly subtle. In an interview just last year, Craig Federighi, Apple's senior vice president of software engineering, said the company never wanted to send users "off into some chat experience in order to get things done." Apple's philosophy, he argued, was that AI should be quietly woven throughout all of a device's features. I suppose that makes sense. Apple is uniquely positioned to offer a personalized version of AI that surfaces information based on your context and the apps you regularly use. The thing is, people actually like using chatbots. There's a reason ChatGPT is the fastest-growing app ever. Apple seems to have figured that out and changed its position.
[34]
Apple's AI-Powered Siri Chatbot Could Run on Google Cloud
Apple's Siri chatbot is said to be integrated across all of Apple's operating systems Apple is reportedly in talks with Google to use its cloud service infrastructure to power a new version of Siri. As per the report, the Cupertino-based tech giant is planning to introduce a chatbot version of Siri, which will be more capable and feature-rich. This is said to be separate from the artificial intelligence (AI)-powered Siri that is said to arrive this spring alongside the iOS 26.4 update. The iPhone maker is also said to use Google's AI chips for the Siri chatbot. Siri Chatbot Could Run on Google Cloud According to a Bloomberg report, the tech giant could look beyond its on-device chipsets and Private Cloud Compute servers for a future version of Siri that will act like a chatbot. Details about this Siri chatbot recently surfaced; it is said to be capable of back-and-forth conversations, image generation, on-screen content analysis, and completing certain on-device tasks. However, it appears that the iPhone maker is planning to use Google's cloud services to power the AI-powered Siri chatbot. The Bloomberg report claims that Apple is currently in discussions with the company to run the future version of Siri's compute on Google Cloud. This would make sense since it is said that the chatbot's brain will be a custom Gemini AI model, which is as performant as Gemini 3. One of the reasons Apple is reportedly considering using third-party infrastructure to power Siri is the ongoing global RAM shortage. It is said that the company has not secured enough memory chips to build out its AI data centres, and is forced to rely on cloud partners to provide the processing bandwidth. This would also mean that Siri will be powered by Google's tensor chips, which power its data centres. Apple is reportedly also considering partnering with multiple cloud providers in different markets. This decision is said to be partially influenced by government policies, and partially to reduce reliance on a single company. The report claims that in China, the tech giant could collaborate with Alibaba's cloud services to power certain Apple Intelligence features. The company's approach of focusing on partnerships to fill the gaps in its AI ambitions is said to come from Craig Federighi, Apple's Senior Vice President of Software Engineering. According to The Information, the executive has taken a cautious approach towards AI, limiting spending and reckless hiring of talent to build out its AI features.
[35]
The new Siri chatbot may run on Google servers, not Apple's
Nestled in Bloomberg's reporting earlier on Apple's plans to revamp Siri as a chatbot with iOS 27 was an interesting tidbit on a possible change in the company's cloud strategy. Specifically, Mark Gurman says Apple and Google are discussing running the next-generation Siri models directly on Google's servers, not Apple's. With iOS 26.4, Apple is set to launch the first new LLM Siri features, using models running in Private Cloud Compute based on an older generation of Gemini. But the Siri chatbot coming in the iOS 27 cycle will apparently be based off the newer, smarter, Gemini 3 models. Running these latest-gen models seemingly requires higher-performance servers than what Apple can deliver right now through its own Private Cloud Compute infrastructure ... As such, Apple user conversations with the new Siri chatbot experience would be trafficked through Google's (much larger) cloud infrastructure for every request. That would represent a big departure philosophically for Apple, which talked up Private Cloud Compute as a way to extend the privacy bubble of the personal data on your iPhone into Apple's cloud. However, the Private Cloud Compute idea was designed under a prior regime. When Apple was promoting Private Cloud Compute at WWDC 2024, it had not conceived needing to license Gemini models from Google at all. As such, it's not unthinkable that other elements of its AI plan may be in flux. While Apple would probably like to keep as much of its processing running on its own cloud, rather than a third party's, the new heads of Siri -- Craig Federighi and former Vision Pro exec Mike Rockwell -- seem to be choosing practicality over idealism. There is clearly pressure to 'catch up' and deliver modern Siri features to users as soon as possible, even if that means reneging on some parts of their prior published vision. After all, the company was also previously downplaying the premise of a chatbot experience altogether, but plans change and Apple is clearly responding to the proven market success of services like ChatGPT. It's also probably safe to assume that Apple would negotiate the Google cloud server arrangement such that sensitive user data is not logged or retained by Google, sufficiently siloed from Google's advertising and data collection units. Behind the scenes, parts of iCloud have depended on providers like Amazon Web Services and Google Cloud Platform since its inception. These third-party data storage partners enable features like iCloud Photos to operate and scale, all the while Apple holds the encryption keys. Back in 2021, it was reported that Google's cloud hosted an enormous 8 exabytes of iCloud content.
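The iCloud arrangement described above — third-party servers holding data that only Apple can decrypt — comes down to client-side encryption. The CryptoKit sketch below is a generic illustration of that storage model, not Apple's actual pipeline (and live AI inference is a different problem, since a model has to see a request in order to answer it): the key stays with the client, so the hosting provider only ever handles ciphertext.

```swift
import Foundation
import CryptoKit   // Apple framework, available on macOS and iOS

do {
    // The client generates and keeps the key; the storage provider never sees it.
    let key = SymmetricKey(size: .bits256)
    let userData = Data("Remind me about Mom's flight at 3pm".utf8)

    // Encrypt before upload: only this opaque blob is handed to third-party storage.
    let sealed = try AES.GCM.seal(userData, using: key)
    guard let blobForCloud = sealed.combined else { fatalError("unexpected nonce size") }
    print("Stored blob (base64):", blobForCloud.base64EncodedString())

    // Later, the key holder fetches the blob and decrypts it locally.
    let box = try AES.GCM.SealedBox(combined: blobForCloud)
    let plaintext = try AES.GCM.open(box, using: key)
    print("Decrypted locally:", String(decoding: plaintext, as: UTF8.self))
} catch {
    print("Crypto error:", error)
}
```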
[36]
Apple to unveil Gemini-powered Siri update in February: Report
Apple is set to announce a new version of Siri in the second half of February, according to a Bloomberg report published on Sunday. The update, which is central to the company's partnership with Google, will leverage the latter's Gemini AI models as Apple looks to evolve Siri beyond being a traditional voice-based assistant. Apple plans to equip Siri with advanced genAI capabilities, enabling it to perform tasks by drawing context from users' personal data as well as information displayed on their screens. The move aligns with the company's broader push into generative artificial intelligence under its Apple Intelligence initiative, a strategy that was prominently outlined at the Worldwide Developers Conference (WWDC) 2024. The February release is expected to precede a more substantial Siri overhaul that Apple plans to announce at the WWDC in June. The report added that this later version of Siri will be more conversational, similar to chatbots such as ChatGPT, and could operate directly on Google's cloud infrastructure. Earlier this month, Apple cited Gemini as the most capable foundation for delivering new user experiences. Reports of talks between the two companies first surfaced in August last year, when a Bloomberg report claimed Apple was exploring the use of a custom Gemini model for Siri. While the terms of the deal were not disclosed, it reported that Apple was planning to pay about $1 billion per year to utilise Google's AI technology. ET reported that Apple had been delaying its major Siri AI upgrade, acknowledging it would take longer than expected to deliver promised features. This also comes as Apple has seen significant C-suite exits since last year, particularly in its AI and design teams. xAI chief Elon Musk was, however, not pleased by the AI partnership between Apple and Google. "This seems like an unreasonable concentration of power for Google, given that [they] also have Android and Chrome," Musk wrote in a post on X. He serves as CEO of xAI, the company behind Grok, a direct competitor to Google's Gemini AI.
[37]
iOS 27 May Bring Siri AI Chatbot with ChatGPT-Style Conversations
Siri's been around since 2011, but let's be honest, it's never quite lived up to the hype. You ask it to do something simple, and half the time you get a web search instead of an actual answer. That might finally change this fall. Apple's planning to overhaul Siri into a full-fledged AI chatbot with iOS 27, according to the latest report from Bloomberg's Mark Gurman. The update's expected to launch in September 2026. Instead of the current voice assistant setup, you'd get ChatGPT-style conversations where you can type or talk back and forth naturally. The new Siri AI chatbot will be powered by a custom Google Gemini model called Apple Foundation Models v11. It'll be way smarter than what we have now. What the new Siri actually does The upgraded assistant will dig deep into your iPhone, iPad, and Mac apps. It'll handle tasks like analyzing what's on your screen, creating content, and using your personal data to help out. Think asking it to edit a photo with specific changes or plan your entire day based on your calendar and emails. It can spot scheduling conflicts automatically. You won't need to switch between apps or use rigid commands anymore. There's a smaller update hitting first with iOS 26.4 this spring. That will bring some Gemini-based improvements, but it keeps the current Siri interface. The real shift happens in the fall when the Siri AI chatbot launches with iOS 27. Apple's expected to show off the new chatbot at WWDC in June before the September launch. The company's emphasizing privacy by running everything through Apple's own cloud infrastructure. That sets it apart from standalone apps like ChatGPT that store your conversations on their servers. This is Apple playing catch-up after Apple Intelligence launched in 2024 to mixed reviews. The partnership with Google's Gemini models shows Apple's willing to work with competitors to get the AI right. If the rumors pan out, Siri might actually become useful for once.
[38]
Report: Apple does about-face on Siri chatbot -- and it might compete directly with ChatGPT and Google
Apple promised us this was never the plan... until, we guess, it became the plan: A new report from Apple soothsayer Mark Gurman says Apple will build an all-new Siri chatbot for iOS 27, one that looks and works differently but should remind most of us of an AI chatbot like ChatGPT. To say this is a possible about-face is an understatement, and it's just one piece of the whirlwind that is apparently now Siri development. Remember that pledge to deliver in early 2026 the Apple Intelligence Siri we were promised almost two years ago? That's apparently still happening, Gurman claims, via an iOS 26.4 update. If so, Siri will, with your permission, be able to see your screen, understand your actions in apps and other systems, and become a more proactive, smarter assistant. It's unclear if that ability is still what Apple meant when it announced earlier this month that it would use Google Gemini models to build its Foundation Models and enable the promised Siri. The only reason to doubt that iOS 26.4 will contain that is the timeline. For Apple to deliver that update by late May (before it unveils iOS 27 in June), it probably must have been working with Google since late last year. Again, Apple is not commenting, so we have no insight into the development process. Siri, we have questions What will power this potential Siri AI chatbot is also a question mark. Will it still be tapping into Gemini foundation models and therefore be more or less a skin for Gemini? That seems unlikely, or at least too much of an admission by Apple that it was never up to the task. But Gurman claims that not only will the iOS 26.4 update use those promised Gemini models (albeit built into Apple's Foundation Models), but the chatbot update (currently called Project Campos, per Gurman) will also rely in part on fresher, more powerful Gemini models and even use Google cloud servers. That last bit sounds bonkers because it erases the security promise of Apple's Private Cloud Compute. Sure, since iOS 18, it's transparently sent some queries to either ChatGPT or Google, but to send potentially all Siri chatbot prompts there is tantamount to abandoning a core privacy promise. The problem with this Gurman theory is that it contradicts what Federighi told me almost a year ago at WWDC 2025 when we asked about what happened to the development of that long-promised Siri update. "This wasn't about just building a chatbot," said Federighi last year, adding, "...That was never the goal, and it remains not our primary goal." A hint of truth So who are we to believe: Gurman or Apple executives? To be fair to Gurman, the only recent comment we have from Apple is that they are partnering with Google (Gurman claims it's for a $1 billion-a-year price tag, paid to Google). So it stands to reason that there is some truth here. I agree with Apple that Google's Gemini provides some of the best models, and it is clear that Apple can't get there on its own, but handing the privacy keys to a chief competitor, one that already rules so much of our data, looks pretty anti-Apple. That's why I'll take all these rumors with a massive helping of salt and await Apple's next big AI reveal.
[39]
Apple's Bold Move: Siri Gets Smarter with Google's Gemini AI
Apple is making a pivotal move to enhance Siri, its renowned voice assistant, by integrating Google's Gemini AI model. This decision represents a significant departure from Apple's tradition of relying solely on proprietary solutions. By adopting Gemini's advanced artificial intelligence capabilities, Apple aims to address Siri's long-standing limitations and elevate it to compete with leading AI-powered assistants like Google Assistant and ChatGPT. This development has far-reaching implications for the future of digital assistants and how they integrate into your daily life. The video below from ZONEofTECH gives us more details on what to expect from Siri 2.0. The Rise and Stumble of Siri When Siri was introduced in 2011, it transformed the way people interacted with technology, offering a glimpse into the potential of voice-activated assistants. However, over the years, Siri has struggled to keep pace with advancements in artificial intelligence, leaving users increasingly dissatisfied. Some of Siri's most notable challenges include: * Limited contextual understanding: Siri often fails to grasp the nuances of user queries, leading to incomplete or irrelevant responses. * Weak natural language processing: Its ability to interpret and respond to conversational language has lagged behind competitors. * Inability to handle complex queries: Siri struggles with multi-layered or intricate requests, limiting its utility in more demanding scenarios. These shortcomings have allowed competitors like Google Assistant and Amazon Alexa to dominate the market, offering more robust and versatile solutions. As user expectations for digital assistants continue to rise, Apple has recognized the need for a new approach to keep Siri relevant. Partnering with Google: A Strategic Shift To address these challenges, Apple has entered into a multi-year partnership with Google to integrate the Gemini AI model into Siri. This collaboration marks a strategic shift for Apple, allowing the company to use Google's expertise in artificial intelligence while maintaining control over Siri's design and user experience. A key aspect of this partnership is Apple's unwavering commitment to privacy. All Siri requests will be processed on Apple's private cloud infrastructure, making sure that your data remains secure and inaccessible to external entities. This approach aligns with Apple's broader emphasis on user privacy as a core value. However, this collaboration comes with a significant financial commitment. Apple is reportedly paying Google $1 billion annually for access to Gemini's innovative technology. Despite the cost, this investment underscores Apple's determination to close the gap between Siri and its competitors. Why Google's Gemini and Not OpenAI? Apple's decision to partner with Google rather than OpenAI reflects both strategic and practical considerations. Gemini is widely regarded as one of the most advanced AI platforms available today, excelling in areas such as natural language processing, contextual understanding, and conversational AI. OpenAI, on the other hand, has shifted its focus toward developing its own hardware and software ecosystems, reportedly declining further collaboration with Apple. This left Google as the most logical partner to help Apple enhance Siri's capabilities. By choosing Gemini, Apple gains access to a proven AI model that can address Siri's weaknesses and deliver a more competitive user experience. What's Next for Siri? 
The integration of Google's Gemini AI into Siri will be implemented in a phased rollout, ensuring a smooth transition and allowing users to gradually experience the improvements. Key milestones include: * April 2026: The release of iOS 26.4 will introduce foundational Gemini-powered features, such as enhanced natural language understanding and improved contextual awareness. * September 2026: With the launch of iOS 27, Siri will gain more advanced capabilities, including conversational AI, document summarization tools, and enhanced task management features. In the longer term, Apple may rely on Google's servers for advanced computing power, further expanding Siri's functionality and responsiveness. This phased approach ensures that users can adapt to the changes while benefiting from incremental improvements. Apple's Dual AI Strategy While the partnership with Google represents a significant step forward, Apple is not abandoning its in-house AI development efforts. The company continues to invest heavily in proprietary AI technologies, aiming to achieve long-term independence and innovation. For now, Gemini serves as a bridge, allowing Apple to deliver a more competitive Siri experience while its internal AI models mature. This dual strategy allows Apple to balance immediate improvements with its broader vision for the future of artificial intelligence. What This Means for You For users, the integration of Gemini into Siri promises a smarter, more intuitive digital assistant capable of handling a wide range of tasks. Imagine being able to: * Summarize lengthy documents: Siri could condense complex information into concise, actionable insights. * Assist with technical projects: From coding to troubleshooting, Siri could provide valuable support for professional tasks. * Plan intricate schedules: Siri could seamlessly coordinate multiple events, deadlines, and priorities. These enhancements will transform Siri from a basic assistant into a sophisticated digital companion, making it an indispensable tool for productivity and convenience. Strategic Implications for the Tech Industry Apple's collaboration with Google highlights a pragmatic approach to innovation, demonstrating the value of using external expertise while staying true to core principles like user privacy. This partnership also underscores the complex relationship between Apple and Google -- two fierce competitors that occasionally collaborate to achieve mutual benefits. By integrating Gemini, Apple not only addresses Siri's weaknesses but also strengthens its position in the rapidly evolving field of conversational AI. This move signals a broader trend in the tech industry, where companies are increasingly prioritizing user experience and practical functionality over rigid independence. The Future of Digital Assistants The integration of Google's Gemini AI into Siri represents a turning point in Apple's approach to artificial intelligence. By embracing innovative technology and prioritizing user needs, Apple is transforming Siri into a more capable and responsive assistant. For users, this evolution means a Siri that is better equipped to meet the demands of modern life, from managing complex tasks to providing insightful answers. As this partnership unfolds, it will be fascinating to see how Apple's AI journey shapes the future of digital assistants and redefines the way you interact with technology. Source & Image Credit: ZONEofTECH
[40]
Apple To Unveil The Revamped, Gemini-Backed Siri In February 2026, Scales Back 'World Knowledge Answers' And The Safari AI Overhaul
Apple might unveil the much-anticipated, Gemini-backed Siri as soon as next month, as per the tidbits gleaned from the latest iteration of Mark Gurman's Power On newsletter. Mark Gurman: Apple might reveal the Gemini-backed Siri by late February 2026 We noted in a dedicated post recently that Apple has formally selected Google's Gemini to power the next generation of its on-device Foundation Models. The tie-up will also allow Apple to launch a revamped version of Siri, likely with the upcoming iOS 26.4 update, bringing the much-delayed in-app actions, personal context awareness, and on-screen awareness to its bespoke voice assistant, enabling a wide variety of agentic actions across apps, based on personal data and on-screen content. To do so, Apple is planning to deploy a gigantic 1.2-trillion-parameter custom Gemini AI model on its cloud servers to power AI features under the ambit of its Private Apple Intelligence - where relatively simple AI tasks are performed by using on-device models and the computational resources of the device itself, while the more complex tasks are offloaded to Apple's private cloud servers using encrypted and stateless data for subsequent inference. Now, Bloomberg's Mark Gurman has declared that Apple might unveil the revamped, Gemini-backed Siri as soon as next month: "Today, Apple appears to be less than a month away from unveiling the results of this partnership. The company has been planning an announcement of the new Siri in the second half of February, when it will give demonstrations of the functionality." According to Gurman, the unveiling might take the form of a grand event or a smaller media briefing. The upcoming Apple Creator Studio briefing might provide one such avenue to Apple. Of course, the revamped Siri is expected to ship with the iOS 26.4 update, which is expected to enter its beta stage in February and roll out publicly in March or early April 2026. Interestingly, the Apple Foundation Models that now run on the 1.2-trillion-parameter custom Gemini LLM - which is hosted on private Apple Cloud servers - have been dubbed the Foundation Models version 10, making it seem as if the underlying technology is entirely Apple's. Even so, with the iOS 27 update, Apple is planning to launch a dedicated Siri chatbot that will run on Google's own TPUs and cloud infrastructure, possibly leased by Apple. According to Gurman, the Siri chatbot will be baked into Apple's software rather than debuting as a standalone app, allowing it to search the web, generate content, including images, provide coding assistance, summarize and analyze information, as well as upload files. It will be able to use personal data to complete tasks and sport a substantially improved search feature. Apple is also designing a feature that will let the Siri chatbot view open windows and on-screen content, as well as adjust device features and settings. In his latest Power On newsletter, Mark Gurman notes that iOS 27 will arrive in beta form this summer. The chatbot Siri will reportedly leverage a much more advanced version of Google's Gemini model, known internally as Apple Foundation Models version 11. Gurman adds that "the model is expected to be competitive with Gemini 3 and significantly more capable than one supporting the iOS 26.4 Siri." 
Meanwhile, Apple has scaled back some of its other ambitious AI-related projects, such as World Knowledge Answers - which aimed to compete with the likes of ChatGPT and Perplexity by providing accurate and snappy answers to general queries - and an AI-driven overhaul of the Safari browser, which was to introduce features such as "assessing the trustworthiness of documents and data, and cross-referencing information across multiple sources." Similarly, its planned overhaul of the Apple Health app is also in flux. Finally, Gurman notes that while Apple continues to develop its own on-device AI models, its focus has clearly shifted towards the more powerful Gemini models that are being deployed on the cloud. As such, "Apple plans to deploy higher-performance, in-house servers next year to support those efforts."
[41]
Apple Could Turn Siri Into an AI Chatbot to Rival OpenAI, Google
* The advanced Siri is said to be released before that * Apple's Siri chatbot will reportedly be powered by Gemini * The new Siri is said to be integrated across all of Apple's OS Apple is reportedly planning to turn its voice assistant Siri into an artificial intelligence (AI) chatbot. As per the report, the Cupertino-based tech giant wants to enter the generative AI chatbot race and compete with companies such as OpenAI, Google, Anthropic, and xAI. The chatbot transformation is not just a redesign process; the company will reportedly also add new features and capabilities, making it a far more capable device assistant than what Siri is today. This version is said to be released in the second half of the year. Apple Reportedly Wants to Turn Siri Into an AI Chatbot According to a Bloomberg report, the tech giant is currently working on a project, codenamed Campos, which aims to turn Siri into an AI chatbot. Citing unnamed people familiar with the matter, the publication claimed that the chatbot will be integrated across iPhone, iPad, and Mac devices via respective operating systems (OS). Even after this transformation, Siri will reportedly continue to perform all its current duties and can be activated via the same commands and controls. However, it is said that the chatbot version of Siri will be able to take on more tasks as well. Some of the claimed features include having back-and-forth conversations, generating images, and analysing on-screen content. It is also said to get better at web search functions. It will reportedly support both voice and text commands. Another reported capability includes analysing on-device information and providing answers based on natural-language queries. The publication says it will be possible to give Siri a vague description of an image, and it will be able to scan the Photos app to find the relevant result. This will reportedly be powered by an AI model provided by Google, which is said to be similar to Gemini 3 in terms of performance. Notably, this version of Siri will reportedly not be available before iOS 27 and the adjacent updates for other devices. Meanwhile, the advanced Siri with AI capabilities is expected to arrive this spring with the iOS 26.4 update. However, it is interesting that the company is building an AI chatbot after Craig Federighi, Apple's Senior Vice President of Software Engineering, had said last year that it would not build a "bolt-on chatbot." The executive had highlighted that a chatbot did not align with the company's vision of AI technology.
[42]
Apple reportedly replacing Siri interface with actual chatbot experience for iOS 27
Apple reportedly plans to make a major pivot with Siri in iOS 27. According to Mark Gurman at Bloomberg, Apple is developing its first chatbot with a version of Siri that will replace the existing experience. From Bloomberg's latest report, detailing a major directional shift for Apple's AI efforts: Apple Inc. plans to revamp Siri later this year by turning the digital assistant into the company's first artificial intelligence chatbot, thrusting the iPhone maker into a generative AI race dominated by OpenAI and Google. The chatbot -- code-named Campos -- will be embedded deeply into the iPhone, iPad and Mac operating systems and replace the current Siri interface, according to people familiar with the plan. Users will be able to summon the new service the same way they open Siri now, by speaking the "Siri" command or holding down the side button on their iPhone or iPad. The key distinction between Siri as it has existed since 2011 and this newly reported version is that it will adopt the chatbot style of interaction popularized by ChatGPT. The unannounced Siri overhaul will reportedly be revealed at WWDC in June as the flagship feature for iOS 27 and macOS 27. Its release is expected in September when Apple typically ships major software updates. While Apple plans to release an improved version of Siri and Apple Intelligence this spring, that version will use the existing Siri interface. The big difference is that Google's Gemini models will power the intelligence. With the bigger update planned for iOS 27, the iOS 26 upgrade to Siri and Apple Intelligence sounds more like the first step to a long overdue modernization. Gurman reports that the major Siri overhaul will "allow users to search the web for information, create content, generate images, summarize information and analyze uploaded files" while using "personal data to complete tasks, being able to more easily locate specific files, songs, calendar events and text messages." Apple doesn't grant existing chatbots this level of system access. Gurman shares additional details of how Apple's OS will integrate with the "new new" Siri: More significantly, Siri will be integrated into all of the company's core apps, including ones for mail, music, podcasts, TV, Xcode programming software and photos. That will allow users to do much more with just their voice. For instance, they could ask Siri to find a photo based on a description of its contents and edit it with specific preferences -- like cropping and color changes. Or a user could ask Siri within the email app to write a message to a friend about upcoming calendar plans. People are already familiar with conversational interactions with AI, and Bloomberg says the bigger update to Siri will support both text and voice. Siri already uses these input methods, but there's no real continuity between sessions. Publicly, Apple has dismissed making a "bolt-on chatbot on the side" as an interface for Siri and Apple Intelligence. For that reason, today's report will likely be a relief for iPhone users disappointed by Apple's slow AI adoption. You can read the report in full here.
[43]
Apple to Turn Siri Into AI Chatbot Powered by Google's Gemini | PYMNTS.com
The chatbot will be added to the operating systems of the iPhone, iPad and Mac; will be integrated into core apps such as mail, music and photos; and will be activated by the user saying "Siri" or holding down the side button on the smartphone or tablet, Bloomberg reported Wednesday (Jan. 21), citing unnamed sources. The new Siri will be able to search the web, create content, generate images and analyze files uploaded by the user, according to the report. The custom AI model powering the AI chatbot will be provided by Google, while the user interface will be designed by Apple, the report said. Before launching the new chatbot version of Siri, Apple will release an updated version of the current digital assistant, per the report. Apple did not immediately reply to PYMNTS' request for comment. Apple and Google announced Jan. 12 that they had formed a partnership and that the next iteration of Apple's Foundation Models will be based on Google's Gemini and cloud tech. Those models will help power new features for Apple Intelligence -- the company's AI system -- including a more personalized version of its AI assistant, the companies said. "After careful evaluation, Apple determined that Google's AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users," their announcement said. "Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards." It was reported in June that Apple was considering using AI models from another company, rather than its own in-house models, to power Siri. That report said adopting third-party AI models could allow Apple to offer Siri features that would be competitive with the AI assistants available on Android smartphones. Earlier in June, it was reported that Apple aimed to bring an AI-powered upgrade of Siri to market in spring 2026, after facing delays and failing to meet its original goal of fall 2024.
[44]
How iOS 26.4 Finally Fixes the iPhone's Biggest Flaws
Apple is set to transform its virtual assistant, Siri, with the upcoming releases of iOS 26.4 and iOS 27. These updates aim to address long-standing user concerns by introducing advanced artificial intelligence (AI) features, enhanced personalization, and seamless integration across Apple's ecosystem. With these advancements, Siri is positioned to become a more competitive and capable alternative to AI-driven assistants like ChatGPT and Google Gemini. iOS 26.4: Enhanced Personalization and Ecosystem Integration Scheduled for release in late March 2026, iOS 26.4 represents a pivotal step in Apple's strategy to redefine Siri's capabilities. Central to this update is the introduction of Apple Intelligence, a proprietary AI framework designed to make Siri more intuitive and responsive. By using innovative natural language processing (NLP) and advanced personalization algorithms, Siri will be able to handle increasingly complex tasks, such as: * Creating reminders based on your calendar events and location. * Generating playlists tailored to your unique listening habits. * Providing context-aware suggestions within apps to streamline your workflow. A key feature of iOS 26.4 is its improved integration with third-party applications. This enhancement allows Siri to perform tasks such as booking appointments, sending messages, or analyzing data directly within supported apps. Additionally, the update emphasizes secure on-device data processing, making sure that your personal information remains private while delivering faster and more accurate responses. The update also introduces performance optimizations, making sure that Siri operates efficiently without compromising battery life or device performance. These improvements reflect Apple's commitment to creating a virtual assistant that is both powerful and user-friendly. iOS 27: Siri's Evolution into a Conversational Chatbot Building on the advancements of iOS 26.4, iOS 27 is set to elevate Siri's functionality to unprecedented levels. Expected to debut in beta at WWDC 2026, this update will transform Siri into a chatbot-like assistant capable of engaging in dynamic, context-aware conversations. This evolution aims to make Siri more intuitive, versatile, and indispensable in daily life. The iOS 27 update introduces several new features, including: * Analyzing on-screen content to provide relevant and actionable suggestions. * Automatically adjusting device settings based on your preferences and usage patterns. * Accessing app data, such as Photos, Messages, and Music, to deliver personalized recommendations. For instance, Siri could suggest edits to a photo you are viewing, recommend a playlist based on your recent listening activity, or provide tailored responses based on your ongoing conversations. Cross-device synchronization ensures that Siri delivers a consistent and seamless experience across your iPhone, iPad, and Mac, further enhancing its utility within the Apple ecosystem. In addition to these features, iOS 27 focuses on optimizing performance and battery efficiency, making sure that the enhanced capabilities do not compromise the overall user experience. This balance between innovation and practicality underscores Apple's dedication to creating a virtual assistant that adapts to your needs without adding unnecessary complexity. Apple's Commitment to AI-Driven Assistance The updates in iOS 26.4 and iOS 27 reflect Apple's broader vision of positioning Siri as a leader in the AI-driven virtual assistant market. 
By integrating advanced AI technologies and prioritizing user-centric design, Apple aims to address common frustrations with Siri's current limitations while setting a new standard for virtual assistants. Apple's focus on personalization and secure data processing highlights its commitment to creating a virtual assistant that not only meets but exceeds user expectations. The emphasis on cross-device synchronization and seamless integration across the Apple ecosystem further reinforces Siri's role as a central component of the user experience. Looking Ahead: The Future of Siri The iOS 26.4 update is currently in beta testing and is expected to be publicly released by late March 2026. Meanwhile, the beta version of iOS 27, featuring the revamped Siri chatbot, is anticipated to launch in June 2026 following its announcement at WWDC. These updates represent a significant leap forward for Siri, paving the way for a more intelligent, responsive, and user-focused virtual assistant. * iOS 26.4: Introduces enhanced personalization, improved third-party app integration, and secure on-device data analysis. Expected release: March 2026. * iOS 27: Features a chatbot-like Siri with advanced AI capabilities, cross-device synchronization, and optimized performance. Expected beta release: June 2026. By embracing advanced AI technologies and focusing on delivering a seamless user experience, Apple is not only redefining Siri but also reshaping how users interact with their devices. These updates mark the beginning of a new era for virtual assistants, where intelligence, personalization, and integration take center stage, offering a glimpse into the future of digital interaction. Source & Image Credit: iReviews
[45]
Siri 2.0 vs Siri 3.0: The iOS 27 Upgrade That Changes Everything
Apple is poised to reshape the landscape of AI assistants with the upcoming releases of Siri 2.0 and Siri 3.0 in 2026. These updates represent a significant evolution, transforming Siri into a more intelligent, versatile, and competitive assistant. By integrating advanced chatbot technology and using innovative AI models, Apple aims to establish Siri as a prominent player in the rapidly expanding generative AI market. This strategic move underscores Apple's commitment to innovation and its ambition to redefine the role of AI in everyday life. The video below from iDeviceHelp gives us more details on what Apple has planned. Siri 2.0: Smarter, Faster, and More Capable Scheduled for release with the iOS 26.4 update in spring 2026, Siri 2.0 marks the first phase of Apple's ambitious AI roadmap. This update focuses on enhancing Siri's ability to handle complex tasks, deliver context-aware responses, and provide a more seamless user experience. By using Google Gemini Foundation models hosted on Apple's servers, Siri 2.0 introduces a range of improvements designed to streamline daily interactions and increase productivity. * Expanded Knowledge Base: Siri 2.0 will access a broader and more diverse range of information, allowing it to answer complex and nuanced questions with greater accuracy. * Improved Task Management: The assistant will excel at managing multi-step tasks, such as scheduling meetings, setting reminders, and organizing to-do lists, making it an indispensable tool for busy users. * Conversational Memory: Siri will retain information from past interactions, allowing for more personalized and contextually relevant conversations over time. * File Retrieval: Users will benefit from Siri's ability to locate and retrieve files stored across their devices, simplifying workflows and saving time. These enhancements reflect Apple's dedication to creating a more intuitive and proactive AI assistant. Siri 2.0 not only improves functionality but also sets the stage for the more advanced capabilities that Siri 3.0 will introduce later in the year. Siri 3.0: A Leap into Generative AI Arriving with iOS 27 in fall 2026, Siri 3.0 represents the next evolution in Apple's AI assistant technology. Internally codenamed "Campos," this update will transform Siri into a sophisticated chatbot, designed to rival leading AI platforms such as OpenAI's ChatGPT and Google's Gemini-powered assistants. Hosted on Google's cloud infrastructure, Siri 3.0 will use advanced computational power to deliver innovative generative AI capabilities. Key features of Siri 3.0 include: * Enhanced Conversational Abilities: Siri will engage in more natural, human-like dialogues, making it a more effective tool for both casual interactions and professional tasks. * Generative AI Integration: By using Google Gemini models, Siri will generate creative, contextually relevant responses tailored to individual user needs, enhancing its versatility and utility. * Competitive Positioning: With its advanced chatbot capabilities, Siri 3.0 will directly challenge leading AI platforms, signaling Apple's strategic shift toward generative AI and its commitment to staying at the forefront of technological innovation. This update will redefine Siri's role, transforming it from a basic voice assistant into a sophisticated conversational AI capable of meeting the diverse demands of modern users. 
Siri 3.0's generative AI capabilities will empower users with more dynamic and personalized interactions, making it a valuable tool in both personal and professional contexts. Strategic Rollout and Industry Implications Apple's rollout of Siri 2.0 and Siri 3.0 follows a carefully planned timeline designed to ensure a smooth transition and maximize user adoption. Siri 2.0 is expected to launch in March or April 2026, while Siri 3.0 will debut alongside iOS 27 in September 2026. A beta version of iOS 27 is anticipated in June 2026, coinciding with Apple's Worldwide Developers Conference (WWDC). This phased approach reflects Apple's methodical strategy for integrating advanced AI technologies into its ecosystem. The adoption of chatbot technology signals a significant shift in Apple's AI strategy. Historically cautious about embracing generative AI, Apple is now using partnerships with Google and making substantial investments in this area. This pivot positions Apple as a serious contender in the rapidly evolving AI market, aligning its offerings with the growing demand for intelligent, conversational AI solutions. By integrating generative AI into Siri, Apple is not only enhancing its ecosystem but also setting a new standard for what users can expect from AI assistants. The Future of AI Assistants The arrival of Siri 2.0 and Siri 3.0 marks a pivotal moment in Apple's AI journey. These updates will elevate Siri from a basic assistant to a sophisticated, chatbot-powered AI capable of addressing the complex needs of modern users. By integrating advanced generative AI technologies and expanding Siri's capabilities, Apple is poised to redefine the role of AI assistants in everyday life. As iOS 27 approaches, users can anticipate a more intelligent, intuitive, and indispensable Siri, offering smarter interactions and tailored solutions that enhance both productivity and convenience. Source & Image Credit: iDeviceHelp
[46]
Apple stock spikes after report of Siri revamp into chatbot By Investing.com
Investing.com -- Apple Inc. (NASDAQ:AAPL) stock spiked as much as 1.6% Wednesday afternoon following a Bloomberg News report that the company plans to transform Siri into a built-in chatbot, marking its entry into the generative AI race currently led by OpenAI and Google. According to the report, Apple is developing a chatbot codenamed "Campos" that will be deeply integrated into iPhone, iPad, and Mac operating systems, replacing the current Siri interface. Users will be able to access the new service using the same methods they currently use for Siri, including voice commands or holding down device buttons. The new chatbot will reportedly offer significantly enhanced capabilities compared to the current Siri, featuring a chat-like interface and conversational abilities similar to ChatGPT or Google's Gemini. This represents a major upgrade from Siri's current functionality, which lacks the back-and-forth interaction capabilities of competing AI assistants. Bloomberg's sources indicated that while a non-chatbot update to Siri is planned for iOS 26.4 in the coming months, the full chatbot capabilities are expected to be unveiled at Apple's Worldwide Developers Conference in June, with a release scheduled for September. The chatbot, which will support both voice and text input, is reportedly the primary new addition to Apple's upcoming operating systems, including iOS 27, iPadOS 27, and macOS 27, which are otherwise focused on performance improvements and bug fixes rather than major changes.
[47]
Forget ChatGPT: iOS 27 is Turning Siri into the Ultimate AI Assistant
Apple is poised to reshape the virtual assistant landscape with a significant upgrade to Siri, transforming it into a fully conversational AI chatbot. Internally referred to as "Campos," this enhanced version of Siri is set to debut with iOS 27. Designed to rival advanced AI platforms such as ChatGPT and Google Gemini, the new Siri will use innovative natural language processing (NLP) and advanced task management capabilities. This evolution aims to deliver a more intuitive, personalized, and seamless user experience, redefining how users interact with Apple's ecosystem. The video below from MacRumors gives us more details. A Shift from Commands to Conversations The transformation of Siri from a command-based assistant to a conversational AI chatbot marks a pivotal moment in Apple's AI journey. Unlike its current functionality, which relies on predefined commands, the new Siri will engage in dynamic, human-like conversations. It will be capable of reasoning, managing follow-up questions, and executing multi-step tasks with ease. For example, envision asking Siri to organize a weekend getaway. Instead of merely offering links or basic suggestions, it could book accommodations, recommend activities, and create a detailed itinerary -- all through a natural, flowing dialogue. This shift reflects the increasing demand for AI tools capable of handling complex, context-aware interactions, moving beyond the limitations of simple voice commands. By adopting this conversational approach, Siri aims to become a more versatile and indispensable tool for users. Advanced Features and Ecosystem Integration The updated Siri will introduce a range of advanced features designed to enhance productivity, creativity, and convenience. Among its key functionalities, Siri will be able to: * Conduct web searches and generate written content tailored to user needs. * Create images, summarize documents, and provide concise insights. * Analyze files, assist with coding tasks, and simplify technical workflows. What sets Siri apart from competitors is its deep integration with Apple's ecosystem. By using on-device data, Siri will deliver contextually relevant assistance across apps such as Photos, Mail, Messages, and Music. Whether you need help organizing your photo library, drafting an email, or curating a playlist, Siri will provide seamless support through both voice and text-based interactions. This integration ensures that Siri not only enhances individual tasks but also strengthens the overall functionality of Apple's interconnected suite of products. Privacy-Centric Personalization Apple's unwavering commitment to privacy remains a cornerstone of its AI strategy. While the new Siri will offer personalized experiences, its memory retention capabilities are expected to be carefully limited to safeguard user data. For instance, Siri might remember your favorite music genres or recurring reminders but will avoid storing sensitive information over extended periods. This privacy-first approach could set Siri apart from competitors, addressing growing concerns about data security while still delivering tailored interactions. By striking a balance between personalization and privacy, Apple aims to build trust with users and establish a benchmark for ethical AI practices. This focus on privacy could prove to be a decisive factor in attracting users who prioritize data security. 
Technical Challenges and Monetization Strategies The advanced capabilities of the new Siri will require substantial computational power, likely relying on server-side processing to handle complex tasks. This raises important questions about scalability and cost management. To address these challenges, Apple may introduce a tiered subscription model, similar to its existing services like iCloud or Apple One. While basic features are expected to remain free, premium functionalities -- such as in-depth document analysis, advanced task automation, or specialized creative tools -- might be offered as part of a paid subscription. This pricing strategy will play a critical role in determining the accessibility and appeal of Siri's new capabilities. For users seeking advanced features, the subscription model could provide a clear pathway to unlocking Siri's full potential. Competitive Positioning and Launch Timeline Apple is expected to unveil the new Siri at its Worldwide Developers Conference (WWDC) in June, with iOS 27 testing to commence shortly thereafter. The official release is anticipated within six months of the announcement. Positioned as a direct competitor to ChatGPT and Google Gemini, Siri's enhanced features could lead Apple to phase out third-party AI integrations on its devices. This strategic move underscores Apple's ambition to establish Siri as a leader in the AI space. By using its ecosystem, Apple aims to deliver a cohesive and superior user experience that sets its offering apart from competitors. However, the company will need to navigate the competitive landscape carefully, making sure that Siri's unique features and privacy-centric approach resonate with users. Transforming the Apple Ecosystem The introduction of a conversational AI chatbot represents a significant milestone for Apple's ecosystem. By embedding advanced AI capabilities into iOS 27, Apple seeks to elevate the functionality of its devices and services, fostering deeper user engagement and loyalty. For users, the new Siri promises to be more than just a virtual assistant. It will become a versatile tool for managing daily tasks, sparking creativity, and accessing information in a more intuitive way. This development has the potential to enhance the overall value of Apple's ecosystem, encouraging users to rely more heavily on its integrated suite of products. By seamlessly integrating AI into its devices, Apple aims to create a more cohesive and efficient user experience. Looking Ahead The launch of Siri's conversational AI chatbot signals a bold step forward in Apple's AI strategy. With its ability to handle complex conversations, execute advanced tasks, and integrate seamlessly with Apple's ecosystem, the new Siri has the potential to redefine the role of virtual assistants in daily life. However, its success will depend on Apple's ability to address key challenges, including privacy concerns, technical scalability, and the implementation of a sustainable monetization model. As competition in the AI space intensifies, Siri's evolution could set a new standard for user interaction, solidifying its position as a cornerstone of Apple's next-generation devices and services.
[48]
BREAKING Apple's NEW Siri ChatBot LEAKED
Apple is poised to make a significant impact on the AI landscape with its upcoming Siri chatbot, codenamed "Campos." This advanced AI assistant is set to debut alongside iOS 27, iPadOS 27, and macOS 27, offering a seamless blend of innovative technology and personalized functionality. Built on Google's Gemini AI and cloud infrastructure, the Siri chatbot is designed to compete with platforms like ChatGPT while adhering to Apple's core principles of user experience and privacy. By integrating advanced AI capabilities with Apple's ecosystem, the chatbot aims to redefine how users interact with their devices. The video below from Max Tech gives us more details on what Apple has planned for the new Siri Chatbot. Seamless Integration Across Apple Devices The Siri chatbot will be deeply embedded within Apple's ecosystem, ensuring smooth compatibility with iPhones, iPads, and Macs. This integration extends to essential Apple applications such as Mail, Music, and Photos, allowing you to interact with the AI assistant in environments you already use daily. By accessing personal data like files, messages, and calendar events, the chatbot delivers tailored assistance that evolves with your preferences and habits. For instance: - It can suggest playlists based on your listening history, creating a more enjoyable music experience. - It can organize your photos by recognizing faces, locations, and events, making it easier to relive cherished memories. This level of personalization ensures the chatbot becomes an indispensable part of your routine, adapting to your unique needs and preferences over time. Key Features and Functionalities The Siri chatbot introduces a variety of advanced features designed to enhance productivity, convenience, and user engagement. These include: * Hands-Free Navigation: Use voice commands to control your device, allowing effortless multitasking and AI-driven screen reading for enhanced accessibility. * Web Search and Content Creation: Perform web searches, draft emails or documents, and even generate images using AI-powered tools. * Document Analysis: Extract critical insights from files and documents, streamlining decision-making processes for both personal and professional tasks. * Conversational Support: Engage in natural, human-like interactions for casual conversation or emotional support, making the chatbot a versatile companion. * Memory Retention: Enjoy context-aware responses as the chatbot remembers past interactions, allowing for more meaningful and efficient communication. * Proactive Assistance: Receive timely reminders and updates based on real-world factors such as traffic conditions, deadlines, or upcoming events. These features are designed to simplify everyday tasks, whether you're managing your schedule, planning a trip, or creating professional documents. By combining practicality with innovation, the Siri chatbot aims to enhance your productivity and overall user experience. Technology Behind the Chatbot At the heart of the Siri chatbot lies Google's Gemini AI, which uses advanced Tensor Processing Units (TPUs) and cloud infrastructure to deliver robust performance and scalability. This collaboration allows Apple to focus on creating a user-centric experience while using the technical strengths of Google's AI technology. However, Apple is already laying the groundwork to reduce its reliance on third-party technologies. 
By investing in proprietary AI servers and chips, Apple aims to achieve tighter integration, improved performance, and enhanced privacy. This strategic shift aligns with Apple's long-term vision of building a secure and self-sufficient AI ecosystem, making sure that its products remain at the forefront of innovation while maintaining the company's commitment to user privacy. Subscription-Based Monetization The Siri chatbot is expected to follow a subscription-based pricing model, offering flexible plans to cater to different user needs: * Individual Plan: Priced at $10 per month, this plan is ideal for single users seeking personalized AI assistance. * Family Plan: Available for $20 per month, this option allows multiple users within a household to benefit from the chatbot's features. There is also speculation that the chatbot may be included in Apple One subscription bundles, potentially with a price adjustment. This approach mirrors Apple's existing strategy for premium services, providing advanced features at competitive rates while encouraging users to explore the broader Apple ecosystem. Future Innovations in AI Apple's ambitions in AI extend far beyond the Siri chatbot. One of the most highly anticipated projects is an AI-powered wearable pin, slated for release in 2027. This innovative device will feature dual cameras, speakers, and wireless charging capabilities, offering on-the-go AI assistance in a compact form factor. The wearable pin represents Apple's vision of integrating AI into everyday life, providing users with seamless access to advanced technology wherever they go. Additionally, Apple is making significant investments in developing custom AI servers and chips. By moving away from third-party technologies like Google Gemini, Apple aims to create a more secure and efficient AI infrastructure. This investment underscores Apple's commitment to innovation, privacy, and independence, making sure that its AI ecosystem remains both innovative and user-focused. Redefining the Future of AI Integration Apple's new Siri chatbot represents a pivotal step forward in the integration of AI technology into everyday life. By combining advanced features with personalized functionality, the chatbot is designed to enhance user experience across Apple's devices. Its seamless integration, robust capabilities, and commitment to privacy position Apple as a strong contender in the competitive AI space. As the Siri chatbot prepares to debut at the upcoming WWDC event, it signals a new era of interaction between users and technology. With its focus on convenience, innovation, and security, Apple is not only redefining personalized assistance but also setting the stage for future advancements in AI-powered solutions.
[49]
Apple may introduce AI powered Siri with iOS 26.4 next month: Eligible devices, features and more
Reports suggest Apple has integrated Google Gemini technology into Apple Intelligence, with an even more advanced, chatbot-style Siri planned for iOS 27. Apple appears to be finally preparing to catch up in the AI race with its much-anticipated Siri update. According to reports, the upcoming Apple update will improve Siri's personalisation and capabilities. The report added that Apple plans to showcase the new Siri experience in the second half of February. While it is unclear whether the company will hold a dedicated launch event or invite members of the media privately, internal plans indicate that the upgrade is nearing completion. The updated Siri is expected to be included in iOS 26.4, which will likely enter beta testing in February before being released to users in March or early April. The update is said to be limited to newer hardware, with support expected for iPhone 15 Pro models and up. According to reports, this version of Siri will focus on deeper personalisation, allowing the assistant to understand and respond to a user's personal data and screen activity. Apple has previously stated that Siri will be able to use context from apps like Mail and Messages to complete tasks more intelligently, such as checking travel details or calendar plans, without users having to spell out each step. Apple first teased this Siri vision at WWDC 2024, but the rollout was delayed due to internal issues. According to the report, development challenges prompted Apple to incorporate Google's Gemini technology into its own Apple Intelligence framework, resulting in faster progress. While Gemini's capabilities will power some aspects of the experience, the assistant will still run on Apple's AI architecture. Multiple reports also claim that iOS 27 will be Apple's next step towards AI. With the upcoming update, the iPhone maker is reportedly aiming to turn Siri into a full-fledged conversation chatbot with capabilities of sustained back-and-forth interactions, similar to ChatGPT or Google's Gemini. If the timeline holds, Apple's long-awaited Siri revamp could finally begin reaching users in the coming months. However, the exact timeline still remains under wraps.
[50]
Apple plans to rebuild Siri as a ChatGPT-style AI chatbot: Report
Apple is reportedly preparing a major overhaul of Siri that could turn its smart assistant into a chatbot similar to ChatGPT. According to Bloomberg's Mark Gurman, the new Siri experience is internally codenamed "Campos" and would support both voice and text inputs. This chatbot-style assistant is expected to be integrated into iOS 27 and may take centre stage at Apple's Worldwide Developers Conference (WWDC) in June. If true, this would mark one of the biggest changes to Siri since it first launched. This shift would also signal a change in Apple's public stance on chatbots. Previously, Apple's senior vice president Craig Federighi said he did not want Siri to become a chatbot. Instead, he said Apple's AI features should be "integrated so it's there within reach whenever you need it." However, growing pressure from rivals appears to be pushing Apple to rethink that approach. AI chatbots from companies like OpenAI, Google, and Anthropic have gained huge popularity over the past year. These chatbots can write, summarise, explain, and hold conversations in ways Siri has struggled to match. Apple's slower progress in this area has become more noticeable as users compare Siri to newer AI assistants. Apple may also be feeling added pressure from OpenAI's plans to move into hardware. That effort is being led by former Apple design chief Jony Ive. A successful AI-focused device from OpenAI could pose a serious challenge to Apple. We already know that Apple has been falling behind in the AI race. The company delayed the launch of a more personalised Siri several times and spent last year looking for an AI partner, testing technology from companies like OpenAI and Anthropic. In the end, Apple chose Google's Gemini, a partnership both companies confirmed earlier this month.
Apple is preparing to announce a major Siri upgrade in February, powered by Google Gemini AI models. The update will let Siri complete tasks by accessing personal data and on-screen content. An even bigger overhaul is planned for June at WWDC, where Apple will introduce a conversational, ChatGPT-style assistant codenamed Campos for iOS 27.
Apple is set to announce a significant Siri upgrade in the second half of February, marking the first major outcome of the recently confirmed Apple and Google partnership. According to Bloomberg's Mark Gurman, this update will leverage Google Gemini AI models to deliver capabilities that Apple first promised back in June 2024 [1]. The enhanced virtual assistant will be able to complete tasks by accessing user data and on-screen content, addressing long-standing criticisms about Siri's limited functionality compared to competitors.
This February announcement represents just the beginning of Apple's ambitious AI strategy. The company plans an even more substantial reveal at its Worldwide Developers Conference in June, where it will introduce a completely reimagined Siri designed to function as a full-fledged AI chatbot [1]. This version could run directly on Google's cloud infrastructure, signaling a dramatic shift in how Apple approaches AI development.

The upcoming Siri upgrade, internally codenamed Campos, will fundamentally change how users interact with Apple devices. Unlike the current command-based system, the new AI chatbot will support both voice and text inputs, enabling more conversational interactions similar to ChatGPT, OpenAI's popular assistant, and other generative AI platforms [2]. This represents a notable reversal from Apple senior vice president Craig Federighi's earlier stance against turning Siri into a chatbot.
The conversational AI will be deeply integrated into the iOS 27, iPadOS, and macOS 27 operating systems, rather than existing as a standalone app [4]. Users will be able to ask Siri to find photos based on content descriptions, edit them with specific preferences like cropping and color changes, or compose messages about upcoming calendar plans [5]. The assistant will also analyze open windows and suggest commands, control device features and settings, and integrate with core Apple apps including Photos, Mail, Apple Music, Podcasts, TV, and Xcode.
Apple's path to this announcement hasn't been straightforward. Earlier reports suggested the company struggled to get its AI strategy on track, with delays plaguing the rollout of a more personalized Siri [2]. The company spent considerable time evaluating potential AI partners, testing technology from competitors including OpenAI and Anthropic before ultimately selecting Google Gemini AI as its partner in a multi-year deal announced earlier this month.

This partnership comes amid mounting competitive pressure. OpenAI is planning to enter the hardware space under the leadership of former Apple design chief Jony Ive, potentially threatening Apple's ecosystem [2]. The departure of Apple's AI chief John Giannandrea further signals the company's strategic pivot [1]. By leaning on Gemini's capabilities while reshaping Siri's interface and behavior, Apple appears to be taking a more pragmatic approach to competing in the fast-moving AI landscape without building everything from scratch [3].
As Apple develops this AI chatbot, the company faces a critical challenge regarding how much the assistant can remember about users. Apple is reportedly planning to sharply limit this capability in the interest of privacy, distinguishing its approach from rivals like ChatGPT and Gemini, which retain extensive memory of past interactions to provide context in future conversations [5]. This privacy-first stance aligns with Apple's brand identity but could impact the assistant's effectiveness compared to competitors.

The chatbot version of Siri is expected to have capabilities that significantly surpass the AI personalization features planned for earlier releases [4]. It will reportedly be able to search the web for information, summarize content, generate images, create written material, and analyze uploaded files [5]. Apple plans to announce Campos at WWDC in June before launching it in September as the primary new addition to iOS 27, with other updates focused mostly on stability and bug fixes [4]. This marks Apple's most aggressive push yet into generative AI and an acknowledgment that incremental updates are no longer sufficient in an era defined by conversational AI [3].