Apple plans to revamp Siri later this year by turning the digital assistant into the company's first artificial intelligence chatbot, a move that would position the iPhone maker more directly in a generative AI race currently dominated by OpenAI and Google. The overhaul, reportedly code-named "Campos," follows years in which Apple downplayed the need for a dedicated conversational interface on its devices.
The new chatbot will be embedded deeply into iOS 27 and iPadOS 27, both internally known as "Rave," as well as macOS 27, code-named "Fizz." According to Bloomberg, the update will replace the current Siri interface across Apple's platforms. Users will still summon the assistant through familiar methods such as voice commands or the side button, but the interaction will evolve into a fluid, back-and-forth conversation similar to those in ChatGPT and Google's Gemini, with support for both voice and typed input. The change follows years of criticism that Siri lagged behind rivals in conversational ability, while keeping the experience integrated into the operating system rather than offered as a standalone app.
Under the hood, the new system relies heavily on the multibillion-dollar partnership with Google that Apple finalized earlier this week. While the user interface will remain distinctly Apple-designed, the underlying intelligence will be powered by a custom model developed by the Google Gemini team, known internally as Apple Foundation Models version 11. The arrangement gives Apple access to a high-end model comparable to Gemini 3. By contrast, the upcoming iOS 26.4 Siri update will rely on Apple Foundation Models version 10, which operates at roughly 1.2 trillion parameters and runs on Apple's own infrastructure. In a notable departure from the company's traditional on-device emphasis, Apple and Google are reportedly discussing hosting the chatbot directly on Google servers running tensor processing units (TPUs), rather than relying solely on Apple's own Private Cloud Compute infrastructure. Bloomberg reports that Apple is paying Google roughly $1 billion annually for access to the models.
The upgraded Siri is designed to do more than chat. It will be able to analyze open windows and on-screen content to perform specific actions across the system. The report describes deep integration with core apps such as Mail, Music, Photos, Podcasts, TV, and Xcode, as well as system features such as phone calls, timers, and camera controls. For example, a user could ask Siri to find a specific photo based on a visual description and then apply edits, or to draft an email based on upcoming calendar events. These expanded capabilities could eventually allow the chatbot to replace Spotlight as the primary way users search for content and navigate their devices. The update will also include a feature dubbed "World Knowledge Answers," which offers summarized answers drawn from the web, complete with citations; Apple has been developing the capability since at least last year and plans to include it in both upcoming Siri releases.
Before the chatbot arrives in the fall, Apple still intends to ship a separate Siri update this spring with iOS 26.4. That version will introduce features promised last year, such as on-screen awareness and better handling of personal context, but it will retain the current non-conversational interface. Internally, Apple has also been testing the chatbot technology as a standalone Siri app similar to ChatGPT and Gemini, though the company does not plan to release that version publicly. The full chatbot transformation is targeted for an unveiling at the Worldwide Developers Conference in June, with a public release slated for September.
The move comes as conversational AI tools have become more central to modern operating systems. Apple executives had previously argued against sending users into a dedicated chat experience, preferring to weave AI invisibly into the system. However, the rapid growth of ChatGPT, which has surpassed 800 million weekly users, has increased the pressure on Apple to respond. OpenAI has also emerged as a more direct rival, recruiting engineering talent from Cupertino and developing new hardware with former Apple design chief Jony Ive. Analysts have said Apple's reliance on Google was driven by the urgent need to deliver a capable AI product after internal efforts fell behind.
Apple has been laying the groundwork for this transition for some time. Reports from last year revealed the company had built an internal ChatGPT-like app codenamed Veritas to test new language models. The push is now being led by software engineering chief Craig Federighi, who consolidated control over the company's AI strategy following the departure of AI chief John Giannandrea.
Despite its reliance on Google's technology, Apple is designing the system to remain flexible. The architecture allows underlying models to be swapped out over time, reducing long-term dependence on a single provider. The company has also reportedly tested Chinese AI models as part of efforts to eventually bring the chatbot to China, where Apple Intelligence is not currently available. One major issue under discussion is how much "memory" the assistant should retain about user interactions, a feature common among rival systems but one that conflicts with Apple's long-standing focus on data minimization.