Apple partners with Google Gemini to power the new AI Siri in iOS 26.4

    Apple and Google have been competing in mobile for nearly two decades. So when Apple officially confirmed it is using Google's Gemini model to power a rebuilt version of Siri, it was not a quiet announcement. The new Siri is scheduled to arrive with iOS 26.4, which Apple expects to release in March 2026. It is being described as a complete rebuild, not an incremental update.

    Siri has had a rough few years. While ChatGPT and Google Assistant were improving at a visible pace, Siri was still struggling with basic multi-step requests and frequently misunderstanding context. Apple knew it had a problem. The partnership with Google is the most direct acknowledgment yet that building a competitive large language model from scratch, fast enough to matter, was not the path Apple chose.

    Apple's AI-powered Siri backed by Google Gemini arrives in iOS 26.4

    The model behind the new Siri

    Apple is running the new Siri on Google's Gemini model, which has 1.2 trillion parameters. For context, larger parameter counts generally allow a model to handle more complex reasoning and hold more context across a long conversation. Whether that theoretical capacity translates into a noticeably better daily assistant depends on how Apple has tuned the integration, but the raw model is among the most capable available outside of OpenAI's GPT-4 class.

    Apple is not sending your queries directly to Google's servers in a way that Google can read or store. The processing runs through Apple's Private Cloud Compute infrastructure. Apple introduced Private Cloud Compute in 2024 as a way to handle AI requests that are too complex for on-device processing while still keeping the data isolated from the cloud provider's own systems. In practice, that means Google supplies the model weights, but Apple controls the execution environment and claims that query data does not leave Apple's infrastructure in a readable form.

    On-screen awareness and cross-app integration

    Two specific capabilities are getting attention in early coverage of the new Siri. The first is on-screen awareness. The assistant will be able to read what is currently displayed on your screen and act on it. If you are looking at a restaurant's website and ask Siri to make a reservation, it should be able to identify the relevant details on screen rather than requiring you to repeat them. That sounds basic, but it is something the previous Siri consistently failed at.

    The second is cross-app integration. Siri will be able to pull information across multiple apps to complete a task. Ask it to find the email from your dentist and add the appointment to your calendar, and it should handle both steps without dropping context between them. Apple demoed similar functionality at WWDC 2024 under the Apple Intelligence banner, but user reports suggested the rollout was limited and inconsistent. The Gemini-backed version is positioned as the more complete implementation.
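To make the context-carrying requirement concrete, here is a minimal sketch of the "find the dentist email, add it to the calendar" flow described above. Every type and function in it is hypothetical: in a real iOS integration, apps expose these actions to Siri through Apple's App Intents framework, not through free functions like these. The point of the sketch is only the shape of the task, i.e. that the output of the first step must feed the second without the user restating anything.

```swift
// Hypothetical sketch of the two-step "find email, add to calendar" flow.
// None of these types or functions are real Apple APIs; on iOS the
// equivalent pieces would be exposed to Siri via the App Intents framework.

struct Appointment {
    let title: String
    let date: String
}

// Stand-in for a Mail search the assistant can invoke (hypothetical).
func findAppointmentEmail(from sender: String) -> Appointment? {
    // A real implementation would query the Mail app's indexed entities.
    return Appointment(title: "Dental cleaning with \(sender)",
                       date: "2026-03-12 09:00")
}

// Stand-in for a Calendar "create event" action (hypothetical).
func addToCalendar(_ appointment: Appointment) -> String {
    return "Added \"\(appointment.title)\" on \(appointment.date)"
}

// The assistant's job is to carry the result of step one into step two
// without asking the user to repeat any details.
if let appointment = findAppointmentEmail(from: "your dentist") {
    print(addToCalendar(appointment))
}
```

The failure mode of the old Siri, as described in user reports, was dropping the `Appointment` between the two calls and asking the user to supply the details again.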

    Why Google, and what Google gets from this

    Apple already has a deal with OpenAI that brought ChatGPT into Siri as an optional extension. The Gemini partnership appears to be a deeper integration at the core model level rather than a bolt-on option. Apple likely evaluated multiple models and chose Gemini based on performance benchmarks, the terms of the Private Cloud Compute arrangement, and possibly pricing. Google has not disclosed the financial terms.

    For Google, having Gemini power Siri on hundreds of millions of iPhones is a significant distribution win. Even if iPhone users never interact with the Google brand directly through this integration, Gemini is doing the inference work on one of the world's largest installed bases of mobile devices. That drives usage volume, which could feed back into model improvement through real-world data patterns. Whether Apple's privacy architecture allows any of that signal to flow back to Google, however, remains unclear from public disclosures.

    Privacy claims and what they actually mean

    Apple has been specific about the privacy architecture in its communications. Private Cloud Compute is designed so that the cloud nodes handling inference cannot retain data between requests, cannot be accessed by Apple employees, and are subject to independent audits. Apple published the security review framework for Private Cloud Compute in late 2024. The claim is that even Apple cannot read what you asked Siri.

    Whether users actually trust that claim is a different question. Routing queries through a system that ultimately depends on Google's model, even if the execution is isolated, will make some users uncomfortable. Apple is betting that its brand reputation for privacy, combined with the published technical architecture, is enough to keep users from opting out or switching platforms. Given that iPhone users have historically shown high tolerance for Apple's data handling compared to Google's, that bet is probably reasonable.

    What changes for users on iOS 26.4

    For the average iPhone user, the most noticeable difference will be in how Siri handles requests that previously required several steps or failed entirely. Tasks like summarizing a long email thread, drafting a reply based on context from a previous message, or completing a purchase flow inside a third-party app are the scenarios Apple has focused on in internal testing communications.

    The update will require iOS 26.4, which Apple has not yet released to the public. A developer beta was made available in February 2026. The full public release is targeted for March 2026, though Apple has not committed to a specific date within that month. Devices that cannot run iOS 26 will not receive the updated Siri.


    Frequently Asked Questions

    Q: Will the Google Gemini-powered Siri work on older iPhones?

    The updated Siri requires iOS 26.4, which is only available on devices compatible with iOS 26. iPhones that cannot run iOS 26 will not receive the new Siri functionality.

    Q: Does Apple share your Siri queries with Google under this arrangement?

    Apple says queries are processed through its Private Cloud Compute infrastructure, which is designed to prevent Google from accessing or storing the data. Apple published its Private Cloud Compute security framework in 2024 for independent review, though the specific data flow terms of the Google deal have not been fully disclosed.

    Q: How is the Gemini integration different from the existing ChatGPT option in Siri?

    The ChatGPT integration in current versions of Siri is an optional extension users can invoke for specific queries. The Gemini integration in iOS 26.4 is reported to be a core model powering Siri's main intelligence layer, not an optional add-on.

    Q: When exactly will iOS 26.4 be released to the public?

    Apple has targeted March 2026 for the public release of iOS 26.4. A developer beta was made available in February 2026. Apple has not confirmed a specific date within March.

    Q: What does on-screen awareness actually let Siri do?

    On-screen awareness allows Siri to read the content currently displayed on your screen and use that information when responding to a request. For example, if you are viewing a business's contact page, Siri can reference those details without you having to repeat them manually.
