iOS 26.4 Release Candidate brings Gemini-powered Siri and new Apple Intelligence features

    Apple has pushed out the iOS 26.4 Release Candidate, and this one is a bigger deal than a typical point release. The update brings Google's Gemini AI model directly into Siri, marking a notable shift in how Apple is handling its AI ambitions. Rather than relying entirely on its own models, Apple has opted to integrate Gemini as the engine behind Siri's more capable responses. The rollout to general users is expected between March 23 and March 25.

    What Gemini actually does inside Siri

    The integration is not a surface-level feature toggle. Gemini handles complex queries that Siri previously could not process reliably, including multi-step reasoning, longer context retention across a conversation, and richer natural language responses. Apple is routing certain query types to Gemini while keeping on-device processing for anything that involves personal data like messages, calendar events, or health information. This split approach is deliberate. Apple's pitch has always been that private data stays on the device, and that boundary appears to hold even with Gemini in the mix.

    For users, the experience is supposed to feel like a single assistant. You ask Siri something, and whether the answer comes from Apple's on-device model or Gemini running in the cloud, the response appears in the same interface. There is no mode switch or separate app. Apple has framed this as an extension of Apple Intelligence, the broader AI strategy it introduced with iOS 18.

    iOS 26.4 brings Gemini AI into Siri's core experience

    Other Apple Intelligence updates in this release

    Beyond the Gemini integration, the iOS 26.4 RC includes several updates under the Apple Intelligence umbrella. Writing Tools gets a refinement pass, with better tone adjustment suggestions when composing messages or emails. Image generation through the Image Playground feature is reportedly faster on recent hardware. Notification summaries, which Apple quietly dialed back in earlier versions after some factual errors, have also been updated with what Apple describes as improved accuracy.

    Siri's redesigned interface, which Apple started rolling out with iOS 18.1, gets further polish in this release. The conversational memory within a session now carries more context, so you can refer back to something you asked two or three exchanges ago without Siri losing the thread. That alone fixes one of the more frustrating day-to-day problems with the previous version.

    Release timing and who gets it first

    The Release Candidate build is what Apple distributes to developers and public beta testers before a public release. If no critical bugs surface, this build typically becomes the final public release without changes. The March 23 to March 25 window is tight, and Apple has not officially confirmed a specific date within that range. Devices compatible with Apple Intelligence, which means iPhone 15 Pro and later, will get the full feature set. Older devices will receive the update but without the AI-specific additions.

    WWDC 2026 announcement expected this week

    Separately, Apple is expected to announce the dates for WWDC 2026 sometime this week. The annual developer conference is anticipated for June, consistent with Apple's pattern over the past several years. WWDC is typically where Apple previews the next major iOS version, so an iOS 27 preview is likely on the agenda. Given how much Apple has invested in the Gemini partnership and Apple Intelligence this cycle, the WWDC keynote will probably spend considerable time on where that goes next.

    Why the Apple-Google AI deal is worth paying attention to

    Apple and Google are competitors in almost every meaningful sense. They run competing mobile platforms, competing browsers, and competing app ecosystems. The fact that Apple chose to integrate Gemini into Siri rather than push harder on its own large language model development says something about the state of Apple's AI capabilities relative to where Google and OpenAI are. Apple had a similar arrangement with ChatGPT, allowing users to opt into OpenAI-powered responses through Siri. Gemini appears to be a deeper integration than that earlier deal.

    From a user standpoint, the more immediate question is whether Siri with Gemini actually works better in practice. Early impressions from beta testers suggest the complex query handling is genuinely improved. Whether that holds up across a broader range of real-world questions is something that will become clearer once the public release lands later this week.


    Frequently Asked Questions

    Q: Which iPhone models will get the Gemini-powered Siri features in iOS 26.4?

    The full Apple Intelligence feature set, including the Gemini integration, is limited to iPhone 15 Pro and later. Older devices can install iOS 26.4 but will not receive the AI-specific additions.

    Q: Does Gemini inside Siri have access to your personal data like messages or health information?

    Apple has stated that personal data such as messages, calendar events, and health records is processed on-device and is not sent to Gemini. Only certain query types are routed to Gemini's cloud infrastructure.

    Q: Is iOS 26.4 different from a standard point release?

    Yes. While point releases typically address bugs or minor improvements, iOS 26.4 includes a significant AI architecture change by integrating Google's Gemini model into Siri alongside several Apple Intelligence updates.

    Q: When is WWDC 2026 expected to take place?

    Apple is anticipated to announce WWDC 2026 dates during the week of March 23, with the event itself expected sometime in June 2026.

    Q: How is the Gemini integration in Siri different from the earlier ChatGPT partnership?

    The ChatGPT arrangement allowed users to optionally hand off queries to OpenAI's model. The Gemini integration appears to be more deeply embedded into Siri's core response handling, rather than an opt-in extension.
