Siri, Redesigned: Apple Tests Multitasking, AI ‘Extensions’ and a Standalone App for iOS 27

What if you could tell Siri, in one sentence, to check the weather, add a calendar event and send a message — and it actually did all three without you repeating yourself?

That’s the kind of change Apple appears to be cooking up for iOS 27. Multiple reports, distilled from conversations with people familiar with Apple’s plans, paint a picture of Siri moving from a limited voice helper to a more capable, chat-like assistant with persistent context, multitasking skills and a new marketplace for third‑party AI agents.

One prompt, multiple tasks

Right now Siri mostly expects a single, discrete instruction at a time. The upgrade Apple is testing will let Siri parse and execute multiple requests delivered in a single prompt. Ask it to check the morning forecast, schedule a dentist appointment and shoot a quick message to a colleague — and it should handle them all without handing the baton back to you after each step.

Technically, that’s a big leap. It requires the assistant to maintain context across sub‑tasks, prioritize actions, and call different system services (Calendar, Messages, Weather) in sequence. Apple’s work on persistent context — first hinted at in iOS 18 with short-term memory between back‑to‑back requests — seems ready for a major extension.
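To make the idea concrete, here's a toy sketch of what chaining sub-tasks through a shared context might look like. This is purely illustrative — every name here is hypothetical, not an Apple API — but it captures the core requirement: decompose one prompt into ordered sub-tasks, run each against the right service, and keep the results in one context rather than handing control back to the user between steps.

```python
# Purely illustrative: these names are hypothetical, not Apple APIs.
from dataclasses import dataclass, field

@dataclass
class AssistantContext:
    """State that persists across all sub-tasks within one prompt."""
    results: list = field(default_factory=list)

# Stand-ins for calls into system services (Weather, Calendar, Messages).
def handle_weather(ctx, city):
    ctx.results.append(f"weather:{city}")

def handle_event(ctx, title):
    ctx.results.append(f"event:{title}")

def handle_message(ctx, to, body):
    ctx.results.append(f"message:{to}:{body}")

HANDLERS = {"weather": handle_weather, "event": handle_event, "message": handle_message}

def run_prompt(subtasks):
    """Execute decomposed sub-tasks in order, threading one shared context through."""
    ctx = AssistantContext()
    for kind, args in subtasks:
        HANDLERS[kind](ctx, *args)
    return ctx

# One utterance, three sub-tasks, no round-trips back to the user.
ctx = run_prompt([
    ("weather", ("Cupertino",)),
    ("event", ("Dentist",)),
    ("message", ("Alex", "Running late")),
])
```

The hard part Apple has to solve isn't this dispatch loop, of course — it's the natural-language decomposition that produces the sub-task list in the first place, and doing it reliably enough to act on.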

A Siri app, plus “Extensions” as an AI marketplace

Siri won’t just be an overlay or a waveform at the bottom of the screen. Apple is reportedly building a standalone Siri app that behaves more like ChatGPT, Gemini or Claude: you’ll be able to interact by voice or text, scroll through past conversations, and revisit previous exchanges. That alone changes Siri from an ephemeral helper to something you can treat like a chat history.


The other headline: “Extensions.” These are plug‑in style AI agents — third‑party chatbots you can install and run inside Siri. Apple plans a dedicated section in the App Store for these agents, effectively turning Extensions into a mini AI marketplace where developers can ship specialized assistants that work across the OS. If true, that’s both a convenience play for users and a potential revenue stream for Apple via App Store commissions.

Given Apple’s prior move to allow ChatGPT access from Siri in iOS 18.2, Extensions feels like a scaled-up, systemwide version of that concept.

Gemini, Apple models and the device question

Reports say Apple will lean on Google’s Gemini tech as well as its own on‑device AI to power the new capabilities. How those pieces fit together isn’t public, but expect a hybrid approach: heavy lifting in cloud models with local acceleration on newer chips for latency and privacy-sensitive tasks.
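Apple hasn't said how that split would work, but a plausible routing policy (this is an assumption, not anything Apple has published) is simple to state: keep privacy-sensitive requests on-device, keep small requests local for latency, and offload everything heavy to the cloud model.

```python
# Hypothetical routing sketch; Apple has not disclosed how Gemini and
# its own on-device models would actually be combined.
def route(privacy_sensitive: bool, estimated_tokens: int, local_limit: int = 512) -> str:
    """Decide where a request runs under the assumed hybrid policy."""
    if privacy_sensitive:
        # Privacy-sensitive work never leaves the device.
        return "on-device"
    # Small requests stay local for latency; heavy ones go to the cloud.
    return "on-device" if estimated_tokens <= local_limit else "cloud"
```

Real routing would weigh far more signals — battery, network, model availability, per-feature policy — but the basic latency/privacy trade-off looks like this.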

Not every iPhone may get the full upgrade. Early notes suggest some features could be limited to phones with Apple Intelligence support — likely newer silicon such as the iPhone 15 Pro family and onward. Apple will have to balance ambition with performance and battery life across a wide install base.

A redesigned interface and system hooks

Siri’s interface is getting attention too. Apple engineers are exploring a redesigned visual treatment — possibly leaning on Dynamic Island on supported devices — plus new buttons inside apps: a systemwide “Ask Siri” and a “Write with Siri” bar above the keyboard. Those little friction removals could make the assistant feel less like a novelty and more like a natural input method.

If that sounds familiar, it’s because OS makers have been quietly scattering AI entry points across their platforms. The difference here: Apple is trying to centralize the experience into one app while keeping tight integration with the rest of the system.

Why this matters (and what could go wrong)

Siri has long been seen as lagging behind competitors. Making it competent at multitasking and adding a real chat app would change the perception — potentially keeping users from hunting for alternatives on other platforms. An App Store for AI agents gives Apple both control and a new commercial lever.

But there are obvious challenges. Third‑party agents raise moderation and privacy questions. Apple’s App Store rules and the technical sandboxing required to let an external agent take actions (like sending messages or creating calendar events) will need careful design. There’s also the business angle: an “Extensions” marketplace implies commissions on agent sales or subscriptions, something Apple’s ecosystem has struggled to monetize without headaches.

Timeline and what to expect at WWDC

Apple is expected to preview iOS 27 at WWDC on June 8, where the company typically outlines major software directions. A developer beta should arrive soon after, with a public release in the fall if Apple follows its usual cadence. If you’ve followed the incremental tweaks in recent releases, this is a distinctly bigger step: compare the tinkering in the latest test builds to the kind of rework Apple did when it introduced the App Store itself or rethought notifications.

Notably, the current iOS 26.5 developer beta contained no big Siri changes even as other services and APIs shifted, which makes the iOS 27 window the likeliest home for these features. See our earlier coverage of the iOS 26.5 developer beta for the smaller tweaks Apple pushed out this cycle, and how the company has been laying groundwork across releases like iOS 26.4 to smooth system-level changes.

Expect a demo at WWDC that mixes show‑and‑tell: real world examples of chained commands, the new Siri app, and a peek at Extensions in the App Store. Whether those glimpses become polished, widely available features is another question — Apple tends to show grand visions and then refine them over months.

Siri’s makeover has been a long time coming. If Apple gets it right — a smart, context‑aware assistant that can safely call on trusted third‑party agents — your next voice command might do far more than you ask for. Or, at least, you’ll be able to ask for more in one breath and not be left hanging.

Tags: Siri, iOS 27, AI, Apple Intelligence, WWDC