
Apple Intelligence in 2026: What's Actually Shipping and What's Still Vaporware

Apple promised AI would transform the iPhone. A year and a half later, some features are genuinely great and others are embarrassingly behind. Here's the real scorecard.

By EgoistAI

When Apple announced Apple Intelligence at WWDC 2024, the presentation was vintage Apple: beautifully produced, deliberately paced, and carefully designed to make you feel like the future had arrived. Siri would finally understand context. Your iPhone would summarize everything. Writing tools would polish your prose. And all of it would happen on-device, privately, because Apple cares about your data. Standing ovation. Keynote over.

Then reality happened.

The initial rollout was staggered, delayed, and geographically restricted. Features trickled out across iOS 18.1, 18.2, 18.3, and 18.4. Some worked well. Some were half-baked. And Siri — the centerpiece of the whole pitch — remained stubbornly, infuriatingly mediocre for months after launch.

Now, in March 2026, with iOS 19 on the horizon and nearly two years of iteration, it’s time for an honest accounting. What has Apple Intelligence actually delivered? What’s still missing? And does Apple’s AI strategy make sense in a world where the competition isn’t standing still?

What Has Apple Intelligence Actually Delivered?

Let’s start with what works, because some of it genuinely works.

Writing Tools: The Quiet Win

Apple’s system-wide writing tools — available in Mail, Notes, Messages, and any text field — are the most consistently useful Apple Intelligence feature. Proofread catches real errors without being annoying. The tone adjustment (Professional, Friendly, Concise) genuinely reshapes text rather than just swapping a few words. And the summarize feature for long emails and articles is fast, accurate, and available everywhere.

This isn’t flashy. Nobody’s making TikToks about proofreading. But it’s the kind of ambient utility that justifies Apple’s approach to AI: deeply integrated, always available, and useful without requiring you to change how you work. If every Apple Intelligence feature worked this well, there’d be nothing to criticize.
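The "deeply integrated" point is concrete for developers, too: on iOS 18 and later, apps using the standard text controls get Writing Tools essentially for free, and can tune how much the system is allowed to do. A minimal sketch, using the public `UITextView` opt-in property (the view controller itself is illustrative, not from any real app):

```swift
import UIKit

// Sketch: opting a text view into system Writing Tools on iOS 18+.
// The writingToolsBehavior property and its cases are the public API;
// this bare-bones view controller exists only to show where it goes.
final class NoteViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        view.addSubview(textView)

        // .complete lets Writing Tools rewrite text in-line;
        // .limited confines results to the overlay panel;
        // .none opts this view out entirely.
        textView.writingToolsBehavior = .complete
    }
}
```

Because the feature rides on the standard text stack rather than a separate AI app, users get proofreading and tone adjustment in any app whose developer did nothing more than this.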

Notification Summaries: Useful but Imperfect

The notification summary feature — which condenses multiple notifications from the same app into a brief summary — ranges from genuinely helpful to comically wrong. It works well for email and news apps where summarizing content is straightforward. It struggles with Messages, where context and tone matter, and has produced some viral blunders where critical information was summarized away.

Apple has iteratively improved the accuracy since launch, and the current state is solid for most use cases. The key is that it fails gracefully — you can always tap to see the original notifications. It’s a convenience feature, not a critical one, and that calibration is appropriate.

Image Generation and Genmoji: Fun, Not Revolutionary

Image Playground and Genmoji — Apple’s on-device image generation features — deliver on their limited promise. They produce cute, stylized images and custom emoji that are fun for Messages and social sharing. The quality is deliberately constrained: Apple chose a cartoon/illustration style rather than photorealistic generation, which neatly sidesteps deepfake concerns and sets clear expectations.

The Image Playground integration into Messages, Notes, and other apps is smooth. The limitation is creative: the style is Apple-cute, and there’s no way to generate photorealistic images, detailed art, or anything edgy. This is by design, but it means serious creators won’t find much utility here.

Clean Up in Photos: Genuinely Impressive

The Clean Up tool in Photos — Apple’s version of Google’s Magic Eraser — works remarkably well. Point at an unwanted object in a photo, and it’s removed with impressive accuracy. The edge detection and inpainting quality are competitive with Google’s implementation, which has had a multi-year head start.

This is the kind of feature that demonstrates on-device AI at its best: instant, private, and useful in a way that doesn’t require any AI expertise to appreciate.

Where Has Apple Intelligence Fallen Short?

Siri: Better, But Still Behind

This is the big one. Siri was supposed to be the crown jewel of Apple Intelligence — a conversational AI assistant that understands context, maintains multi-turn conversations, takes actions across apps, and serves as the AI interface for the entire Apple ecosystem.

The reality: Siri has improved, but the gap between Apple’s AI assistant and ChatGPT, Claude, or Google Gemini remains wide. Siri can now handle some follow-up questions without losing context, can summarize on-screen content, and has a more natural conversational flow. But ask it anything that requires genuine reasoning, nuanced understanding, or complex multi-step planning, and the limitations become obvious.

Apple’s partnership with ChatGPT — where Siri can hand off complex queries to OpenAI’s model — is both a lifeline and an admission of the gap. When you ask Siri something it can’t handle and it asks, “Would you like me to use ChatGPT?”, the subtext is clear: Apple’s own models aren’t there yet.

The on-screen awareness features, where Siri can see what’s on your screen and take actions based on it, have rolled out more slowly than promised. Some app integrations work well — adding a calendar event from an email, sending content from one app to another. But the vision of “Siri, book a restaurant for Friday based on the recommendation in this article” remains more demo than reality for most use cases.

App Intents and Third-Party Integration: The Missing Piece

Apple Intelligence’s biggest structural weakness is the App Intents ecosystem. For Siri to be truly useful, it needs to understand and control third-party apps — order food through DoorDash, send money through Venmo, control smart home devices through various apps. This requires developers to build App Intents into their apps.

Adoption has been slow. Major apps have implemented basic intents, but the depth of integration is shallow. You can ask Siri to open apps and perform simple actions, but the complex, multi-app workflows that Apple demoed at WWDC remain rare. The chicken-and-egg problem is real: developers won’t invest in App Intents until users demand Siri integration, and users won’t demand it until the integrations work well enough to be useful.
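To make the developer-effort point concrete, here is roughly what exposing one action to Siri looks like with the App Intents framework. The protocol and result types are the real public API; the intent itself and its phrasing are a hypothetical example for a food-delivery app, not anyone’s actual implementation:

```swift
import AppIntents

// Hypothetical intent a delivery app might expose to Siri and Shortcuts.
// Conforming to AppIntent is the core of the App Intents framework;
// the app name and ordering logic here are invented for illustration.
struct ReorderLastMealIntent: AppIntent {
    static var title: LocalizedStringResource = "Reorder Last Meal"
    static var description = IntentDescription("Places your most recent order again.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would call its ordering backend here.
        return .result(dialog: "Your last order is on its way.")
    }
}
```

One intent like this is cheap to write; the shallow-integration problem is that a genuinely useful Siri needs dozens of them per app, each with parameters, entities, and error handling, and few developers have gone that deep.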

Private Cloud Compute: Technically Impressive, Practically Invisible

Apple’s Private Cloud Compute — the infrastructure that processes more complex AI tasks in the cloud while maintaining privacy through secure enclaves — is a genuine technical achievement. The security architecture, independently audited and verified, is impressive. Apple has delivered on its promise that cloud-processed queries are private and ephemeral.

The problem is that most users never notice. Private Cloud Compute is infrastructure, not a feature. It enables Apple Intelligence to handle more complex tasks than on-device processing alone would allow, but the end result is still the same features — writing tools, summaries, Siri responses — working slightly better than they would on-device only. The privacy benefit is real but invisible.

How Does Apple’s AI Strategy Compare to the Competition?

Apple’s fundamental bet is that AI should be invisible infrastructure, not a visible product. While Google, Microsoft, and OpenAI are building AI chatbots and agents that users interact with directly, Apple is embedding AI into existing workflows and interfaces. You don’t “use Apple Intelligence” — you use your iPhone, and Apple Intelligence makes it better.

This is a defensible strategy with a significant risk. The defense: most people don’t want another app to learn. They want their existing tools to work better. Apple Intelligence, when it works, delivers exactly this. The risk: if competitors’ AI agents become powerful enough to replace apps entirely, Apple’s app-centric model becomes a liability.

Google’s Gemini is already more capable than Siri as a conversational AI assistant, and it’s deeply integrated into Android. Google’s multimodal capabilities — understanding images, videos, and audio alongside text — are ahead of Apple’s. The gap is most visible in real-time assistance: Google’s AI can look through your camera and identify objects, provide live translation overlays, and answer questions about what it sees.

Microsoft’s Copilot strategy is less relevant for consumers but significant for enterprise. Apple has minimal presence in enterprise AI, and Microsoft’s integration of Copilot across Office 365 is creating an enterprise AI moat that Apple can’t easily cross.

OpenAI’s ChatGPT, through its Apple partnership, is simultaneously Apple’s secret weapon and its most dangerous dependency. The partnership gives Apple users access to frontier AI capabilities without Apple building them. But it also means that the most impressive AI feature on your iPhone was built by someone else, and Apple’s influence over its direction is limited.

What’s Coming at WWDC 2026?

Based on reporting from Bloomberg’s Mark Gurman and patterns in Apple’s developer documentation, WWDC 2026 (expected June 2026) will focus on three areas:

Siri with on-device large language model: Apple has been developing a more capable on-device model that should significantly improve Siri’s conversational and reasoning abilities. The goal is to reduce dependency on the ChatGPT fallback by making Siri’s native capabilities closer to competitive.

Enhanced App Intents with AI orchestration: A more powerful App Intents framework that lets AI coordinate across multiple apps without requiring as much developer effort. This would address the third-party integration gap.

Visual intelligence expansion: Deeper integration of the camera-based Visual Intelligence features introduced with iPhone 16, potentially including real-time translation, object identification, and augmented reality overlays powered by on-device AI.

The question is whether these updates close the gap or merely narrow it. Apple’s strategy only works if the features are good enough that users don’t feel the need to open ChatGPT or Google directly. Right now, that bar isn’t being met for complex tasks.

FAQ: Apple Intelligence in 2026

Which iPhones support Apple Intelligence?

iPhone 15 Pro, iPhone 15 Pro Max, and all iPhone 16 models and later. The requirement is the A17 Pro chip or newer, which provides the neural engine performance needed for on-device AI processing. Older iPhones — even the standard iPhone 15 — do not support Apple Intelligence.

Is Apple Intelligence available worldwide?

As of March 2026, Apple Intelligence is available in English (US, UK, Australia, Canada, and several other locales), with expanding support for French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Chinese. However, feature availability varies by language, with English receiving the most complete implementation.

Does Apple Intelligence send my data to the cloud?

Most Apple Intelligence features run entirely on-device. More complex tasks use Private Cloud Compute, which processes requests in secure enclaves that Apple cannot access. Requests forwarded to ChatGPT require explicit user permission each time. Apple’s privacy architecture for AI is arguably the most robust in the industry.

Can I turn off Apple Intelligence?

Yes. Apple Intelligence can be disabled entirely in Settings > Apple Intelligence & Siri. Individual features (writing tools, notification summaries, etc.) can also be toggled independently.

Is Apple Intelligence better than Google Gemini on Android?

Google Gemini is more capable as a conversational AI and offers more advanced multimodal features. Apple Intelligence is more deeply integrated into the operating system and offers better privacy protections. The “better” answer depends on whether you prioritize AI capability or system integration and privacy.

The Bottom Line

Apple Intelligence is not the revolution Apple pitched. It’s an evolution — a set of useful but incremental improvements that make iPhones, iPads, and Macs marginally better at everyday tasks. The writing tools are great. The photo features are solid. Notification summaries are fine. And Siri is… better.

The honest assessment: if you’re an iPhone user, Apple Intelligence features are welcome additions that you’ll use daily without thinking much about them. If you’re evaluating whether Apple Intelligence is a reason to switch from Android, the answer is no — Google’s AI capabilities are more advanced, and the gap is widening, not closing.

Apple’s best argument is privacy, and it’s a real one. No other company has built an AI infrastructure with comparable privacy protections. Whether that matters enough to offset the capability gap is a personal judgment call.

The next 12 months will be decisive. WWDC 2026 needs to deliver a step-function improvement in Siri’s capabilities, not another set of incremental updates. The AI race waits for nobody, and Apple’s famously patient approach to product development may be too patient for a market moving this fast.


Tags

apple · apple intelligence · siri · ios · on-device ai
