
Unpacking Apple Intelligence AI Features

2025-06-11 · Brian Heater, Amanda Silberling · 9 minute read
Apple AI
Generative AI
Siri

If you have recently upgraded to a newer iPhone model, you have probably noticed Apple Intelligence making its appearance in some of your most frequently used apps, such as Messages, Mail, and Notes. Apple Intelligence, also conveniently abbreviated to AI, emerged in Apple’s ecosystem in October 2024. It is positioned to be a lasting feature as Apple vies with competitors like Google, OpenAI, Anthropic, and others in the race to develop superior AI tools.

Understanding Apple Intelligence

Apple Intelligence AI for the rest of us - Image Credits: Apple

Cupertino's marketing team has branded Apple Intelligence as “AI for the rest of us.” The platform is engineered to enhance existing features with capabilities already well established in generative AI, such as text and image generation. Like platforms such as ChatGPT and Google Gemini, Apple Intelligence was trained on large volumes of data. These systems use deep learning to form connections across media types, including text, images, video, and music.

The text generation feature, powered by a Large Language Model (LLM), is presented as Writing Tools. This functionality is available across numerous Apple applications, including Mail, Messages, Pages, and Notifications. It can be utilized to summarize lengthy texts, proofread documents, and even compose messages for you based on content and tone prompts.

Image generation has also been integrated, though perhaps a bit less seamlessly. Users can prompt Apple Intelligence to create custom emojis, known as Genmojis, in Apple's distinct style. Meanwhile, Image Playground is a standalone image generation application that uses prompts to produce visual content suitable for Messages, Keynote, or sharing on social media.

The Evolution of Siri and New AI Capabilities

Apple Intelligence also brings a long-awaited update to Siri. Although an early entrant in the smart assistant field, Siri had been largely overlooked in recent years. The new Siri is much more deeply integrated into Apple’s operating systems. For instance, instead of the familiar icon, users will now see a glowing light around the edge of their iPhone screen when Siri is active.

More significantly, the revamped Siri now functions across different apps. This means, for example, you can ask Siri to edit a photo and then directly insert it into a text message, providing a frictionless experience previously lacking. Onscreen awareness allows Siri to use the context of your current activity to deliver more relevant answers.


Leading up to WWDC 2025, many anticipated an even more advanced version of Siri. However, it appears we will have to wait a bit longer for that.

“As we’ve shared, we’re continuing our work to deliver the features that make Siri even more personal,” stated Apple SVP of Software Engineering Craig Federighi at WWDC 2025. “This work needed more time to reach our high-quality bar, and we look forward to sharing more about it in the coming year.” This yet-to-be-released, more personalized Siri is intended to understand “personal context,” such as your relationships and communication routines. According to a Bloomberg report, the in-development version of this new Siri is reportedly too error-ridden to be released, explaining the delay.

At WWDC 2025, Apple also introduced a new AI feature called Visual Intelligence, which assists with image searches for things you encounter while browsing. Additionally, Apple unveiled a Live Translation feature capable of real-time conversation translation in Messages, FaceTime, and Phone apps.

Visual Intelligence and Live Translation are anticipated to become available later in 2025, with the launch of iOS 26.

The Journey of Apple Intelligence Unveiled

Following months of speculation, Apple Intelligence was a central focus at WWDC 2024. The platform's announcement came after a flurry of generative AI news from companies like Google and OpenAI, raising concerns that the typically secretive tech giant might have fallen behind on the latest technological trend, as detailed in coverage of WWDC 2024.

Contrary to such speculation, Apple had a team developing what turned out to be a distinctly Apple-like approach to artificial intelligence. While there was still flair during the demonstrations—Apple always enjoys a good presentation—Apple Intelligence is ultimately a very pragmatic take on the AI category.

Apple Intelligence is not a standalone feature. Instead, it is designed to integrate into existing offerings. Although it is in large part a branding exercise, the large language model (LLM)-driven technology will primarily operate behind the scenes. For consumers, the technology will mostly manifest as new features within existing apps.

More details were shared during Apple’s iPhone 16 event in September 2024. At this event, Apple highlighted several AI-powered features coming to its devices, including translation on the Apple Watch Series 10, visual search on iPhones, and various enhancements to Siri’s capabilities. The initial set of Apple Intelligence features arrived at the end of October 2024, as part of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1.

These features launched first in U.S. English. Apple later added localizations for Australian, Canadian, New Zealand, South African, and U.K. English. Support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese is planned for 2025.

Who Can Access Apple Intelligence

iPhone 15 Pro Max showing Apple Intelligence compatibility - Image Credits: Darrell Etherington

The first wave of Apple Intelligence became available in October 2024 through iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 updates. These updates included integrated writing tools, image cleanup functionalities, article summaries, and a typing input for the redesigned Siri experience. A second wave of features was released as part of iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2. This list includes Genmoji, Image Playground, Visual Intelligence, Image Wand, and ChatGPT integration.

These offerings are free to use, provided you own one of the following compatible devices:

  • All iPhone 16 models
  • iPhone 15 Pro Max (A17 Pro)
  • iPhone 15 Pro (A17 Pro)
  • iPad Pro (M1 and later)
  • iPad Air (M1 and later)
  • iPad mini (A17 or later)
  • MacBook Air (M1 and later)
  • MacBook Pro (M1 and later)
  • iMac (M1 and later)
  • Mac mini (M1 and later)
  • Mac Studio (M1 Max and later)
  • Mac Pro (M2 Ultra)

Notably, only the Pro versions of the iPhone 15 have access, due to limitations of the standard model’s chipset. The entire iPhone 16 line, by contrast, is capable of running Apple Intelligence.

How Apple AI Works On Device and With the Cloud

Apple Intelligence Private Cloud Compute - Image Credits: Apple

When you pose a question to services like ChatGPT or Gemini, your query is sent to external servers to generate a response, which requires an internet connection. Apple, by contrast, has adopted a small-model, bespoke approach to training its AI.

The primary advantage of this approach is that many tasks become far less resource-intensive and can be performed directly on the device. Instead of relying on the comprehensive, all-encompassing models that power platforms like ChatGPT and Gemini, Apple has compiled specific datasets in-house for particular tasks, such as composing an email.

This on-device processing does not apply to all functions, however. More complex queries will utilize the new Private Cloud Compute offering. Apple now operates remote servers running on Apple Silicon, which it claims allows for the same level of privacy as its consumer devices. Whether an action is performed locally or via the cloud will be transparent to the user, unless their device is offline, in which case remote queries will result in an error.
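The dispatch behavior described above can be sketched in a few lines. This is a conceptual illustration only: the function names, the complexity threshold, and the error type are assumptions made for the sake of the sketch, not Apple's actual Private Cloud Compute implementation.

```python
class OfflineError(Exception):
    """Raised when a cloud-bound request cannot reach the network."""


def handle_request(task_complexity: float, device_online: bool,
                   complexity_threshold: float = 0.7) -> str:
    # Simple tasks run entirely on-device, keeping data local.
    if task_complexity < complexity_threshold:
        return "processed on-device"
    # Complex tasks go to remote Apple Silicon servers
    # (Private Cloud Compute) -- but only if the device is online.
    if device_online:
        return "processed via Private Cloud Compute"
    # Per the description above, an offline device cannot fall back
    # locally for complex queries; the request simply errors.
    raise OfflineError("remote query failed: device is offline")
```

The key design point the sketch captures is that the local/cloud split is invisible to the user except in the offline case, where a would-be remote query fails rather than degrading to an on-device answer.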

Apple Intelligence and Third Party App Integration

OpenAI and ChatGPT logos indicating partnership - Image Credits: Didem Mente/Anadolu Agency / Getty Images

Considerable attention was given to Apple’s pending partnership with OpenAI prior to the launch of Apple Intelligence. Ultimately, it transpired that the deal was less about powering Apple Intelligence itself and more about offering an alternative platform for tasks it is not primarily designed for. This is a tacit acknowledgment that building a small-model system has its limitations.

Apple Intelligence is free to use. Similarly, access to ChatGPT is also free. However, those with paid ChatGPT accounts will have access to premium features not available to free users, including unlimited queries.

ChatGPT integration, which debuted in iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2, serves two main roles: supplementing Siri’s knowledge base and expanding the existing Writing Tools options.

With the service enabled, certain questions will prompt the new Siri to ask for user approval before accessing ChatGPT. Examples of questions that might trigger this option include recipes and travel planning. Users can also directly instruct Siri to “ask ChatGPT.”

Compose is the other primary ChatGPT feature accessible through Apple Intelligence. Users can access it in any app that supports the new Writing Tools feature. Compose adds the ability to write content based on a prompt, complementing existing writing tools like Style and Summary.

It is known that Apple intends to partner with additional generative AI services. The company has strongly indicated that Google Gemini is next on that list.

Developer Opportunities with Apple AI Models

At WWDC 2025, as covered in live updates, Apple announced what it calls the Foundation Models framework. This framework will enable developers to tap into Apple's AI models while offline.

This development makes it more feasible for developers to build AI features into their third-party applications by leveraging Apple’s existing systems.

“For example, if you’re getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging,” Federighi stated at WWDC. “And because it happens using on-device models, this happens without cloud API costs […] We couldn’t be more excited about how developers can build on Apple intelligence to bring you new experiences that are smart, available when you’re offline, and that protect your privacy.”
