Apple Intelligence Features: What Your iPhone Can Do With On-Device AI

Apple Intelligence features represent Apple’s integrated AI system built directly into iOS, iPadOS, and macOS. Unlike cloud-based AI services, Apple Intelligence runs most processing on your device using the Neural Engine, keeping your personal data private while providing writing assistance, image generation, notification summaries, and a significantly upgraded Siri. The system launched with iOS 18.1 and has expanded through subsequent updates.

Apple Intelligence is not a single feature but a framework that touches nearly every built-in app on your iPhone. It rewrites your emails, generates custom emoji from text descriptions, summarizes long notification threads, transcribes phone calls, removes unwanted objects from photos, and turns Siri from a basic voice assistant into a context-aware AI that understands what is on your screen. This guide covers every major capability and how to use each one.

Which iPhones Support Apple Intelligence

Apple Intelligence requires an iPhone 15 Pro, iPhone 15 Pro Max, or any iPhone 16 model. The hardware requirement exists because these devices contain A17 Pro or A18-series chips with Neural Engines powerful enough to run the on-device language models. No older iPhone models will receive Apple Intelligence, regardless of iOS version.

To enable Apple Intelligence, go to Settings, then Apple Intelligence & Siri, and toggle on Apple Intelligence. You may need to join a waitlist initially, though most regions now have immediate access. The initial setup downloads several gigabytes of AI models to your device, after which most features work offline.

Writing Tools: AI Editing Across Every App

Writing Tools is the most immediately useful Apple Intelligence feature. Select any text you have written in Mail, Notes, Messages, Pages, or any third-party app that uses standard text fields, and tap Writing Tools in the popup menu. You get two core tools plus three tone options. Proofread fixes grammar and spelling, while Rewrite generates an alternative version of your text. The tone options shape that rewrite: Friendly makes the text warmer and more casual, Professional makes it more formal and business-appropriate, and Concise shortens it while preserving the key message.

Proofread is particularly powerful because it does not just flag errors but shows you exactly what changed and why. It catches subject-verb agreement issues, inconsistent tense usage, and awkward phrasing that basic spell-checkers miss entirely. The rewrite function generates a complete alternative draft rather than suggesting individual word changes, which is useful when you want a fresh perspective on how to express an idea.

Notification Summaries

When you receive multiple messages in a group chat, a long email thread, or a burst of social media notifications, Apple Intelligence condenses them into a brief summary that appears on your Lock Screen. Instead of seeing ten individual message previews, you see a single line like “Group discussing dinner plans for Saturday. Maria suggested Italian, three people agreed.”

Notification summaries work automatically for Messages, Mail, and most third-party apps. You can control which apps use summarization in Settings, then Notifications, then Summarize Notifications. Some users prefer to disable summaries for certain apps where they want to see every individual notification rather than a condensed version.

Image Generation With Image Playground and Genmoji

Image Playground lets you create AI-generated images from text descriptions directly on your iPhone. Open the Image Playground app or tap the Image Playground button in Messages, Notes, and other supported apps. Type a description like “golden retriever wearing sunglasses at the beach” and the AI generates an original image in seconds. You can choose between animation, illustration, and sketch styles.

Genmoji extends this to emoji creation. In any text field, open the emoji keyboard and tap the Genmoji button. Describe the emoji you want, and Apple Intelligence creates a custom emoji matching your description. You can also create Genmoji based on photos of people in your Photos library, generating personalized emoji that look like your friends and family.

Clean Up: Remove Objects From Photos

The Clean Up tool in Photos uses AI to identify and remove unwanted objects, people, or distractions from your images. Open a photo in the Photos app, tap Edit, and select Clean Up. The AI automatically highlights objects it can remove (shown with a subtle glow). Tap on any object or brush over it to remove it.

Clean Up intelligently fills in the background behind removed objects using context from surrounding pixels. It works best for removing photobombers from scenic shots, erasing power lines from landscape photos, and cleaning up clutter from product photos. The results are remarkably clean for an on-device tool, though complex removals against detailed backgrounds may show artifacts.

Siri With Apple Intelligence

Apple Intelligence transforms Siri from a command-based assistant into a conversational AI that understands context. The upgraded Siri can see what is on your screen and act on it. If you are looking at a restaurant in Safari, you can say “Add this to my calendar for Friday” and Siri understands that “this” refers to the restaurant and creates a calendar event without you specifying the name or address.

Siri now also understands typed requests. Tap the Siri icon or double-tap the bottom of the screen to type to Siri instead of speaking. This is useful in quiet environments or when you need to make a complex request that is easier to type than say. Siri maintains conversation context across multiple exchanges, so you can ask follow-up questions without repeating background information.

Call Recording and Transcription

iOS 18.1 added native call recording to the Phone app. During a phone call, tap the Record button in the call interface; both parties hear an automated announcement that the call is being recorded. After the call ends, the recording appears in the Notes app, and on Apple Intelligence devices it includes a complete AI-generated transcript alongside the audio.

The transcription is remarkably accurate for clear phone calls and identifies different speakers in the conversation. You can search through transcripts by keyword, making it easy to find specific information discussed during long calls. The recording and transcript stay on your device unless you explicitly share or back them up.

Priority Notifications and Mail Sorting

Apple Intelligence analyzes your incoming emails and prioritizes them based on urgency and content. Time-sensitive messages, such as boarding passes, delivery confirmations, and meeting changes, appear at the top of your inbox with a highlighted badge. The AI learns your email patterns over time and improves its prioritization based on which messages you open first and which you ignore.

In the Mail app, you also see AI-generated previews that go beyond the first line of the email. Instead of showing the greeting (“Hi, hope you’re doing well…”), the preview displays the actual substance of the message (“Requesting project deadline extension to March 15”).

Frequently Asked Questions

Does Apple Intelligence send my data to Apple’s servers?

Most Apple Intelligence processing happens on-device using the Neural Engine. For complex requests that exceed on-device capability, Apple uses Private Cloud Compute, which runs on dedicated Apple silicon servers. Your data on those servers is never accessible to Apple employees, is used only to fulfill your request, and is deleted once the response is returned. Apple also makes Private Cloud Compute software available for independent security researchers to inspect and verify.

Can I turn off specific Apple Intelligence features?

Yes. Go to Settings, then Apple Intelligence & Siri to toggle the entire system on or off. Individual features like notification summaries, writing tools availability, and Siri behavior can be controlled through their respective settings sections.

Why is Apple Intelligence not available on my iPhone 15?

Apple Intelligence requires the A17 Pro chip found only in the iPhone 15 Pro and Pro Max models. The standard iPhone 15 and iPhone 15 Plus use the A16 chip, which lacks the Neural Engine throughput needed for on-device AI model execution. All iPhone 16 models include a sufficiently powerful chip.

Does Apple Intelligence work offline?

Most features work offline after the initial AI model download. Writing tools, notification summaries, photo editing, and basic Siri commands run entirely on-device. Image generation and complex Siri requests that require Private Cloud Compute need an internet connection.

Chris Rossiter

