App Intents: From Siri to Interactive Snippets in iOS 26
Your app has features users love — but they have to open the app, navigate three screens deep, and tap the right button every single time. App Intents let the system surface those features through Siri, Spotlight, the Shortcuts app, Action Button, and — starting in iOS 26 — Interactive Snippets in Visual Intelligence. The framework turns your Swift code into actions the operating system understands natively.
This post covers the full App Intents stack: AppIntent, AppEntity, EntityQuery, AppShortcut, and @AssistantSchema. We’ll also cover iOS 26’s new UndoableIntent, IntentValueQuery, and Interactive Snippets. We won’t cover SiriKit’s legacy INIntent framework — that API is effectively deprecated in favor of App Intents. This guide assumes familiarity with protocols and async/await.
Note: App Intents requires iOS 16+. @AssistantSchema and Apple Intelligence integration require iOS 18+. Interactive Snippets require iOS 26. All code in this post uses Swift 6.2 strict concurrency.
Contents
- The Problem
- Building Your First AppIntent
- Modeling Data with AppEntity and EntityQuery
- AppShortcut: Zero-Tap Discovery
- AssistantSchema and Apple Intelligence
- iOS 26: Interactive Snippets and UndoableIntent
- Advanced Usage
- Performance Considerations
- When to Use (and When Not To)
- Summary
The Problem
Imagine you’re building a Pixar film tracker app. Users can mark films as watched, rate them, and browse by studio. Your most-used feature is “mark as watched” — users do it multiple times a week. But the flow looks like this: unlock phone, find app, open it, scroll to the film, tap the detail view, tap “Mark as Watched.” Six steps for a two-second action.
Without App Intents, the only way to expose this to the system is the legacy SiriKit INIntent framework — which
requires an Intents Definition file, a separate extension target, code generation, and supports only a fixed set of
Apple-defined domains. You cannot create arbitrary actions.
// The old way — limited to Apple's predefined domains
// Intents.intentdefinition → Xcode code generation → INExtension subclass
// You couldn't create "Mark Film as Watched" because there was no matching INIntent domain
App Intents replaces this with pure Swift code. No code generation, no separate targets, no domain restrictions. You define an action, and the system discovers it at build time through metadata extraction.
Building Your First AppIntent
Apple Docs:
AppIntent — App Intents
An AppIntent is a protocol with two requirements: a title and a perform() method. Here’s a minimal intent that
marks a Pixar film as watched:
import AppIntents
struct MarkFilmAsWatchedIntent: AppIntent {
static let title: LocalizedStringResource = "Mark Film as Watched"
static let description: IntentDescription = "Marks a Pixar film as watched in your collection."
@Parameter(title: "Film")
var film: PixarFilmEntity
@MainActor
func perform() async throws -> some IntentResult & ProvidesDialog {
let store = FilmStore.shared
try await store.markAsWatched(film.id)
return .result(dialog: "Done! \(film.title) is marked as watched.")
}
}
A few things to notice. The @Parameter property wrapper declares the intent’s input. Siri will ask the user “Which
film?” if they don’t specify one upfront. The perform() method returns an IntentResult — you compose result
capabilities using protocol composition (ProvidesDialog, ReturnsValue, OpensIntent). The system uses title and
description for display in Shortcuts and Spotlight.
Return Types and Result Composition
The IntentResult protocol is composable. You chain capabilities using &:
// Return a dialog and a value for downstream shortcuts
func perform() async throws -> some IntentResult & ProvidesDialog & ReturnsValue<Bool> {
let store = FilmStore.shared
let wasAlreadyWatched = store.isWatched(film.id)
try await store.markAsWatched(film.id)
return .result(
value: !wasAlreadyWatched,
dialog: wasAlreadyWatched
? "\(film.title) was already in your watched list."
: "Added \(film.title) to your watched list!"
)
}
Other composable protocols include ShowsSnippetView (render a SwiftUI view inline), OpensIntent (chain to another
intent), and ProvidesDialog (spoken/displayed text response).
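As a sketch of OpensIntent chaining (the AppRouter navigation helper and the MarkAndOpenFilmIntent name are hypothetical), one intent can finish its work and then hand off to a second intent that opens the app:

```swift
import AppIntents

// Hypothetical intent that deep-links into the app on a film's detail screen.
struct OpenFilmDetailIntent: AppIntent {
    static let title: LocalizedStringResource = "Open Film Detail"
    // This intent only makes sense with the app in the foreground.
    static let openAppWhenRun: Bool = true

    @Parameter(title: "Film")
    var film: PixarFilmEntity

    init() {}
    init(film: PixarFilmEntity) { self.film = film }

    @MainActor
    func perform() async throws -> some IntentResult {
        AppRouter.shared.showDetail(forFilmID: film.id) // hypothetical router
        return .result()
    }
}

// Chain: mark the film as watched, then open its detail screen.
struct MarkAndOpenFilmIntent: AppIntent {
    static let title: LocalizedStringResource = "Mark Watched and Open"

    @Parameter(title: "Film")
    var film: PixarFilmEntity

    @MainActor
    func perform() async throws -> some IntentResult & OpensIntent {
        try await FilmStore.shared.markAsWatched(film.id)
        // .result(opensIntent:) asks the system to run the next intent.
        return .result(opensIntent: OpenFilmDetailIntent(film: film))
    }
}
```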
Parameter Types
@Parameter supports standard types out of the box — String, Int, Double, Bool, Date, enums conforming to
AppEnum, and any AppEntity. For enums, conform to AppEnum to get automatic Siri disambiguation:
enum FilmGenre: String, AppEnum {
case animation
case liveAction
case documentary
static let typeDisplayRepresentation: TypeDisplayRepresentation = "Film Genre"
static let caseDisplayRepresentations: [FilmGenre: DisplayRepresentation] = [
.animation: "Animation",
.liveAction: "Live Action",
.documentary: "Documentary"
]
}
Modeling Data with AppEntity and EntityQuery
Apple Docs:
AppEntity — App Intents
When an intent needs to reference your app’s domain objects — a specific film, a character, a watchlist — you model them
as AppEntity conformances. This lets Siri, Shortcuts, and Spotlight understand and query your data.
struct PixarFilmEntity: AppEntity {
static let typeDisplayRepresentation: TypeDisplayRepresentation = "Pixar Film"
static let defaultQuery = PixarFilmQuery()
let id: String
let title: String
let releaseYear: Int
let director: String
var displayRepresentation: DisplayRepresentation {
DisplayRepresentation(
title: "\(title)",
subtitle: "\(director) (\(releaseYear))"
)
}
}
The defaultQuery property connects your entity to an EntityQuery — the bridge between your data layer and the
system’s search and disambiguation UI.
EntityQuery
An EntityQuery tells the system how to find entities. At minimum, you implement entities(for:) for ID-based lookup.
For Siri disambiguation and Spotlight integration, add suggestedEntities() and conform to EntityStringQuery for text
search:
struct PixarFilmQuery: EntityQuery {
@MainActor
func entities(for identifiers: [String]) async throws -> [PixarFilmEntity] {
let store = FilmStore.shared
return identifiers.compactMap { id in
guard let film = store.film(withID: id) else { return nil }
return PixarFilmEntity(
id: film.id,
title: film.title,
releaseYear: film.releaseYear,
director: film.director
)
}
}
@MainActor
func suggestedEntities() async throws -> [PixarFilmEntity] {
// Return recently accessed or popular films for quick selection
let store = FilmStore.shared
return store.recentFilms.prefix(10).map { film in
PixarFilmEntity(
id: film.id,
title: film.title,
releaseYear: film.releaseYear,
director: film.director
)
}
}
}
// For text search — enables "Hey Siri, mark Toy Story as watched"
extension PixarFilmQuery: EntityStringQuery {
@MainActor
func entities(matching query: String) async throws -> [PixarFilmEntity] {
let store = FilmStore.shared
return store.search(query).map { film in
PixarFilmEntity(
id: film.id,
title: film.title,
releaseYear: film.releaseYear,
director: film.director
)
}
}
}
When a user says “Mark Toy Story as watched,” Siri calls entities(matching: "Toy Story") and uses the results for
disambiguation. If only one entity matches, it proceeds automatically. If multiple match (e.g., “Toy Story,” “Toy Story
2,” “Toy Story 3”), Siri presents a disambiguation prompt.
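You can also trigger disambiguation manually from inside perform() using the parameter's projected value. A sketch, assuming the collection can contain multiple entries for the same title (original run, re-release):

```swift
import AppIntents

struct MarkReleaseAsWatchedIntent: AppIntent {
    static let title: LocalizedStringResource = "Mark Release as Watched"

    @Parameter(title: "Film")
    var film: PixarFilmEntity

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        let store = FilmStore.shared
        // The resolved title may match several catalog entries.
        let matches = store.search(film.title).map { match in
            PixarFilmEntity(id: match.id, title: match.title,
                            releaseYear: match.releaseYear, director: match.director)
        }
        let chosen: PixarFilmEntity
        if matches.count > 1 {
            // Ask the user which entry they meant; Siri shows a picker.
            chosen = try await $film.requestDisambiguation(
                among: matches,
                dialog: "Which release of \(film.title)?"
            )
        } else {
            chosen = film
        }
        try await store.markAsWatched(chosen.id)
        return .result(dialog: "Marked \(chosen.title) as watched.")
    }
}
```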
EntityPropertyQuery for Filtered Lookups
For complex filtering — “Show all Pixar films directed by Pete Docter” — conform to EntityPropertyQuery and declare queryable properties. Note that the $-prefixed key paths below require the entity to expose those fields through the @Property wrapper; the plain stored properties shown earlier would need that annotation:
extension PixarFilmQuery: EntityPropertyQuery {
static let properties = QueryProperties {
Property(\PixarFilmEntity.$director) {
EqualToComparator { $0 }
ContainsComparator { $0 }
}
Property(\PixarFilmEntity.$releaseYear) {
EqualToComparator { $0 }
GreaterThanComparator { $0 }
LessThanComparator { $0 }
}
}
static let sortingOptions = SortingOptions {
SortableBy(\PixarFilmEntity.$releaseYear)
SortableBy(\PixarFilmEntity.$title)
}
@MainActor
func entities(
matching comparators: [EntityQueryComparator<PixarFilmEntity>],
mode: ComparatorMode,
sortedBy: [EntityQuerySort<PixarFilmEntity>],
limit: Int?
) async throws -> [PixarFilmEntity] {
// Apply comparators to your data store
// Simplified for clarity
let store = FilmStore.shared
return store.allFilms.map { film in
PixarFilmEntity(id: film.id, title: film.title,
releaseYear: film.releaseYear, director: film.director)
}
}
}
AppShortcut: Zero-Tap Discovery
Apple Docs:
AppShortcut — App Intents
AppShortcut makes your intents discoverable without the user configuring anything. These shortcuts appear in
Spotlight, the Shortcuts app, and Siri Suggestions automatically.
struct PixarFilmShortcuts: AppShortcutsProvider {
static var appShortcuts: [AppShortcut] {
AppShortcut(
intent: MarkFilmAsWatchedIntent(),
phrases: [
"Mark \(\.$film) as watched in \(.applicationName)",
"I watched \(\.$film) in \(.applicationName)",
"Add \(\.$film) to my watched list in \(.applicationName)"
],
shortTitle: "Mark as Watched",
systemImageName: "checkmark.circle"
)
AppShortcut(
intent: GetUnwatchedFilmsIntent(),
phrases: [
"What haven't I watched in \(.applicationName)",
"Show unwatched films in \(.applicationName)"
],
shortTitle: "Unwatched Films",
systemImageName: "film.stack"
)
}
}
The phrases array defines the natural language patterns that trigger the shortcut. Use \(\.$parameterName) to
interpolate entity parameters into phrases — Siri maps the user’s spoken entity name to the parameter through your
EntityQuery. Always include \(.applicationName) in at least one phrase variant to help Siri route to the correct
app.
Warning: Apple limits you to a maximum of 10 AppShortcut instances per AppShortcutsProvider. Choose your most impactful actions. Phrases must be unique across your app — duplicate phrases cause build-time warnings and runtime ambiguity.
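One operational detail worth noting: phrases that interpolate an entity parameter are resolved against the entity values the system last captured. When that set changes at runtime, for example when the user adds a film, refresh it with the provider's update API (a sketch, assuming a change-notification hook like this exists in your data layer):

```swift
import AppIntents

// Call whenever the film library changes so Siri can re-resolve the
// \(\.$film) slot in shortcut phrases against the current entities.
@MainActor
func filmLibraryDidChange() {
    PixarFilmShortcuts.updateAppShortcutParameters()
}
```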
AssistantSchema and Apple Intelligence
Apple Docs:
AssistantSchema — App Intents
Starting in iOS 18, Apple Intelligence understands your intents semantically through @AssistantSchema. Instead of
relying purely on phrase matching, the system maps your intent’s parameters and return type to a schema that the
on-device language model can reason about.
@AssistantIntent(schema: .photos.search)
struct SearchFilmsIntent: AppIntent {
static let title: LocalizedStringResource = "Search Films"
@Parameter(title: "Query")
var criteria: StringSearchCriteria
@MainActor
func perform() async throws -> some IntentResult & ReturnsValue<[PixarFilmEntity]> {
let store = FilmStore.shared
let results = store.search(criteria.term)
let entities = results.map { film in
PixarFilmEntity(id: film.id, title: film.title,
releaseYear: film.releaseYear, director: film.director)
}
return .result(value: entities)
}
}
The @AssistantIntent(schema:) macro maps your intent to one of Apple’s predefined semantic schemas — .photos.search,
.journal.createEntry, .system.search, and dozens more. The AI model uses these schemas to understand what your
intent does, not just how to invoke it. This means a user can say “Find me that Pixar movie about the fish” and Apple
Intelligence can route to your search intent even if that exact phrase isn’t in your AppShortcut definitions.
Tip: Browse the available schemas in Xcode’s documentation under
AssistantSchema. Choosing the right schema is critical — it determines how Apple Intelligence interprets your intent’s semantics.
iOS 26: Interactive Snippets and UndoableIntent
iOS 26 introduces two significant additions to the App Intents framework: Interactive Snippets for Visual Intelligence
and UndoableIntent for reversible actions.
Interactive Snippets
Interactive Snippets let your app render a live SwiftUI view inside Visual Intelligence results. When a user points their camera at a movie poster, Visual Intelligence can invoke your intent and display an interactive card — with buttons, ratings, and real-time data — directly in the camera overlay.
@available(iOS 26, *)
struct FilmInfoSnippetIntent: AppIntent {
static let title: LocalizedStringResource = "Show Film Info"
@Parameter(title: "Film")
var film: PixarFilmEntity
@MainActor
func perform() async throws -> some IntentResult & ShowsSnippetView {
let store = FilmStore.shared
guard let filmData = store.film(withID: film.id) else {
throw FilmIntentError.filmNotFound(film.title)
}
return .result {
FilmSnippetView(film: filmData)
}
}
}
@available(iOS 26, *)
struct FilmSnippetView: View {
let film: PixarFilm
var body: some View {
VStack(alignment: .leading, spacing: 8) {
Text(film.title)
.font(.headline)
HStack {
Label("\(film.releaseYear)", systemImage: "calendar")
Spacer()
Label(film.director, systemImage: "person")
}
.font(.subheadline)
.foregroundStyle(.secondary)
if film.isWatched {
Label("Watched", systemImage: "checkmark.circle.fill")
.foregroundStyle(.green)
} else {
// Interactive — the snippet renders out-of-process, so the button
// fires an AppIntent; a view closure would never execute here
Button("Mark as Watched", intent: markAsWatchedIntent)
.buttonStyle(.borderedProminent)
}
}
.padding()
}
// Pre-fill the intent's entity parameter for the snippet button
private var markAsWatchedIntent: MarkFilmAsWatchedIntent {
let intent = MarkFilmAsWatchedIntent()
intent.film = PixarFilmEntity(id: film.id, title: film.title,
releaseYear: film.releaseYear, director: film.director)
return intent
}
}
The ShowsSnippetView result type renders your SwiftUI view inline. The view is fully interactive — buttons, toggles,
and other controls work inside the snippet. This is a significant upgrade from the static text-only responses in earlier
iOS versions.
IntentValueQuery
IntentValueQuery is another iOS 26 addition that lets Visual Intelligence query your app for entity values without
displaying a full UI. This enables the system to pull data from your app as part of a larger intelligence workflow:
@available(iOS 26, *)
struct FilmRatingValueQuery: IntentValueQuery {
static let title: LocalizedStringResource = "Film Rating"
@Parameter(title: "Film")
var film: PixarFilmEntity
@MainActor
func perform() async throws -> Double {
let store = FilmStore.shared
guard let filmData = store.film(withID: film.id) else {
throw FilmIntentError.filmNotFound(film.title)
}
return filmData.rating
}
}
UndoableIntent
UndoableIntent adds a system-level undo capability to your intents. When a user performs an action through Siri or
Shortcuts, the system can offer to reverse it:
@available(iOS 26, *)
struct MarkFilmAsWatchedUndoableIntent: AppIntent, UndoableIntent {
static let title: LocalizedStringResource = "Mark Film as Watched"
@Parameter(title: "Film")
var film: PixarFilmEntity
@MainActor
func perform() async throws -> some IntentResult & ProvidesDialog {
let store = FilmStore.shared
try await store.markAsWatched(film.id)
return .result(dialog: "\(film.title) marked as watched.")
}
@MainActor
func undo() async throws -> some IntentResult & ProvidesDialog {
let store = FilmStore.shared
try await store.markAsUnwatched(film.id)
return .result(dialog: "Undone — \(film.title) is back on your unwatched list.")
}
}
The system surfaces the undo action in the confirmation UI. Siri might say “Done. Want to undo?” and the user can reverse the action with a single voice command.
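Undo covers reversing an action after the fact; for destructive actions you can also confirm before acting. A sketch, assuming the requestConfirmation API behaves as documented and a hypothetical clearWatchedList store method:

```swift
import AppIntents

struct ClearWatchedListIntent: AppIntent {
    static let title: LocalizedStringResource = "Clear Watched List"

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Throws if the user declines, cancelling the intent.
        try await requestConfirmation(
            result: .result(dialog: "Remove all films from your watched list?")
        )
        try await FilmStore.shared.clearWatchedList() // hypothetical
        return .result(dialog: "Watched list cleared.")
    }
}
```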
Advanced Usage
Parameterized Shortcuts with IntentParameter
When your intent has optional parameters with sensible defaults, you can pre-populate them in AppShortcut definitions
to create specialized shortcuts:
struct BrowseFilmsIntent: AppIntent {
static let title: LocalizedStringResource = "Browse Films"
@Parameter(title: "Genre", default: .animation)
var genre: FilmGenre
@Parameter(title: "Minimum Rating", default: 4.0)
var minimumRating: Double
@MainActor
func perform() async throws -> some IntentResult & ReturnsValue<[PixarFilmEntity]> {
let store = FilmStore.shared
let films = store.films(genre: genre, minRating: minimumRating)
let entities = films.map { film in
PixarFilmEntity(id: film.id, title: film.title,
releaseYear: film.releaseYear, director: film.director)
}
return .result(value: entities)
}
}
Intent Discovery Through Spotlight and Donations
The system discovers your intents at build time through metadata extraction — no runtime registration needed. However, you can donate intent instances to improve Siri Suggestions ranking:
@MainActor
func userWatchedFilm(_ film: PixarFilm) async {
// Persist the watch action
try? await FilmStore.shared.markAsWatched(film.id)
// Donate the intent so Siri learns user patterns
let entity = PixarFilmEntity(
id: film.id, title: film.title,
releaseYear: film.releaseYear, director: film.director
)
let intent = MarkFilmAsWatchedIntent()
intent.film = entity
_ = try? await IntentDonationManager.shared.donate(intent: intent)
}
Donations tell the system “the user just did this” so it can proactively suggest the action in Siri Suggestions, Spotlight, and the Lock Screen.
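The flip side: donations should be cleaned up when the underlying data disappears, or Siri may keep suggesting actions on deleted films. A sketch, assuming IntentDonationManager's deletion API and a hypothetical cleanup hook:

```swift
import AppIntents

// When the user clears their collection, drop stale donations so
// Siri Suggestions stop referencing films that no longer exist.
@MainActor
func userClearedCollection() async {
    try? await IntentDonationManager.shared.deleteAllDonations()
}
```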
Error Handling
App Intents uses Swift’s native error handling. Throw a descriptive error from perform() and Siri displays it to the
user:
enum FilmIntentError: Error, CustomLocalizedStringResourceConvertible {
case filmNotFound(String)
case alreadyWatched(String)
case networkUnavailable
var localizedStringResource: LocalizedStringResource {
switch self {
case .filmNotFound(let title):
"Could not find a film named \"\(title)\" in your collection."
case .alreadyWatched(let title):
"\"\(title)\" is already in your watched list."
case .networkUnavailable:
"Unable to sync. Please check your connection and try again."
}
}
}
Conform your error to CustomLocalizedStringResourceConvertible so the system displays a meaningful message instead of
a generic failure.
Warning: Never throw errors with sensitive information (API keys, internal IDs, file paths). The error message is displayed directly to the user and may be read aloud by Siri.
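For missing input specifically, prefer a resolution error over a failure. The parameter's projected value can throw an error that makes Siri prompt for the value instead of aborting (the removeFromWatchlist store method is hypothetical):

```swift
import AppIntents

struct RemoveFromWatchlistIntent: AppIntent {
    static let title: LocalizedStringResource = "Remove from Watchlist"

    @Parameter(title: "Film")
    var film: PixarFilmEntity?

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // No film supplied: ask rather than fail.
        guard let film else {
            throw $film.needsValueError("Which film should I remove?")
        }
        let store = FilmStore.shared
        guard store.film(withID: film.id) != nil else {
            throw FilmIntentError.filmNotFound(film.title)
        }
        try await store.removeFromWatchlist(film.id) // hypothetical
        return .result(dialog: "Removed \(film.title) from your watchlist.")
    }
}
```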
Performance Considerations
App Intents metadata is extracted at build time by the appintentsmetadataprocessor tool. This means:
- No runtime reflection cost. The system reads a static .appintentsmetadata file, not your compiled code. Adding more intents does not slow down app launch.
- Entity queries run on demand. suggestedEntities() is called when the user invokes Siri or opens Shortcuts — not preemptively. Keep these queries fast (under 200ms) to avoid Siri timeout errors.
- Snippet views are rendered out-of-process in iOS 26. Your ShowsSnippetView runs in an extension-like sandbox. Heavy computation or large image loading in the snippet view will cause the system to terminate your snippet with a blank card.
For entity queries backed by a database, use indexed columns and limit result counts. A suggestedEntities() call that returns 1,000 results will time out before Siri can display the disambiguation UI:
// Keep suggested entities fast and bounded
func suggestedEntities() async throws -> [PixarFilmEntity] {
let store = FilmStore.shared
// Cap at 10 — Siri's disambiguation UI can't meaningfully display more
return store.recentFilms.prefix(10).map { film in
PixarFilmEntity(id: film.id, title: film.title,
releaseYear: film.releaseYear, director: film.director)
}
}
Apple Docs:
AppIntentMetadataProcessor — App Intents
When to Use (and When Not To)
| Scenario | Recommendation |
|---|---|
| Feature users repeat frequently | High-value intent candidate. Saves taps. |
| Action that requires visual context (e.g., editing a photo) | Use OpensIntent to hand off to your app’s UI. |
| Background sync or data refresh | Use BackgroundTasks instead — intents are user-initiated. |
| Complex multi-step wizard | Intents work best for single actions. Chain intents in Shortcuts for multi-step flows. |
| Apple Intelligence integration (iOS 18+) | @AssistantSchema is required. Without it, your intent is invisible to the AI layer. |
| Visual Intelligence snippets (iOS 26+) | Use ShowsSnippetView for interactive cards. Keep views lightweight. |
| Action that should be reversible | Adopt UndoableIntent (iOS 26+) for a system-level undo prompt. |
| Legacy SiriKit INIntent migration | Migrate to App Intents. Apple has effectively deprecated INIntent for all new development. |
Summary
- AppIntent is a pure-Swift protocol — define a title and a perform() method, and the system discovers it at build time with zero runtime overhead.
- AppEntity and EntityQuery bridge your domain models to Siri’s disambiguation and Spotlight’s search, enabling natural language references like “Mark Toy Story as watched.”
- AppShortcut with AppShortcutsProvider gives your intents zero-configuration discoverability in Siri, Spotlight, and the Shortcuts app.
- @AssistantSchema (iOS 18+) connects your intents to Apple Intelligence’s semantic understanding, enabling flexible natural language routing beyond exact phrase matching.
- iOS 26 adds Interactive Snippets (ShowsSnippetView) for live SwiftUI views in Visual Intelligence, UndoableIntent for reversible actions, and IntentValueQuery for data lookups without UI.
App Intents is the primary integration point for Apple Intelligence. If your app isn’t exposing intents, it’s invisible to the system’s AI layer. Next, explore Build an App with Siri for a hands-on tutorial building a complete Siri-integrated app, or dive into Foundation Models to see how on-device AI can power your intent responses.