Build an On-Device AI App with Foundation Models: Smart Assistant Without the Cloud
What if your app could brainstorm movie plots, flesh out character backstories, and suggest story arcs — all without a single network request leaving the device? With the Foundation Models framework introduced at WWDC 2025, Apple gave us exactly that: a large language model running entirely on-device, accessible through a Swift-native API that feels right at home next to SwiftUI and SwiftData.
In this tutorial, you’ll build Pixar Plot Assistant — an on-device AI app that helps writers brainstorm Pixar-style
movie plots, generate structured character descriptions, and suggest story arcs using tool-powered context. Along the
way, you’ll learn how to create a
LanguageModelSession, stream
responses token-by-token, extract structured data with
@Generable, and extend the model’s
capabilities with the Tool protocol.
Prerequisites
- Xcode 26+ with iOS 26 deployment target
- A device with Apple Silicon (the Foundation Models framework requires on-device Apple Intelligence; the Simulator supports it on Apple Silicon Macs)
- Familiarity with the Foundation Models framework
- Familiarity with @Generable structured output
- Familiarity with Tool calling in Foundation Models
Contents
- Getting Started
- Step 1: Defining the Data Models
- Step 2: Building the Chat Engine
- Step 3: Creating the Chat Interface
- Step 4: Streaming Responses Token-by-Token
- Step 5: Extracting Structured Character Profiles with @Generable
- Step 6: Building the Character Profile View
- Step 7: Creating a Custom Tool for Story Context
- Step 8: Wiring the Tool into the Session
- Step 9: Adding a Story Arc Generator
- Step 10: Polish and Final Integration
- Where to Go From Here?
Getting Started
Let’s create the project and configure it for the Foundation Models framework.
- Open Xcode 26 and create a new project using the App template.
- Set the product name to PixarPlotAssistant.
- Ensure the interface is SwiftUI and the language is Swift.
- Set the deployment target to iOS 26.0.
The Foundation Models framework ships with iOS 26 and requires no additional Swift packages. However, it does require the device to support Apple Intelligence. Let’s add an availability check right away.
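Beyond the OS-version guard we're about to add, the on-device model can also be unavailable at runtime, for example when Apple Intelligence is switched off in Settings or the model assets are still downloading. The SystemLanguageModel type exposes this state. Here's a small sketch you could adapt later (the log messages are illustrative):

```swift
import FoundationModels

// Sketch: check whether the on-device model can serve requests
// before creating a LanguageModelSession.
func checkModelAvailability() {
    switch SystemLanguageModel.default.availability {
    case .available:
        print("On-device model is ready.")
    case .unavailable(let reason):
        // Reasons include an ineligible device, Apple Intelligence
        // being disabled, or model assets not yet downloaded.
        print("Model unavailable: \(reason)")
    }
}
```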
Open PixarPlotAssistantApp.swift and replace its contents with:
import SwiftUI
@main
struct PixarPlotAssistantApp: App {
var body: some Scene {
WindowGroup {
ContentView()
}
}
}
Now open ContentView.swift and replace it with a simple availability guard:
import SwiftUI
import FoundationModels
struct ContentView: View {
var body: some View {
if #available(iOS 26, *) {
PlotAssistantView()
} else {
Text("Pixar Plot Assistant requires iOS 26 or later.")
.font(.headline)
.padding()
}
}
}
We reference PlotAssistantView here, which we’ll build in the coming steps. For now, create a placeholder file named
PlotAssistantView.swift and add:
import SwiftUI
struct PlotAssistantView: View {
var body: some View {
Text("Pixar Plot Assistant")
.font(.largeTitle)
}
}
Checkpoint: Build and run your project. You should see “Pixar Plot Assistant” displayed in large text at the center of the screen. If you see the fallback message about iOS 26, ensure your deployment target and Simulator are set to iOS 26.
Step 1: Defining the Data Models
Before we wire up any AI, we need data models that represent our chat messages and the structured outputs the model will produce. A clear data layer makes everything downstream cleaner.
Create a new file called Models/ChatMessage.swift and add the following:
import Foundation
struct ChatMessage: Identifiable {
let id = UUID()
let role: Role
var content: String
let timestamp: Date
enum Role {
case user
case assistant
}
init(role: Role, content: String) {
self.role = role
self.content = content
self.timestamp = .now
}
}
Each message has a role (the user or the assistant), content that may grow as the model streams tokens, and a
timestamp for ordering. The Identifiable conformance makes it easy to use in SwiftUI lists.
Next, create Models/PixarCharacter.swift. This model will hold the structured data we extract from the AI later using
@Generable:
import Foundation
import FoundationModels
@Generable
struct PixarCharacter {
@Guide(description: "The character's full name")
var name: String
@Guide(description: "The character's species or type, e.g., toy, monster, fish, robot, emotion")
var species: String
@Guide(description: "A one-sentence personality summary")
var personality: String
@Guide(description: "The character's core motivation or desire that drives the story")
var motivation: String
@Guide(description: "The character's greatest fear or internal conflict")
var flaw: String
@Guide(description: "A memorable catchphrase the character might say")
var catchphrase: String
}
The @Generable macro tells Foundation Models that this struct can be generated as structured output. Each @Guide
annotation provides the model with a description of what that field should contain — think of them as instructions for
each slot the AI needs to fill.
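A plain description is often enough, but @Guide can also take generation guides that constrain a field's value outright. The sketch below assumes the .anyOf guide from the GenerationGuide API; the species list is just an example, so check the documentation for the full set of available guides:

```swift
import FoundationModels

// Sketch: constraining a @Generable field with a generation guide.
@Generable
struct ConstrainedCharacter {
    // Force the model to choose from a fixed list of species
    // instead of inventing its own.
    @Guide(description: "The character's species", .anyOf(["toy", "monster", "fish", "robot", "emotion"]))
    var species: String
}
```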
Finally, create Models/StoryArc.swift for our story arc generator:
import Foundation
import FoundationModels
@Generable
struct StoryArc {
@Guide(description: "A compelling title for the Pixar movie")
var movieTitle: String
@Guide(description: "The story's setting — where and when it takes place")
var setting: String
@Guide(description: "The inciting incident that kicks off the adventure")
var incitingIncident: String
@Guide(description: "The main conflict or challenge the characters face")
var centralConflict: String
@Guide(description: "The emotional low point where all seems lost")
var darkestMoment: String
@Guide(description: "How the characters overcome the conflict and what they learn")
var resolution: String
@Guide(description: "The emotional theme or moral of the story, in one sentence")
var theme: String
}
This mirrors classic Pixar storytelling structure. From the inciting incident in Finding Nemo (Nemo gets taken) to the darkest moment in Toy Story 3 (the incinerator scene), every great Pixar film hits these beats. Our AI will generate them all in a single structured response.
Checkpoint: Build the project (Cmd+B). Everything should compile without errors. You won’t see any visible changes yet — we’re laying the foundation that the AI engine will use.
Step 2: Building the Chat Engine
Now for the heart of the app: a class that manages a
LanguageModelSession and handles
sending messages, receiving responses, and maintaining conversation context.
Create a new file called Engine/ChatEngine.swift:
import Foundation
import FoundationModels
import Observation
@Observable
@MainActor
class ChatEngine {
var messages: [ChatMessage] = []
var isGenerating = false
var errorMessage: String?
private var session: LanguageModelSession
init() {
self.session = LanguageModelSession(
instructions: """
You are the Pixar Plot Assistant, a creative writing partner \
who specializes in brainstorming Pixar-style animated movie \
plots. You love heartfelt stories about unlikely friendships, \
hidden worlds, and characters who grow by facing their fears. \
Keep your responses fun, imaginative, and family-friendly. \
When describing characters, think about what makes Pixar \
characters iconic: clear motivations, endearing flaws, and \
memorable catchphrases. Reference real Pixar films when it \
helps illustrate a point.
"""
)
}
}
The LanguageModelSession initializer takes a system instructions string that shapes how the model behaves throughout
the conversation. We set up our assistant as a Pixar storytelling expert. The @Observable macro from the Observation
framework lets SwiftUI automatically track changes to messages and isGenerating.
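The first request to a fresh session can pay a model-loading cost. If you know a request is imminent, you can hint the system to load the model early with prewarm(). A one-line sketch (prewarming is a hint, not a guarantee):

```swift
// At the end of ChatEngine.init(), optionally hint that a
// request is coming soon so the model loads ahead of time.
session.prewarm()
```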
Now add the method that sends a user prompt and gets a complete response:
// Add this method inside ChatEngine
func send(_ text: String) async {
let userMessage = ChatMessage(role: .user, content: text)
messages.append(userMessage)
isGenerating = true
errorMessage = nil
do {
let response = try await session.respond(to: text)
let assistantMessage = ChatMessage(
role: .assistant,
content: response.content
)
messages.append(assistantMessage)
} catch let error as LanguageModelSession.GenerationError {
errorMessage = "Generation failed: \(error.localizedDescription)"
} catch {
errorMessage = "Unexpected error: \(error.localizedDescription)"
}
isGenerating = false
}
The respond(to:) method sends the user’s text to the on-device model and returns the full response once generation is
complete. The session automatically maintains conversation history, so follow-up questions work naturally — just like
chatting with a co-writer who remembers everything you’ve discussed.
Note: The LanguageModelSession keeps track of conversation context internally. Each call to respond(to:) includes the full conversation history, so the model can reference earlier messages. You don't need to manage history yourself.
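If you ever want to inspect that history yourself, for debugging or export, the session exposes it through its transcript property. A hedged sketch (it assumes Transcript iterates its entries; the printed form is whatever each entry's description provides):

```swift
// Sketch: dump the prompts and responses the session has recorded.
func debugPrintHistory(of session: LanguageModelSession) {
    for entry in session.transcript {
        print(entry)
    }
}
```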
Step 3: Creating the Chat Interface
With our engine ready, let’s build the chat UI. We’ll create a scrollable message list with a text input at the bottom — a familiar pattern from any messaging app.
Open PlotAssistantView.swift and replace its contents:
import SwiftUI
struct PlotAssistantView: View {
@State private var engine = ChatEngine()
@State private var inputText = ""
@State private var selectedTab = 0
var body: some View {
TabView(selection: $selectedTab) {
Tab("Chat", systemImage: "bubble.left.and.bubble.right", value: 0) {
chatView
}
Tab("Characters", systemImage: "person.2", value: 1) {
CharacterProfileView(engine: engine)
}
Tab("Story Arc", systemImage: "book", value: 2) {
StoryArcView(engine: engine)
}
}
}
private var chatView: some View {
NavigationStack {
VStack(spacing: 0) {
messageList
inputBar
}
.navigationTitle("Pixar Plot Assistant")
.navigationBarTitleDisplayMode(.inline)
}
}
}
We’re using a TabView with three tabs: a free-form chat, a character profile generator, and a story arc generator. The
CharacterProfileView and StoryArcView don’t exist yet — we’ll build them in later steps.
Add the message list and input bar below the body:
// Add these computed properties inside PlotAssistantView
private var messageList: some View {
ScrollViewReader { proxy in
ScrollView {
LazyVStack(alignment: .leading, spacing: 12) {
if engine.messages.isEmpty {
welcomePrompt
}
ForEach(engine.messages) { message in
MessageBubble(message: message)
.id(message.id)
}
if engine.isGenerating {
ProgressView("Brainstorming...")
.padding()
}
}
.padding()
}
.onChange(of: engine.messages.count) {
if let last = engine.messages.last {
withAnimation {
proxy.scrollTo(last.id, anchor: .bottom)
}
}
}
}
}
private var welcomePrompt: some View {
VStack(spacing: 16) {
Image(systemName: "sparkles")
.font(.system(size: 48))
.foregroundStyle(.purple)
Text("Welcome to Pixar Plot Assistant!")
.font(.title2.bold())
Text("Ask me to brainstorm movie plots, create characters, or develop story arcs. Try something like:")
.font(.subheadline)
.foregroundStyle(.secondary)
.multilineTextAlignment(.center)
VStack(alignment: .leading, spacing: 8) {
SuggestionChip("What if toys could talk?")
SuggestionChip("Create a villain who's actually sympathetic")
SuggestionChip("Pitch a Pixar movie set inside a library")
}
}
.padding(.vertical, 40)
}
private var inputBar: some View {
HStack(spacing: 12) {
TextField("Brainstorm a Pixar plot...", text: $inputText)
.textFieldStyle(.roundedBorder)
.onSubmit { sendMessage() }
Button {
sendMessage()
} label: {
Image(systemName: "arrow.up.circle.fill")
.font(.title2)
}
.disabled(inputText.isEmpty || engine.isGenerating)
}
.padding()
.background(.bar)
}
private func sendMessage() {
let text = inputText.trimmingCharacters(in: .whitespacesAndNewlines)
guard !text.isEmpty else { return }
inputText = ""
Task {
await engine.send(text)
}
}
Now create two small supporting views. Add Views/MessageBubble.swift:
import SwiftUI
struct MessageBubble: View {
let message: ChatMessage
var body: some View {
HStack {
if message.role == .user { Spacer(minLength: 60) }
Text(message.content)
.padding(12)
.background(
message.role == .user
? Color.purple.opacity(0.2)
: Color(.systemGray6)
)
.clipShape(RoundedRectangle(cornerRadius: 16))
if message.role == .assistant { Spacer(minLength: 60) }
}
}
}
And Views/SuggestionChip.swift:
import SwiftUI
struct SuggestionChip: View {
let text: String
init(_ text: String) {
self.text = text
}
var body: some View {
Text(text)
.font(.callout)
.padding(.horizontal, 12)
.padding(.vertical, 8)
.background(Color.purple.opacity(0.1))
.clipShape(Capsule())
}
}
Checkpoint: Build and run the app. You should see a purple sparkles icon with the welcome prompt and suggestion chips. Type “Pitch a Pixar movie about a forgotten crayon” and tap send. After a moment, the assistant should respond with a creative Pixar-style plot idea. The message bubbles should appear in a scrolling chat with user messages on the right (purple tint) and assistant messages on the left (gray).
Step 4: Streaming Responses Token-by-Token
Right now, the user sends a message and waits for the entire response to appear at once. That’s a fine approach, but for a creative writing assistant, watching words flow onto the screen feels much more natural — like a co-writer brainstorming in real time. Let’s add streaming.
Open Engine/ChatEngine.swift and add a new method:
// Add this method inside ChatEngine
func sendStreaming(_ text: String) async {
let userMessage = ChatMessage(role: .user, content: text)
messages.append(userMessage)
isGenerating = true
errorMessage = nil
// Create a placeholder assistant message that we'll fill incrementally
let assistantMessage = ChatMessage(role: .assistant, content: "")
messages.append(assistantMessage)
let assistantIndex = messages.count - 1
do {
let stream = session.streamResponse(to: text)
for try await snapshot in stream {
// Each snapshot is the model's cumulative output so far,
// so we replace the content rather than appending to it.
messages[assistantIndex].content = snapshot.content
}
} catch let error as LanguageModelSession.GenerationError {
errorMessage = "Generation failed: \(error.localizedDescription)"
} catch {
errorMessage = "Unexpected error: \(error.localizedDescription)"
}
isGenerating = false
}
The key difference is streamResponse(to:) instead of respond(to:). This returns an async sequence of snapshots: each one is the model's cumulative output so far, not an individual token delta, which is why we assign rather than append. We add a placeholder assistant message and then replace its content whenever a new snapshot arrives. Because ChatEngine is @Observable, SwiftUI automatically re-renders the message bubble as the response grows.
Tip: Streaming is particularly helpful for longer responses like plot outlines or character backstories where the model might generate several paragraphs. Without streaming, the user would stare at a loading indicator for 10-15 seconds — with it, they see progress immediately.
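Streaming isn't limited to plain text. You can also stream a @Generable type, in which case each snapshot carries a partially generated version of the struct whose fields are optional until the model fills them in. A sketch using the PixarCharacter from Step 1 (the snapshot shape follows the streaming semantics described above):

```swift
import FoundationModels

// Sketch: stream a structured character as it's generated.
func streamCharacter(from prompt: String) async throws {
    let session = LanguageModelSession()
    let stream = session.streamResponse(
        to: prompt,
        generating: PixarCharacter.self
    )
    for try await snapshot in stream {
        // Fields are optional until generated, so provide fallbacks.
        print(snapshot.content.name ?? "…naming…")
    }
}
```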
Now update the sendMessage function in PlotAssistantView.swift to use streaming:
// In PlotAssistantView, update sendMessage()
private func sendMessage() {
let text = inputText.trimmingCharacters(in: .whitespacesAndNewlines)
guard !text.isEmpty else { return }
inputText = ""
Task {
await engine.sendStreaming(text) // ← Changed to streaming
}
}
Checkpoint: Build and run. Send the message “Describe a Pixar movie where kitchen appliances come to life at night.” Watch the response appear word by word in the assistant bubble, streaming in real time like WALL-E slowly spelling out “E-V-A.” The text should flow smoothly without any flickering.
Step 5: Extracting Structured Character Profiles with @Generable
Free-form chat is great for brainstorming, but sometimes you want structured output — a character profile with specific
fields filled in. This is where @Generable shines. We defined our PixarCharacter struct with @Generable back in
Step 1. Now let’s use it.
Open Engine/ChatEngine.swift and add a method that generates a structured character:
// Add this method inside ChatEngine
func generateCharacter(from description: String) async -> PixarCharacter? {
isGenerating = true
errorMessage = nil
do {
let session = LanguageModelSession(
instructions: """
You are a Pixar character designer. Given a character concept, \
generate a detailed character profile suitable for a Pixar \
animated film. The character should feel like they belong in \
the Pixar universe — endearing, layered, and memorable. Think \
about what makes characters like Woody, Dory, and WALL-E \
so beloved: clear motivations, lovable flaws, and lines that \
audiences quote for decades.
"""
)
let response = try await session.respond(
to: description,
generating: PixarCharacter.self
)
isGenerating = false
return response.content
} catch {
errorMessage = "Character generation failed: \(error.localizedDescription)"
isGenerating = false
return nil
}
}
Notice the generating: PixarCharacter.self parameter — this tells the session to constrain its output to match our
@Generable struct exactly. Instead of free-form text, the model fills in each field guided by the @Guide
descriptions we wrote. The result is a fully populated PixarCharacter instance, not a string you have to parse.
We use a separate LanguageModelSession here with its own instructions tailored to character design. This keeps the
character generation focused without contaminating the main chat session’s context.
Apple Docs: @Generable — Foundation Models
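If you want the one-off character session to take bigger creative swings, you can pass GenerationOptions to the request. A sketch (the temperature value is illustrative; higher values trade consistency for inventiveness):

```swift
import FoundationModels

// Sketch: a more adventurous variant of generateCharacter(from:).
func generateWildCharacter(from description: String) async throws -> PixarCharacter {
    let session = LanguageModelSession(
        instructions: "You are a Pixar character designer."
    )
    let response = try await session.respond(
        to: description,
        generating: PixarCharacter.self,
        options: GenerationOptions(temperature: 1.2)
    )
    return response.content
}
```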
Now add a property to store generated characters:
// Add this property to ChatEngine
var generatedCharacters: [PixarCharacter] = []
And a convenience method that generates and stores:
// Add this method inside ChatEngine
func createCharacter(from prompt: String) async {
if let character = await generateCharacter(from: prompt) {
generatedCharacters.append(character)
}
}
Step 6: Building the Character Profile View
Let’s build the UI for our Characters tab. This view lets the user describe a character concept and receive a structured profile card.
Create Views/CharacterProfileView.swift:
import SwiftUI
struct CharacterProfileView: View {
let engine: ChatEngine
@State private var characterPrompt = ""
var body: some View {
NavigationStack {
VStack(spacing: 0) {
characterList
characterInputBar
}
.navigationTitle("Character Lab")
.navigationBarTitleDisplayMode(.inline)
}
}
private var characterList: some View {
ScrollView {
LazyVStack(spacing: 16) {
if engine.generatedCharacters.isEmpty && !engine.isGenerating {
emptyState
}
ForEach(
Array(engine.generatedCharacters.enumerated()),
id: \.offset
) { _, character in
CharacterCard(character: character)
}
if engine.isGenerating {
ProgressView("Designing character...")
.padding()
}
}
.padding()
}
}
private var emptyState: some View {
VStack(spacing: 12) {
Image(systemName: "person.crop.rectangle.badge.plus")
.font(.system(size: 44))
.foregroundStyle(.purple)
Text("No Characters Yet")
.font(.title3.bold())
Text("Describe a character concept and the AI will generate a full Pixar-style profile. Try:")
.font(.subheadline)
.foregroundStyle(.secondary)
.multilineTextAlignment(.center)
Text("\"A shy octopus who works as a librarian in an underwater city\"")
.font(.callout.italic())
.foregroundStyle(.purple)
}
.padding(.vertical, 40)
}
private var characterInputBar: some View {
HStack(spacing: 12) {
TextField(
"Describe a character concept...",
text: $characterPrompt
)
.textFieldStyle(.roundedBorder)
.onSubmit { createCharacter() }
Button {
createCharacter()
} label: {
Image(systemName: "wand.and.stars")
.font(.title2)
}
.disabled(characterPrompt.isEmpty || engine.isGenerating)
}
.padding()
.background(.bar)
}
private func createCharacter() {
let prompt = characterPrompt.trimmingCharacters(
in: .whitespacesAndNewlines
)
guard !prompt.isEmpty else { return }
characterPrompt = ""
Task {
await engine.createCharacter(from: prompt)
}
}
}
Now create the card component at Views/CharacterCard.swift:
import SwiftUI
struct CharacterCard: View {
let character: PixarCharacter
var body: some View {
VStack(alignment: .leading, spacing: 12) {
// Header
HStack {
Image(systemName: "person.fill")
.font(.title2)
.foregroundStyle(.white)
.frame(width: 44, height: 44)
.background(Color.purple.gradient)
.clipShape(Circle())
VStack(alignment: .leading) {
Text(character.name)
.font(.headline)
Text(character.species)
.font(.subheadline)
.foregroundStyle(.secondary)
}
Spacer()
}
Divider()
// Profile fields
profileRow(icon: "brain.head.profile", label: "Personality", value: character.personality)
profileRow(icon: "heart.fill", label: "Motivation", value: character.motivation)
profileRow(icon: "exclamationmark.triangle", label: "Flaw", value: character.flaw)
// Catchphrase
HStack(alignment: .top, spacing: 8) {
Image(systemName: "quote.opening")
.foregroundStyle(.purple)
Text(character.catchphrase)
.font(.callout.italic())
Image(systemName: "quote.closing")
.foregroundStyle(.purple)
}
.padding(.top, 4)
}
.padding()
.background(Color(.systemBackground))
.clipShape(RoundedRectangle(cornerRadius: 16))
.shadow(color: .black.opacity(0.1), radius: 8, y: 4)
}
private func profileRow(
icon: String,
label: String,
value: String
) -> some View {
HStack(alignment: .top, spacing: 8) {
Image(systemName: icon)
.foregroundStyle(.purple)
.frame(width: 20)
VStack(alignment: .leading, spacing: 2) {
Text(label)
.font(.caption)
.foregroundStyle(.secondary)
Text(value)
.font(.callout)
}
}
}
}
Checkpoint: Build and run the app. Switch to the Characters tab. You should see the empty state with a person icon and the suggestion to describe a character. Type “A nervous lamp who’s afraid of the dark but works as a lighthouse keeper” and tap the wand button. After a few seconds, a character card should appear with a name, species, personality, motivation, flaw, and a catchphrase — all structured and beautifully laid out. The card should feel like something you’d see on a Pixar character design sheet, like the ones they create for Woody or Buzz before animation begins.
Step 7: Creating a Custom Tool for Story Context
Here’s where things get really interesting. Foundation Models supports the
Tool protocol, which lets the model call functions
in your app to retrieve data it needs. Think of it like giving Remy from Ratatouille access to a pantry — the model
can reach for ingredients (data) when crafting its responses.
We’ll build a tool that provides the model with a catalog of classic Pixar movies, so it can reference real films when brainstorming new plots.
Create Tools/PixarMovieCatalogTool.swift:
import Foundation
import FoundationModels
struct PixarMovie {
let title: String
let year: Int
let theme: String
let protagonistArchetype: String
let emotionalCore: String
}
@Generable
struct MovieQuery {
@Guide(description: "The theme or topic to search for in the Pixar catalog, e.g., 'friendship', 'family', 'adventure'")
var theme: String
}
@Generable
struct MovieCatalogResult {
@Guide(description: "A formatted summary of relevant Pixar movies matching the query, with their themes and emotional cores")
var summary: String
}
struct PixarMovieCatalogTool: Tool {
let name = "searchPixarCatalog"
let description = """
Searches a catalog of Pixar movies by theme or topic. \
Returns matching movies with their themes, protagonist \
archetypes, and emotional cores. Use this to reference \
real Pixar films when brainstorming new stories.
"""
func call(arguments: MovieQuery) async throws -> MovieCatalogResult {
let catalog: [PixarMovie] = [
PixarMovie(
title: "Toy Story",
year: 1995,
theme: "friendship, jealousy, belonging",
protagonistArchetype: "The leader afraid of being replaced",
emotionalCore: "You are loved even when you feel replaced"
),
PixarMovie(
title: "Finding Nemo",
year: 2003,
theme: "parenthood, overprotection, letting go",
protagonistArchetype: "The anxious parent",
emotionalCore: "Love means trusting those you care about"
),
PixarMovie(
title: "The Incredibles",
year: 2004,
theme: "family, identity, hiding your gifts",
protagonistArchetype: "The hero forced into normalcy",
emotionalCore: "Being extraordinary is a family affair"
),
PixarMovie(
title: "WALL-E",
year: 2008,
theme: "loneliness, environmentalism, love",
protagonistArchetype: "The lonely optimist",
emotionalCore: "Connection is worth crossing the universe for"
),
PixarMovie(
title: "Up",
year: 2009,
theme: "grief, adventure, unexpected friendship",
protagonistArchetype: "The grieving recluse",
emotionalCore: "Adventure is out there — and so is healing"
),
PixarMovie(
title: "Inside Out",
year: 2015,
theme: "emotions, growing up, sadness",
protagonistArchetype: "The emotion who thinks she knows best",
emotionalCore: "Sadness is essential to emotional health"
),
PixarMovie(
title: "Coco",
year: 2017,
theme: "family, death, memory, music",
protagonistArchetype: "The dreamer defying tradition",
emotionalCore: "The dead are only gone when forgotten"
),
PixarMovie(
title: "Soul",
year: 2020,
theme: "purpose, passion, what makes life worth living",
protagonistArchetype: "The achiever who missed the point",
emotionalCore: "Your spark isn't your purpose — it's your love of living"
),
PixarMovie(
title: "Turning Red",
year: 2022,
theme: "adolescence, cultural identity, mother-daughter",
protagonistArchetype: "The people-pleaser finding her own identity",
emotionalCore: "Growing up means embracing all of who you are"
),
PixarMovie(
title: "Elemental",
year: 2023,
theme: "immigration, cross-cultural love, identity",
protagonistArchetype: "The dutiful child torn between worlds",
emotionalCore: "Differences can be strengths, not barriers"
),
]
let queryTheme = arguments.theme.lowercased()
let matches = catalog.filter { movie in
movie.theme.lowercased().contains(queryTheme) ||
movie.title.lowercased().contains(queryTheme) ||
movie.emotionalCore.lowercased().contains(queryTheme)
}
let results = matches.isEmpty ? catalog.prefix(3) : matches.prefix(5)
let summary = results.map { movie in
"""
\(movie.title) (\(movie.year))
Theme: \(movie.theme)
Protagonist: \(movie.protagonistArchetype)
Emotional core: \(movie.emotionalCore)
"""
}.joined(separator: "\n\n")
return MovieCatalogResult(
summary: matches.isEmpty
? "No exact matches found. Here are some reference films:\n\n\(summary)"
: "Found \(matches.count) relevant film(s):\n\n\(summary)"
)
}
}
The Tool protocol requires three things: a name the model uses to invoke it, a description the model reads to decide when to use it, and a call(arguments:) method that does the actual work. The arguments type must be @Generable so the model can produce valid, structured input. Whatever you return is fed back to the model as context for the rest of its response; a @Generable struct like MovieCatalogResult works, and so does a plain String.
Our catalog is hardcoded for simplicity, but in a production app you could query a database, call an API, or search a local SwiftData store.
Apple Docs: Tool — Foundation Models
Step 8: Wiring the Tool into the Session
Now we need to give the model access to our tool. In Foundation Models, tools are attached when the session is created; there is no per-request tools parameter. Once registered, the model reads each tool's description and decides on its own when to invoke it.
Open Engine/ChatEngine.swift and update init() so the main chat session carries our tool:
// Update init() inside ChatEngine
init() {
self.session = LanguageModelSession(
tools: [PixarMovieCatalogTool()],
instructions: """
You are the Pixar Plot Assistant, a creative writing partner \
who specializes in brainstorming Pixar-style animated movie \
plots. You love heartfelt stories about unlikely friendships, \
hidden worlds, and characters who grow by facing their fears. \
Keep your responses fun, imaginative, and family-friendly. \
When describing characters, think about what makes Pixar \
characters iconic: clear motivations, endearing flaws, and \
memorable catchphrases. Reference real Pixar films when it \
helps illustrate a point.
"""
)
}
The only change is the tools parameter on the LanguageModelSession initializer. When the model encounters a question about Pixar films — like "What Pixar movies explore the theme of family?" — it can automatically invoke our
PixarMovieCatalogTool, get the results, and incorporate them into its response. The tool calls happen transparently within the stream.
Because sendStreaming already uses this session, the Chat tab gains tool access without any changes to PlotAssistantView.swift: sendMessage keeps calling engine.sendStreaming(text).
Checkpoint: Build and run the app. In the Chat tab, type “I want to write a Pixar movie about grief — what existing Pixar films handle that theme well, and how can I do something different?” The model should now reference specific films from our catalog — you should see mentions of Up and its themes of grief and unexpected friendship, plus Coco and its exploration of death and memory. The response will weave these real references into its brainstorming, like a well-read co-writer who’s studied the entire Pixar filmography. Without the tool, the model would rely only on its training data; with it, we guarantee accurate, structured references.
Step 9: Adding a Story Arc Generator
Our third tab generates a complete story arc using the StoryArc struct we defined earlier. This combines structured
output with our Pixar storytelling knowledge.
Add a story arc generation method to Engine/ChatEngine.swift:
// Add these properties and methods to ChatEngine
var generatedStoryArcs: [StoryArc] = []
var storyArcGenerationProgress: String = ""
func generateStoryArc(from concept: String) async {
isGenerating = true
errorMessage = nil
storyArcGenerationProgress = "Brainstorming story beats..."
do {
let session = LanguageModelSession(
tools: [PixarMovieCatalogTool()],
instructions: """
You are a Pixar story architect. Given a movie concept, \
generate a complete story arc following classic Pixar \
storytelling structure. Every great Pixar film has: \
a vivid setting, an inciting incident that disrupts the \
status quo, a central conflict that tests the characters, \
a darkest moment where all hope seems lost (like the \
incinerator in Toy Story 3 or Bing Bong's sacrifice in \
Inside Out), and a resolution that delivers emotional \
catharsis. The theme should be a universal truth about \
the human experience.
"""
)
let response = try await session.respond(
to: """
Create a complete Pixar movie story arc based on this concept: \
\(concept). Use the Pixar catalog tool to reference existing \
films for inspiration, but make the story original.
""",
generating: StoryArc.self
)
generatedStoryArcs.append(response.content)
storyArcGenerationProgress = ""
} catch {
errorMessage = "Story arc generation failed: \(error.localizedDescription)"
storyArcGenerationProgress = ""
}
isGenerating = false
}
Here we combine three powerful features: a specialized session with screenwriter instructions, @Generable structured
output for the StoryArc, and tool access so the model can reference real Pixar films. The result is a fully structured
story arc with every beat filled in.
Now create Views/StoryArcView.swift:
import SwiftUI
struct StoryArcView: View {
let engine: ChatEngine
@State private var conceptText = ""
var body: some View {
NavigationStack {
VStack(spacing: 0) {
arcList
arcInputBar
}
.navigationTitle("Story Architect")
.navigationBarTitleDisplayMode(.inline)
}
}
private var arcList: some View {
ScrollView {
LazyVStack(spacing: 20) {
if engine.generatedStoryArcs.isEmpty && !engine.isGenerating {
emptyState
}
ForEach(
Array(engine.generatedStoryArcs.enumerated()),
id: \.offset
) { _, arc in
StoryArcCard(arc: arc)
}
if engine.isGenerating {
VStack(spacing: 8) {
ProgressView()
Text(engine.storyArcGenerationProgress)
.font(.caption)
.foregroundStyle(.secondary)
}
.padding()
}
}
.padding()
}
}
private var emptyState: some View {
VStack(spacing: 12) {
Image(systemName: "book.pages")
.font(.system(size: 44))
.foregroundStyle(.purple)
Text("No Story Arcs Yet")
.font(.title3.bold())
Text("Describe a movie concept and the AI will generate a complete Pixar-style story arc. Try:")
.font(.subheadline)
.foregroundStyle(.secondary)
.multilineTextAlignment(.center)
Text("\"A world where dreams are manufactured in a factory\"")
.font(.callout.italic())
.foregroundStyle(.purple)
}
.padding(.vertical, 40)
}
private var arcInputBar: some View {
HStack(spacing: 12) {
TextField("Describe a movie concept...", text: $conceptText)
.textFieldStyle(.roundedBorder)
.onSubmit { generateArc() }
Button {
generateArc()
} label: {
Image(systemName: "theatermasks")
.font(.title2)
}
.disabled(conceptText.isEmpty || engine.isGenerating)
}
.padding()
.background(.bar)
}
private func generateArc() {
let concept = conceptText.trimmingCharacters(
in: .whitespacesAndNewlines
)
guard !concept.isEmpty else { return }
conceptText = ""
Task {
await engine.generateStoryArc(from: concept)
}
}
}
Create the card component at Views/StoryArcCard.swift:
import SwiftUI
struct StoryArcCard: View {
let arc: StoryArc
var body: some View {
VStack(alignment: .leading, spacing: 16) {
// Title
HStack {
Image(systemName: "film")
.foregroundStyle(.purple)
Text(arc.movieTitle)
.font(.title3.bold())
}
Divider()
// Story beats
storyBeat(
number: 1,
label: "Setting",
icon: "globe",
text: arc.setting
)
storyBeat(
number: 2,
label: "Inciting Incident",
icon: "bolt.fill",
text: arc.incitingIncident
)
storyBeat(
number: 3,
label: "Central Conflict",
icon: "flame.fill",
text: arc.centralConflict
)
storyBeat(
number: 4,
label: "Darkest Moment",
icon: "cloud.bolt.rain.fill",
text: arc.darkestMoment
)
storyBeat(
number: 5,
label: "Resolution",
icon: "sun.max.fill",
text: arc.resolution
)
Divider()
// Theme
HStack(alignment: .top, spacing: 8) {
Image(systemName: "heart.text.clipboard")
.foregroundStyle(.purple)
VStack(alignment: .leading, spacing: 4) {
Text("Theme")
.font(.caption)
.foregroundStyle(.secondary)
Text(arc.theme)
.font(.callout.italic())
}
}
}
.padding()
.background(Color(.systemBackground))
.clipShape(RoundedRectangle(cornerRadius: 16))
.shadow(color: .black.opacity(0.1), radius: 8, y: 4)
}
private func storyBeat(
number: Int,
label: String,
icon: String,
text: String
) -> some View {
HStack(alignment: .top, spacing: 12) {
ZStack {
Circle()
.fill(Color.purple.opacity(0.15))
.frame(width: 32, height: 32)
Text("\(number)")
.font(.caption.bold())
.foregroundStyle(.purple)
}
VStack(alignment: .leading, spacing: 4) {
HStack(spacing: 4) {
Image(systemName: icon)
.font(.caption)
.foregroundStyle(.purple)
Text(label)
.font(.caption)
.foregroundStyle(.secondary)
}
Text(text)
.font(.callout)
}
}
}
}
Checkpoint: Build and run the app. Navigate to the Story Arc tab. Enter “A world where forgotten toys from different eras band together to find their way back to children who still believe in imagination” and tap the masks button. After several seconds, a complete story arc card should appear with a movie title, setting, five story beats from inciting incident through resolution, and a theme. Each beat should read like a genuine Pixar screenplay treatment — with the emotional depth of Inside Out and the adventure of Up.
Step 10: Polish and Final Integration
Let’s add some finishing touches: an error banner, the ability to reset conversations, and tappable suggestion chips.
First, update PlotAssistantView.swift to make the suggestion chips tappable and add error display:
// Replace the SuggestionChip usages in PlotAssistantView's welcomePrompt
private var welcomePrompt: some View {
VStack(spacing: 16) {
Image(systemName: "sparkles")
.font(.system(size: 48))
.foregroundStyle(.purple)
Text("Welcome to Pixar Plot Assistant!")
.font(.title2.bold())
Text("Ask me to brainstorm movie plots, create characters, or develop story arcs. Try something like:")
.font(.subheadline)
.foregroundStyle(.secondary)
.multilineTextAlignment(.center)
VStack(alignment: .leading, spacing: 8) {
tappableSuggestion("What if toys could talk?")
tappableSuggestion("Create a villain who's actually sympathetic")
tappableSuggestion("Pitch a Pixar movie set inside a library")
}
}
.padding(.vertical, 40)
}
private func tappableSuggestion(_ text: String) -> some View {
Button {
inputText = text
sendMessage()
} label: {
SuggestionChip(text)
}
.buttonStyle(.plain)
}
Now add a reset capability and error display. Update the chatView property:
// Replace chatView in PlotAssistantView
private var chatView: some View {
NavigationStack {
VStack(spacing: 0) {
if let error = engine.errorMessage {
errorBanner(error)
}
messageList
inputBar
}
.navigationTitle("Pixar Plot Assistant")
.navigationBarTitleDisplayMode(.inline)
.toolbar {
ToolbarItem(placement: .topBarTrailing) {
Button {
engine.resetConversation()
} label: {
Image(systemName: "arrow.counterclockwise")
}
}
}
}
}
private func errorBanner(_ message: String) -> some View {
HStack {
Image(systemName: "exclamationmark.triangle.fill")
.foregroundStyle(.yellow)
Text(message)
.font(.caption)
Spacer()
Button("Dismiss") {
engine.errorMessage = nil
}
.font(.caption.bold())
}
.padding(10)
.background(Color.red.opacity(0.1))
}
Add the resetConversation method to Engine/ChatEngine.swift:
// Add this method inside ChatEngine
func resetConversation() {
messages.removeAll()
errorMessage = nil
session = LanguageModelSession(
instructions: """
You are the Pixar Plot Assistant, a creative writing partner \
who specializes in brainstorming Pixar-style animated movie \
plots. You love heartfelt stories about unlikely friendships, \
hidden worlds, and characters who grow by facing their fears. \
Keep your responses fun, imaginative, and family-friendly. \
When describing characters, think about what makes Pixar \
characters iconic: clear motivations, endearing flaws, and \
memorable catchphrases. Reference real Pixar films when it \
helps illustrate a point.
"""
)
}
Creating a new LanguageModelSession clears the conversation history. The model starts fresh with no memory of previous
messages — like starting a brand new brainstorming session with your co-writer.
Finally, let’s add a model availability check at app launch. Update Engine/ChatEngine.swift to include:
// Add this property and method to ChatEngine
var isModelAvailable = false
func checkAvailability() async {
// Availability is exposed through SystemLanguageModel, the model that
// backs every LanguageModelSession.
switch SystemLanguageModel.default.availability {
case .available:
isModelAvailable = true
case .unavailable(.appleIntelligenceNotEnabled):
isModelAvailable = false
errorMessage = "On-device AI requires Apple Intelligence. Enable it in Settings to continue."
case .unavailable(.deviceNotEligible):
isModelAvailable = false
errorMessage = "This device does not support Apple Intelligence."
case .unavailable(.modelNotReady):
isModelAvailable = false
errorMessage = "The on-device model is still downloading. Try again in a moment."
case .unavailable(_):
isModelAvailable = false
errorMessage = "On-device AI is currently unavailable."
}
}
Update PlotAssistantView.swift to check availability on appear:
// Add this modifier to the TabView in PlotAssistantView's body
var body: some View {
TabView(selection: $selectedTab) {
Tab("Chat", systemImage: "bubble.left.and.bubble.right", value: 0) {
chatView
}
Tab("Characters", systemImage: "person.2", value: 1) {
CharacterProfileView(engine: engine)
}
Tab("Story Arc", systemImage: "book", value: 2) {
StoryArcView(engine: engine)
}
}
.task {
await engine.checkAvailability()
}
}
Checkpoint: Build and run the complete app. Verify all three tabs work:
- Chat tab: Tap “What if toys could talk?” — the suggestion chip should immediately send the message and you should see a streaming response about a Pixar-like toy story concept. Tap the reset button (counterclockwise arrow) to clear the conversation.
- Characters tab: Enter “A grumpy old flashlight in a hardware store who secretly dreams of being a lighthouse” — you should get a structured character card with name, species, personality, motivation, flaw, and catchphrase.
- Story Arc tab: Enter “An undersea world where coral reefs are actually living cities, and one young polyp must save their reef from an ancient threat” — you should get a full five-beat story arc card with a movie title and theme.
If Apple Intelligence is not available, you should see an error banner explaining the requirement. The app should handle all errors gracefully without crashing, even if generation fails mid-stream.
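If you want friendlier failure messages than error.localizedDescription, you can pattern-match on the framework's generation errors. A sketch, assuming the LanguageModelSession.GenerationError case names; the message strings are this tutorial's own:

```swift
import FoundationModels

// Sketch: mapping common generation failures to user-facing copy.
// The case names come from LanguageModelSession.GenerationError;
// anything that isn't a generation error falls back to the system
// description.
func friendlyMessage(for error: Error) -> String {
    guard let generationError = error as? LanguageModelSession.GenerationError else {
        return error.localizedDescription
    }
    switch generationError {
    case .exceededContextWindowSize:
        return "This conversation got too long for the model. Tap reset to start fresh."
    case .guardrailViolation:
        return "That request was blocked by the safety guardrails. Try rephrasing."
    default:
        return generationError.localizedDescription
    }
}
```

You could call this from each catch block in ChatEngine before assigning errorMessage.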
Where to Go From Here?
Congratulations! You’ve built Pixar Plot Assistant — an on-device AI app that brainstorms movie plots, generates structured character profiles, and creates complete story arcs, all without sending a single byte to the cloud.
Here’s what you learned:
- How to create and configure a LanguageModelSession with system instructions that shape the model’s behavior
- How to stream responses token-by-token using streamResponse(to:) for a responsive, real-time UI
- How to use @Generable structs with @Guide annotations to extract structured data from the model
- How to implement the Tool protocol to give the model access to your app’s data
- How to combine tools and structured output in a single generation call
- How to check model availability and handle errors gracefully
Ideas for extending this project:
- Persist characters and story arcs using SwiftData so they survive app launches. See our SwiftData tutorial for guidance.
- Add a “Pitch Deck” export that combines a story arc with its characters into a shareable PDF.
- Implement conversation branching — let users fork a conversation to explore multiple plot directions from a single prompt.
- Add more tools — a “Pixar Tropes” tool that suggests common Pixar storytelling devices, or a “Name Generator” tool that invents character names based on personality traits.
- Integrate with App Intents so users can say “Hey Siri, brainstorm a Pixar plot about friendship” and get a response. See Build an App with Siri and App Intents for how.
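For the first idea, a minimal SwiftData model might look like the sketch below. SavedStoryArc and its flat stored properties are hypothetical names that mirror the StoryArc fields rendered by StoryArcCard:

```swift
import Foundation
import SwiftData

// Hypothetical SwiftData model for persisting generated story arcs.
// Flatten the @Generable StoryArc value into this model, then insert
// it into your ModelContext so arcs survive app launches.
@Model
final class SavedStoryArc {
    var movieTitle: String
    var setting: String
    var incitingIncident: String
    var centralConflict: String
    var darkestMoment: String
    var resolution: String
    var theme: String
    var createdAt: Date

    init(from arc: StoryArc, createdAt: Date = .now) {
        self.movieTitle = arc.movieTitle
        self.setting = arc.setting
        self.incitingIncident = arc.incitingIncident
        self.centralConflict = arc.centralConflict
        self.darkestMoment = arc.darkestMoment
        self.resolution = arc.resolution
        self.theme = arc.theme
        self.createdAt = createdAt
    }
}
```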