Build an AI Writing Assistant with Foundation Models and SwiftUI


What if your app could suggest the next Pixar sequel — without ever touching a server? With Apple’s on-device language model, that feature runs entirely in the user’s pocket, no backend required.

In this tutorial, you’ll build Pixar Story Assistant — a complete SwiftUI app that uses Foundation Models to brainstorm sequel ideas with structured output, summarize long story pitches, and rephrase text in three different tones — with all processing happening on-device, so no data ever leaves the user’s device. Along the way, you’ll learn how to stream language model responses token by token, define @Generable schemas for structured output, and manage conversation history within a token budget.

Prerequisites

  • Xcode 26+ with an iOS 26 / macOS 26 deployment target
  • Familiarity with Foundation Models — specifically LanguageModelSession and SystemLanguageModel
  • Familiarity with SwiftUI state management — @State, @Observable, and environment propagation
  • Familiarity with Swift concurrency — async/await, Task, for await … in, and structured concurrency

Note: Foundation Models requires a device or simulator running iOS 26+ / macOS 26+. The on-device model is only available when SystemLanguageModel.default reports .available. This tutorial includes a graceful fallback for unsupported configurations.

Getting Started

Start by creating a new Xcode project:

  1. Open Xcode and select File > New > Project.
  2. Choose the App template under the iOS tab.
  3. Set the product name to PixarStoryAssistant.
  4. Ensure the interface is SwiftUI and the language is Swift.
  5. Click Next, choose a location, and click Create.

Once the project opens, update the deployment target. Select the project root in the Navigator, choose the PixarStoryAssistant target, and set Minimum Deployments to iOS 26.0.

Foundation Models does not require any additional entitlements or Info.plist keys — the framework ships as part of the OS and is available automatically when SystemLanguageModel.default is .available.

Checking Model Availability at Launch

Before invoking any model APIs, you need to verify that the on-device model is ready. Open PixarStoryAssistantApp.swift and replace its contents with the following:

import SwiftUI
import FoundationModels

@main
struct PixarStoryAssistantApp: App {
    var body: some Scene {
        WindowGroup {
            RootView()
        }
    }
}

Now create a new file at Views/RootView.swift and add:

import SwiftUI
import FoundationModels

@available(iOS 26, macOS 26, *)
struct RootView: View {
    var body: some View {
        switch SystemLanguageModel.default.availability {
        case .available:
            ContentView()
        case .unavailable(let reason):
            UnavailableView(reason: reason)
        }
    }
}

The SystemLanguageModel type exposes an availability property that returns either .available or .unavailable(reason:). The reason value is an enum, not a display string: its cases describe why the model can’t be used, such as .deviceNotEligible (the hardware doesn’t support Apple Intelligence), .appleIntelligenceNotEnabled (the user hasn’t turned the feature on), and .modelNotReady (the model assets are still downloading). You map each case to a user-facing message yourself.

Create Views/UnavailableView.swift with a simple fallback screen:

import SwiftUI
import FoundationModels

@available(iOS 26, macOS 26, *)
struct UnavailableView: View {
    let reason: SystemLanguageModel.Availability.UnavailabilityReason

    var body: some View {
        ContentUnavailableView(
            "On-Device AI Unavailable",
            systemImage: "brain.slash",
            description: Text(message)
        )
    }

    private var message: String {
        switch reason {
        case .deviceNotEligible:
            return "This device doesn't support Apple Intelligence."
        case .appleIntelligenceNotEnabled:
            return "Turn on Apple Intelligence in Settings to use this feature."
        case .modelNotReady:
            return "The on-device model is still downloading. Try again in a bit."
        @unknown default:
            return "On-device AI isn't available right now."
        }
    }
}

Checkpoint: Build and run on an iOS 26 simulator. If the model is available, you’ll briefly see a blank ContentView. If unavailable, you’ll see the ContentUnavailableView fallback. Either result confirms the availability check is wired up correctly.

Step 1: The Assistant Manager

The PixarAssistantManager is the single source of truth for all AI interactions. It owns the LanguageModelSession, drives streaming responses, and exposes the three features — brainstorm, summarize, and rephrase — as async methods.

Create a new file at Models/PixarAssistantManager.swift:

import Foundation
import Observation
import FoundationModels

@available(iOS 26, macOS 26, *)
@Observable
@MainActor
final class PixarAssistantManager {

    // MARK: - Published state

    /// The text currently being streamed from the model.
    private(set) var streamingResponse: String = ""

    /// Whether the manager is actively waiting for or streaming a response.
    private(set) var isGenerating: Bool = false

    /// The most recent error, if any.
    private(set) var lastError: LanguageModelError?

    // MARK: - Session management

    /// The active personality mode — controls the system prompt.
    var personalityMode: PersonalityMode = .neutral {
        didSet { rebuildSession() }
    }

    private var session: LanguageModelSession

    init() {
        session = Self.makeSession(for: .neutral)
    }

    // MARK: - Session factory

    private static func makeSession(
        for mode: PersonalityMode
    ) -> LanguageModelSession {
        LanguageModelSession(
            model: .default,
            instructions: mode.systemPrompt
        )
    }

    private func rebuildSession() {
        session = Self.makeSession(for: personalityMode)
    }
}

LanguageModelSession represents a single conversation with the on-device model. The instructions parameter sets the system prompt — a string the model uses to shape its personality and behavior. You’ll revisit this in Step 6 when you build the personality mode picker.

Note: @Observable and @MainActor together ensure that all property mutations drive SwiftUI updates on the main thread without any additional DispatchQueue.main.async calls. This is the Swift 6 concurrency-safe pattern for view models.
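Before wiring the session into the manager’s streaming machinery, it helps to see the API in isolation. Here is a minimal one-shot exchange — a sketch, where the instructions, prompt text, and function name are illustrative and not part of the app:

```swift
import FoundationModels

// Minimal sketch: a standalone session with custom instructions.
// Assumes the on-device model is available; in the app, RootView
// verifies availability before any session is created.
func quickDemo() async throws {
    let session = LanguageModelSession(
        instructions: "You are a terse film-history assistant."
    )
    // respond(to:) suspends until the complete answer is ready,
    // unlike streamResponse(to:), which you'll use next.
    let response = try await session.respond(to: "Name one Pixar film from 1995.")
    print(response.content)
}
```

The same session can be reused for follow-up prompts; it accumulates the conversation transcript, which is why the manager rebuilds it when the system prompt changes.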

Defining the Personality Mode

Add the PersonalityMode enum in the same file, below the class:

@available(iOS 26, macOS 26, *)
enum PersonalityMode: String, CaseIterable, Identifiable {
    case neutral = "Pixar Story Assistant"
    case woody   = "Woody"
    case buzz    = "Buzz Lightyear"
    case dory    = "Dory"

    var id: String { rawValue }

    var systemPrompt: String {
        switch self {
        case .neutral:
            return """
            You are a creative assistant for Pixar story development. \
            You help brainstorm sequels, improve story pitches, and develop \
            characters. Be imaginative, concise, and inspiring.
            """
        case .woody:
            return """
            You are a creative assistant who speaks like Woody from Toy Story \
            — warm, enthusiastic, and with a cowboy spirit. Use occasional \
            western phrases. Keep ideas grounded in heart and friendship.
            """
        case .buzz:
            return """
            You are a creative assistant who speaks like Buzz Lightyear — \
            precise, mission-focused, and confident. Use space ranger \
            terminology. Structure ideas like tactical briefings.
            """
        case .dory:
            return """
            You are a creative assistant who speaks like Dory from Finding \
            Nemo — enthusiastic, optimistic, and prone to delightful tangents. \
            Circle back to the main point after each tangent.
            """
        }
    }
}

Adding the Three Feature Methods

Now add the three core async methods. Still inside PixarAssistantManager, append the following after the rebuildSession helper:

// MARK: - Feature: Free-form streaming

/// Sends a prompt and streams the response into `streamingResponse`.
func send(_ prompt: String) async {
    guard !isGenerating else { return }
    isGenerating = true
    streamingResponse = ""
    lastError = nil

    do {
        let stream = session.streamResponse(to: prompt)
        for try await partialResponse in stream {
            // .content holds the full accumulated text so far
            streamingResponse = partialResponse.content
        }
    } catch let error as LanguageModelError {
        lastError = error
    } catch {
        // Surface unexpected errors as generation failures
        streamingResponse = "Something went wrong. Please try again."
    }

    isGenerating = false
}

// MARK: - Feature: Summarize

/// Asks the model to distill `text` to two sentences.
func summarize(_ text: String) async {
    let prompt = """
    Summarize the following Pixar story pitch in exactly two sentences. \
    Capture the emotional core and the central conflict.

    Story pitch:
    \(text)
    """
    await send(prompt)
}

// MARK: - Feature: Rephrase

/// Rephrases `text` according to the chosen `RephraseMode`.
func rephrase(_ text: String, mode: RephraseMode) async {
    let prompt = """
    Rephrase the following text in \(mode.instruction). \
    Return only the rephrased text with no extra commentary.

    Original:
    \(text)
    """
    await send(prompt)
}

You will add the RephraseMode enum in Step 5 and the LanguageModelError UI extension in Step 8. For now the file will have a compiler error about the missing type — that is expected.

Step 2: The Chat Interface

With the manager in place, you can build the main chat UI. The interface shows a scrolling list of messages, a text input field, and a toolbar containing the feature action buttons.

Defining the Message Model

Create Models/AssistantMessage.swift:

import Foundation

struct AssistantMessage: Identifiable {
    let id = UUID()
    let role: Role
    let text: String

    enum Role {
        case user
        case assistant
    }
}

Building the Message Bubble

Create Views/MessageBubble.swift:

import SwiftUI

struct MessageBubble: View {
    let message: AssistantMessage

    var body: some View {
        HStack {
            if message.role == .user { Spacer(minLength: 60) }

            Text(message.text)
                .padding(.horizontal, 14)
                .padding(.vertical, 10)
                .background(bubbleColor)
                .foregroundStyle(foregroundColor)
                .clipShape(RoundedRectangle(cornerRadius: 18))

            if message.role == .assistant { Spacer(minLength: 60) }
        }
    }

    private var bubbleColor: Color {
        message.role == .user ? Color.accentColor : Color(.secondarySystemBackground)
    }

    private var foregroundColor: Color {
        message.role == .user ? .white : .primary
    }
}

Building the Chat View

Create Views/ContentView.swift (replacing the generated file):

import SwiftUI
import FoundationModels

@available(iOS 26, macOS 26, *)
struct ContentView: View {

    @State private var manager = PixarAssistantManager()
    @State private var messages: [AssistantMessage] = []
    @State private var inputText: String = ""
    @State private var scrollProxy: ScrollViewProxy? = nil

    var body: some View {
        NavigationStack {
            VStack(spacing: 0) {
                messageList
                Divider()
                inputBar
            }
            .navigationTitle(manager.personalityMode.rawValue)
            .navigationBarTitleDisplayMode(.inline)
            .toolbar { toolbarContent }
        }
    }

    // MARK: - Message list

    private var messageList: some View {
        ScrollViewReader { proxy in
            ScrollView {
                LazyVStack(spacing: 8) {
                    ForEach(messages) { message in
                        MessageBubble(message: message)
                            .padding(.horizontal)
                            .id(message.id)
                    }
                    // Streaming bubble while generating
                    if manager.isGenerating {
                        streamingBubble
                            .padding(.horizontal)
                            .id("streaming")
                    }
                }
                .padding(.vertical, 12)
            }
            .onAppear { scrollProxy = proxy }
            .onChange(of: manager.streamingResponse) {
                withAnimation { proxy.scrollTo("streaming", anchor: .bottom) }
            }
            .onChange(of: messages.count) {
                if let last = messages.last {
                    withAnimation { proxy.scrollTo(last.id, anchor: .bottom) }
                }
            }
        }
    }

    private var streamingBubble: some View {
        HStack {
            Text(manager.streamingResponse.isEmpty ? "..." : manager.streamingResponse)
                .padding(.horizontal, 14)
                .padding(.vertical, 10)
                .background(Color(.secondarySystemBackground))
                .clipShape(RoundedRectangle(cornerRadius: 18))
            Spacer(minLength: 60)
        }
    }

    // MARK: - Input bar

    private var inputBar: some View {
        HStack(spacing: 10) {
            TextField("Ask about Pixar stories…", text: $inputText, axis: .vertical)
                .lineLimit(1...5)
                .textFieldStyle(.plain)
                .padding(.horizontal, 12)
                .padding(.vertical, 8)
                .background(Color(.secondarySystemBackground))
                .clipShape(RoundedRectangle(cornerRadius: 20))

            Button {
                submitMessage()
            } label: {
                Image(systemName: "arrow.up.circle.fill")
                    .font(.title2)
                    .foregroundStyle(inputText.trimmingCharacters(in: .whitespaces).isEmpty
                                     ? Color.secondary : Color.accentColor)
            }
            .disabled(inputText.trimmingCharacters(in: .whitespaces).isEmpty
                      || manager.isGenerating)
        }
        .padding(.horizontal)
        .padding(.vertical, 10)
    }

    // MARK: - Toolbar

    @ToolbarContentBuilder
    private var toolbarContent: some ToolbarContent {
        ToolbarItem(placement: .topBarLeading) {
            Menu {
                ForEach(PersonalityMode.allCases) { mode in
                    Button(mode.rawValue) {
                        manager.personalityMode = mode
                    }
                }
            } label: {
                Image(systemName: "person.crop.circle")
            }
        }
        ToolbarItem(placement: .topBarTrailing) {
            Button("Clear") {
                messages = []
            }
            .disabled(messages.isEmpty)
        }
    }

    // MARK: - Actions

    private func submitMessage() {
        let trimmed = inputText.trimmingCharacters(in: .whitespaces)
        guard !trimmed.isEmpty else { return }

        let userMessage = AssistantMessage(role: .user, text: trimmed)
        messages.append(userMessage)
        inputText = ""

        Task {
            await manager.send(trimmed)
            let reply = AssistantMessage(role: .assistant, text: manager.streamingResponse)
            messages.append(reply)
        }
    }
}

The key streaming mechanism is in the messageList view. The .onChange(of: manager.streamingResponse) modifier fires every time the manager receives a new partial response (each containing the full accumulated text so far), and proxy.scrollTo("streaming") keeps the viewport pinned to the latest text. Once generation finishes, isGenerating flips to false, the streaming bubble disappears, and the completed message is appended to messages.

Checkpoint: Build and run. You should see the Pixar Story Assistant chat screen. Type a message like “What could happen in a Toy Story 5?” and tap the send button. You should see the response stream in token by token inside the assistant bubble, with the view automatically scrolling to keep the latest text visible.

Step 3: Brainstorm Feature with Structured Output

Free-form chat is useful, but the real power of Foundation Models is structured output. By annotating a Swift type with @Generable, you can ask the model to produce JSON that maps directly to your type — no parsing required.

Defining the Output Schema

Create Models/SequelIdeas.swift:

import FoundationModels

@available(iOS 26, macOS 26, *)
@Generable
struct SequelIdeas {

    @Guide(description: "Three distinct sequel concepts for the requested Pixar movie")
    var concepts: [String]

    @Guide(description: "The core emotional themes explored across all three concepts")
    var themes: [String]

    @Guide(description: "A single suggested working title for the most compelling concept")
    var suggestedTitle: String
}

@Generable synthesizes a generation schema from your Swift type at compile time. The @Guide attribute attaches a natural-language description to each property, giving the model context about what each field should contain. The model uses both the property name and the description to fill in values — treat descriptions as instructions, not documentation.
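Descriptions can also be paired with programmatic constraints. As a sketch — the .count(_:) generation guide and the Logline type below are assumptions about the GenerationGuide API, so verify the exact guide names against the framework documentation:

```swift
import FoundationModels

// Sketch: @Guide can combine a natural-language description with a
// structural constraint. Here .count(3) (an assumed GenerationGuide)
// would force the array to contain exactly three elements, so the UI
// never has to handle two or four beats.
@available(iOS 26, macOS 26, *)
@Generable
struct Logline {
    @Guide(description: "The protagonist of the sequel")
    var hero: String

    @Guide(description: "The three major plot beats, in order", .count(3))
    var beats: [String]
}
```

Constraints like this move validation out of your code and into decoding itself, which is usually more reliable than asking for "exactly three" in the prompt text.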

Adding the Brainstorm Method to the Manager

Open Models/PixarAssistantManager.swift and add the following method after the rephrase method:

// MARK: - Feature: Brainstorm (structured output)

/// Generates structured sequel ideas for the given Pixar movie title.
/// Returns nil if generation fails or the model is unavailable.
func brainstorm(movie: String) async -> SequelIdeas? {
    guard !isGenerating else { return nil }
    isGenerating = true
    lastError = nil

    defer { isGenerating = false }

    let prompt = """
    Generate three creative sequel ideas for the Pixar movie "\(movie)". \
    Consider what new emotional journey the characters could take, \
    what new settings could be explored, and what universal theme \
    would resonate with audiences of all ages.
    """

    do {
        let response = try await session.respond(
            to: prompt,
            generating: SequelIdeas.self
        )
        return response.content
    } catch let error as LanguageModelError {
        lastError = error
        return nil
    } catch {
        // Non-LanguageModelError failures (e.g., task cancellation)
        return nil
    }
}

Notice the difference from the streaming send method: session.respond(to:generating:) returns the fully materialized SequelIdeas value rather than a stream of strings. Under the hood, Foundation Models uses guided (constrained) decoding so the model’s output always conforms to the schema derived from @Generable, then decodes it into your Swift type for you — no parsing required.
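Structured output can also be streamed. A sketch of what that could look like in the manager — it assumes streamResponse(to:generating:) yields partially generated snapshots whose properties are optional until the model fills them in, so check the exact shape against the framework documentation:

```swift
// Sketch: streaming a @Generable type. Each element of the stream is
// assumed to be a SequelIdeas.PartiallyGenerated snapshot, where every
// property is optional and becomes non-nil as decoding progresses.
func streamBrainstorm(movie: String) async throws {
    let stream = session.streamResponse(
        to: "Generate three creative sequel ideas for \"\(movie)\".",
        generating: SequelIdeas.self
    )
    for try await partial in stream {
        // Surface fields as soon as the model produces them, rather
        // than waiting for the whole struct to finish.
        if let title = partial.suggestedTitle {
            print("Working title so far: \(title)")
        }
    }
}
```

For this tutorial the blocking respond(to:generating:) call keeps the brainstorm sheet simple, but partial snapshots are worth reaching for when the generated struct is large.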

Building the Brainstorm Sheet

Create Views/BrainstormView.swift:

import SwiftUI
import FoundationModels

@available(iOS 26, macOS 26, *)
struct BrainstormView: View {

    @Environment(\.dismiss) private var dismiss
    let manager: PixarAssistantManager

    @State private var movieTitle: String = ""
    @State private var ideas: SequelIdeas? = nil

    var body: some View {
        NavigationStack {
            Form {
                Section("Which Pixar movie needs a sequel?") {
                    TextField("e.g. Coco, Brave, Up…", text: $movieTitle)
                }

                if manager.isGenerating {
                    Section {
                        HStack {
                            ProgressView()
                            Text("Generating ideas…")
                                .foregroundStyle(.secondary)
                        }
                    }
                }

                if let ideas {
                    conceptsSection(ideas)
                    themesSection(ideas)
                    titleSection(ideas)
                }
            }
            .navigationTitle("Brainstorm Sequel")
            .navigationBarTitleDisplayMode(.inline)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button("Cancel") { dismiss() }
                }
                ToolbarItem(placement: .confirmationAction) {
                    Button("Generate") {
                        Task {
                            ideas = await manager.brainstorm(movie: movieTitle)
                        }
                    }
                    .disabled(movieTitle.trimmingCharacters(in: .whitespaces).isEmpty
                              || manager.isGenerating)
                }
            }
        }
    }

    private func conceptsSection(_ ideas: SequelIdeas) -> some View {
        Section("Sequel Concepts") {
            ForEach(Array(ideas.concepts.enumerated()), id: \.offset) { index, concept in
                VStack(alignment: .leading, spacing: 4) {
                    Text("Concept \(index + 1)")
                        .font(.caption)
                        .foregroundStyle(.secondary)
                    Text(concept)
                }
                .padding(.vertical, 4)
            }
        }
    }

    private func themesSection(_ ideas: SequelIdeas) -> some View {
        Section("Core Themes") {
            ForEach(ideas.themes, id: \.self) { theme in
                Label(theme, systemImage: "heart.fill")
                    .foregroundStyle(.pink)
            }
        }
    }

    private func titleSection(_ ideas: SequelIdeas) -> some View {
        Section("Suggested Title") {
            Text(ideas.suggestedTitle)
                .font(.headline)
        }
    }
}

Wiring the Brainstorm Sheet into ContentView

Open Views/ContentView.swift. Add a @State private var showingBrainstorm = false property, then add a toolbar button and sheet modifier:

// In the toolbarContent computed property, add a new ToolbarItem:
ToolbarItem(placement: .topBarTrailing) {
    Button {
        showingBrainstorm = true
    } label: {
        Image(systemName: "sparkles")
    }
}

Then attach the sheet to the outer NavigationStack:

.sheet(isPresented: $showingBrainstorm) {
    BrainstormView(manager: manager)
}

Checkpoint: Build and run. Tap the sparkles toolbar button to open the brainstorm sheet. Type “Coco” and tap Generate. After a few seconds you should see three distinct sequel concepts listed in the form, a list of core themes, and a suggested working title — all structured and populated by the on-device model without a network request.

Step 4: Summarize Feature

Long story pitches are hard to evaluate at a glance. The summarize feature lets users paste a multi-paragraph pitch and get a two-sentence distillation in return.

Building the Summarize Sheet

Create Views/SummarizeView.swift:

import SwiftUI
import FoundationModels

@available(iOS 26, macOS 26, *)
struct SummarizeView: View {

    @Environment(\.dismiss) private var dismiss
    let manager: PixarAssistantManager

    @State private var pitchText: String = ""
    @State private var summary: String = ""

    private var characterCount: Int { pitchText.count }
    private var summaryCharacterCount: Int { summary.count }

    var body: some View {
        NavigationStack {
            Form {
                Section {
                    TextEditor(text: $pitchText)
                        .frame(minHeight: 140)
                } header: {
                    Text("Your Story Pitch")
                } footer: {
                    Text("\(characterCount) characters")
                        .foregroundStyle(.secondary)
                }

                if manager.isGenerating {
                    Section("Summary") {
                        streamingPreview
                    }
                } else if !summary.isEmpty {
                    summarySection
                }
            }
            .navigationTitle("Summarize Pitch")
            .navigationBarTitleDisplayMode(.inline)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button("Cancel") { dismiss() }
                }
                ToolbarItem(placement: .confirmationAction) {
                    Button("Summarize") {
                        summary = ""
                        Task {
                            await manager.summarize(pitchText)
                            summary = manager.streamingResponse
                        }
                    }
                    .disabled(pitchText.trimmingCharacters(in: .whitespaces).isEmpty
                              || manager.isGenerating)
                }
            }
        }
    }

    private var streamingPreview: some View {
        Text(manager.streamingResponse.isEmpty ? "Thinking…" : manager.streamingResponse)
            .foregroundStyle(.secondary)
            .animation(.easeIn, value: manager.streamingResponse)
    }

    private var summarySection: some View {
        Section {
            Text(summary)
                .padding(.vertical, 4)
        } header: {
            Text("Summary")
        } footer: {
            HStack {
                Text("Reduced from \(characterCount) to \(summaryCharacterCount) characters")
                Spacer()
                let reduction = characterCount > 0
                    ? Int((1 - Double(summaryCharacterCount) / Double(characterCount)) * 100)
                    : 0
                Text("\(reduction)% shorter")
                    .foregroundStyle(.green)
                    .bold()
            }
        }
    }
}

The summary view streams the response into the form section in real time using manager.streamingResponse, then commits the final text to the local summary state when generation finishes. The footer shows the character reduction as a percentage, giving users a satisfying sense of how much the model compressed their pitch.

Wire the sheet into ContentView by adding another toolbar button:

// Add to toolbarContent
ToolbarItem(placement: .topBarTrailing) {
    Button {
        showingSummarize = true
    } label: {
        Image(systemName: "doc.text.magnifyingglass")
    }
}

Add the corresponding state property and sheet modifier:

@State private var showingSummarize = false

// In the body, alongside the brainstorm sheet:
.sheet(isPresented: $showingSummarize) {
    SummarizeView(manager: manager)
}

Tip: The summarize method in PixarAssistantManager builds the prompt for you — you never need to pass raw model instructions from the view. Keeping prompt engineering in the manager makes it easy to iterate on prompt wording without touching the UI layer.

Step 5: Rephrase Feature

The rephrase feature offers three transformation modes: more formal for a professional pitch document, more exciting for a movie-trailer-style logline, and simpler for a child-friendly retelling.

Defining Rephrase Modes

Create Models/RephraseMode.swift:

import Foundation

enum RephraseMode: String, CaseIterable, Identifiable {
    case formal   = "More Formal"
    case exciting = "More Exciting"
    case simpler  = "Simpler"

    var id: String { rawValue }

    var instruction: String {
        switch self {
        case .formal:
            return "a professional, polished tone suitable for a studio pitch document"
        case .exciting:
            return "dramatic movie-trailer language — high energy, punchy sentences, evocative imagery"
        case .simpler:
            return "simple, friendly language a child aged 7–10 could easily understand"
        }
    }

    var icon: String {
        switch self {
        case .formal:   return "briefcase.fill"
        case .exciting: return "flame.fill"
        case .simpler:  return "star.fill"
        }
    }
}

Building the Rephrase Sheet

Create Views/RephraseView.swift:

import SwiftUI
import FoundationModels

@available(iOS 26, macOS 26, *)
struct RephraseView: View {

    @Environment(\.dismiss) private var dismiss
    let manager: PixarAssistantManager

    @State private var originalText: String = ""
    @State private var rephrasedText: String = ""
    @State private var selectedMode: RephraseMode = .exciting

    var body: some View {
        NavigationStack {
            Form {
                Section("Original Text") {
                    TextEditor(text: $originalText)
                        .frame(minHeight: 100)
                }

                Section("Tone") {
                    Picker("Rephrase Mode", selection: $selectedMode) {
                        ForEach(RephraseMode.allCases) { mode in
                            Label(mode.rawValue, systemImage: mode.icon).tag(mode)
                        }
                    }
                    .pickerStyle(.segmented)
                    .labelsHidden()
                }

                if manager.isGenerating {
                    Section("Rephrased") {
                        Text(manager.streamingResponse.isEmpty
                             ? "Rephrasing…"
                             : manager.streamingResponse)
                            .foregroundStyle(.secondary)
                    }
                } else if !rephrasedText.isEmpty {
                    comparisonSection
                }
            }
            .navigationTitle("Rephrase Text")
            .navigationBarTitleDisplayMode(.inline)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button("Cancel") { dismiss() }
                }
                ToolbarItem(placement: .confirmationAction) {
                    Button("Rephrase") {
                        rephrasedText = ""
                        Task {
                            await manager.rephrase(originalText, mode: selectedMode)
                            rephrasedText = manager.streamingResponse
                        }
                    }
                    .disabled(originalText.trimmingCharacters(in: .whitespaces).isEmpty
                              || manager.isGenerating)
                }
            }
        }
    }

    private var comparisonSection: some View {
        Section {
            VStack(alignment: .leading, spacing: 12) {
                Group {
                    Text("Original")
                        .font(.caption.bold())
                        .foregroundStyle(.secondary)
                    Text(originalText)
                        .padding(10)
                        .background(Color(.tertiarySystemBackground))
                        .clipShape(RoundedRectangle(cornerRadius: 8))
                }
                Group {
                    Text("Rephrased (\(selectedMode.rawValue))")
                        .font(.caption.bold())
                        .foregroundStyle(.secondary)
                    Text(rephrasedText)
                        .padding(10)
                        .background(Color.accentColor.opacity(0.1))
                        .clipShape(RoundedRectangle(cornerRadius: 8))
                }
            }
            .padding(.vertical, 4)
        } header: {
            Text("Comparison")
        }
    }
}

Wire the sheet into ContentView with the same pattern as before — a toolbar button, a @State boolean, and a .sheet modifier.

Checkpoint: Build and run. Open the Rephrase sheet. Enter the text “Woody misses Buzz and decides to go find him.” Select “More Exciting” and tap Rephrase. You should see something like “In a world turned upside down, one cowboy refuses to give up — and will cross every galaxy to bring his best friend home.” The original and rephrased text appear side-by-side in the comparison section.

Step 6: Custom Personality Modes

The personality mode picker was wired into the manager back in Step 1, but the UI needs a dedicated settings view so users can understand what each mode offers before selecting it.

Building the Personality Picker Sheet

Create Views/PersonalityPickerView.swift:

import SwiftUI
import FoundationModels

@available(iOS 26, macOS 26, *)
struct PersonalityPickerView: View {

    @Environment(\.dismiss) private var dismiss
    @Binding var selectedMode: PersonalityMode

    var body: some View {
        NavigationStack {
            List(PersonalityMode.allCases) { mode in
                PersonalityRow(mode: mode, isSelected: mode == selectedMode) {
                    selectedMode = mode
                    dismiss()
                }
            }
            .navigationTitle("Assistant Voice")
            .navigationBarTitleDisplayMode(.inline)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button("Cancel") { dismiss() }
                }
            }
        }
    }
}

@available(iOS 26, macOS 26, *)
private struct PersonalityRow: View {
    let mode: PersonalityMode
    let isSelected: Bool
    let onSelect: () -> Void

    var body: some View {
        Button(action: onSelect) {
            HStack(alignment: .top, spacing: 14) {
                Image(systemName: isSelected ? "checkmark.circle.fill" : "circle")
                    .foregroundStyle(isSelected ? Color.accentColor : .secondary)
                    .font(.title3)

                VStack(alignment: .leading, spacing: 4) {
                    Text(mode.rawValue)
                        .font(.headline)
                        .foregroundStyle(.primary)
                    Text(mode.voiceDescription)
                        .font(.subheadline)
                        .foregroundStyle(.secondary)
                        .fixedSize(horizontal: false, vertical: true)
                }
            }
            .padding(.vertical, 6)
        }
    }
}

Add a voiceDescription computed property to PersonalityMode in Models/PixarAssistantManager.swift:

var voiceDescription: String {
    switch self {
    case .neutral:
        return "Balanced, creative, and concise. Best for most brainstorming sessions."
    case .woody:
        return "Warm and heart-driven. Great for stories about friendship and belonging."
    case .buzz:
        return "Precise and mission-oriented. Ideal for structuring plot outlines."
    case .dory:
        return "Enthusiastic and tangential. Perfect for unexpected, surprising ideas."
    }
}

The key insight here is that changing manager.personalityMode triggers rebuildSession() in the manager, which creates a new LanguageModelSession with a different system prompt. The next message the user sends will use the new personality, and the conversation history from the old session is discarded — a clean slate with a new voice.
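For reference, the rebuild can be as simple as this sketch — the systemPrompt property name is an assumption about the Step 1 code, so adapt it to whatever your PersonalityMode actually exposes:

```swift
// Sketch of the Step 1 helper this step relies on. Assumes each
// PersonalityMode case exposes a systemPrompt string.
private func rebuildSession() {
    session = LanguageModelSession(
        instructions: personalityMode.systemPrompt
    )
}
```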

Tip: If you want to preserve conversation history across personality switches, you’d need to replay prior messages into the new session manually. Foundation Models does not automatically migrate context between sessions. For this tutorial, a fresh session per personality is the right trade-off.

Update ContentView to present the personality picker as a sheet when the person icon is tapped, passing $manager.personalityMode as a binding.
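A minimal sketch of that wiring, assuming manager is exposed to the view with @Bindable so that $manager.personalityMode is available as a binding:

```swift
// In ContentView. @Bindable is required to derive a binding from an
// @Observable object; the toolbar placement and icon are illustrative.
@Bindable var manager: PixarAssistantManager
@State private var showPersonalityPicker = false

// Inside the .toolbar modifier:
ToolbarItem(placement: .topBarLeading) {
    Button {
        showPersonalityPicker = true
    } label: {
        Image(systemName: "person.crop.circle")
    }
}

// On the NavigationStack:
.sheet(isPresented: $showPersonalityPicker) {
    PersonalityPickerView(selectedMode: $manager.personalityMode)
}
```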

Checkpoint: Build and run. Switch from “Pixar Story Assistant” to “Buzz Lightyear” and ask “What should happen in Incredibles 3?” You should notice a distinctly different tone — structured, authoritative, and space-ranger-inflected — compared to the neutral mode.

Step 7: Conversation History Management

LanguageModelSession automatically maintains conversation history between calls to send. This is powerful, but the on-device model has a finite context window. Long conversations will eventually cause a LanguageModelError.contextWindowExceeded error. You need to detect this condition and offer the user a recovery path.

Tracking Conversation Length

While Foundation Models does not expose an exact token count, you can approximate conversation length by counting user turns. Add a property to PixarAssistantManager to surface this:

/// Number of user messages sent in the current session.
private(set) var messageCount: Int = 0

/// The warning threshold — surface a UI warning above this value.
static let messageWarningThreshold = 15

Update the send method to increment the message count after each response:

func send(_ prompt: String) async {
    guard !isGenerating else { return }
    isGenerating = true
    streamingResponse = ""
    lastError = nil

    do {
        let stream = session.streamResponse(to: prompt)
        for try await partialResponse in stream {
            streamingResponse = partialResponse.content
        }
        messageCount += 1  // ← Track conversation length
    } catch let error as LanguageModelError {
        lastError = error
    } catch {
        streamingResponse = "Something went wrong. Please try again."
    }

    isGenerating = false
}

Apple Docs: LanguageModelSession — Foundation Models

Adding a Token Warning Banner

Open Views/ContentView.swift and add a token warning banner above the message list:

private var tokenWarningBanner: some View {
    Group {
        if manager.messageCount > PixarAssistantManager.messageWarningThreshold {
            HStack(spacing: 8) {
                Image(systemName: "exclamationmark.triangle.fill")
                    .foregroundStyle(.orange)
                Text("Conversation is getting long. Consider clearing to avoid errors.")
                    .font(.caption)
                Spacer()
                Button("Clear") {
                    messages = []
                    manager.clearConversation()
                }
                .font(.caption.bold())
                .foregroundStyle(.orange)
            }
            .padding(.horizontal)
            .padding(.vertical, 8)
            .background(Color.orange.opacity(0.1))
        }
    }
}

Place tokenWarningBanner above messageList in the outer VStack.
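The resulting layout looks roughly like this — inputBar is a placeholder for whatever name your text-field-and-send row uses:

```swift
VStack(spacing: 0) {
    tokenWarningBanner  // hidden until the threshold is crossed
    messageList
    inputBar            // assumed name for the input row
}
```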

Clearing the Conversation

Add the clearConversation method to PixarAssistantManager. Because the session holds the history, clearing it means creating a new session:

/// Resets the conversation by rebuilding the session from scratch.
func clearConversation() {
    rebuildSession()
    messageCount = 0
    streamingResponse = ""
    lastError = nil
}

Limiting History with a Rolling Window

For users who want to continue long sessions without clearing everything, you can implement a rolling window by keeping only the last N user messages. This is a trade-off — the model loses earlier context, but the conversation can continue indefinitely.

Add a configuration constant and a trimming method to the manager:

/// Maximum number of user turns to retain before trimming history.
static let maxHistoryTurns = 10

/// Trims session history to the last `maxHistoryTurns` turns.
/// Call this before sending a new message if `messageCount` is high.
private func trimHistoryIfNeeded() {
    guard messageCount > Self.maxHistoryTurns else { return }
    // Rebuilding the session discards all history — for a production app
    // you would replay the last N messages instead. Simplified for clarity.
    rebuildSession()
    messageCount = 0
}

You can call trimHistoryIfNeeded() at the top of the send method for automatic rolling-window behavior.
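If you do want to carry recent context into the fresh session, one possible approach — entirely an assumption on top of the tutorial's code, not a framework feature — is to track the last few exchanges yourself and return them as a transcript that send can prepend to its next prompt:

```swift
/// Illustrative variant only. `recentTurns` is a hypothetical array the
/// manager would append to after each successful exchange.
private var recentTurns: [(user: String, assistant: String)] = []

/// Rebuilds the session and returns a transcript of the retained turns,
/// or nil if no trimming was needed.
private func trimHistoryReturningContext() -> String? {
    guard messageCount > Self.maxHistoryTurns else { return nil }
    rebuildSession()
    messageCount = 0
    // The fresh session has no memory, so hand back the tail of the
    // conversation for the caller to fold into its next prompt.
    return recentTurns
        .suffix(Self.maxHistoryTurns)
        .map { "User: \($0.user)\nAssistant: \($0.assistant)" }
        .joined(separator: "\n")
}
```

This trades tokens for continuity: the replayed transcript itself consumes context-window budget, so keep maxHistoryTurns small.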

Checkpoint: Build and run. Start a conversation and send 15+ messages. When the message count crosses the warning threshold, the orange banner should appear. Tapping Clear resets the session and removes the banner. The conversation history in the UI is cleared, and the next message starts a fresh context.

Step 8: Polish and Error Handling

The final step brings the app to a production-ready state by handling the full range of LanguageModelError cases, adding a loading skeleton for the model initialization phase, providing a retry button, and checking thermal state before sending long requests.

Handling LanguageModelError

Foundation Models surfaces failures through LanguageModelError. Add a computed property to the error type to produce user-facing messages. Create Models/LanguageModelError+UI.swift:

import FoundationModels

@available(iOS 26, macOS 26, *)
extension LanguageModelError {

    /// A short, user-friendly description of the error.
    var userFacingMessage: String {
        switch self {
        case .modelUnavailable:
            return "The on-device model is currently unavailable. " +
                   "Try again in a moment."
        case .contextWindowExceeded:
            return "The conversation is too long for the model to process. " +
                   "Clear the conversation and try again."
        case .guardrailViolation:
            return "The request could not be completed due to content policy. " +
                   "Try rephrasing your input."
        @unknown default:
            return "Something went wrong. Please try again."
        }
    }
}

Note: LanguageModelError cases may change as the framework evolves. Always handle the default case to future-proof your error handling against new cases added in later OS versions.

Displaying Errors in the Chat View

In ContentView, add an error banner below the token warning banner:

private var errorBanner: some View {
    Group {
        if let error = manager.lastError {
            VStack(spacing: 8) {
                HStack(spacing: 8) {
                    Image(systemName: "xmark.octagon.fill")
                        .foregroundStyle(.red)
                    Text(error.userFacingMessage)
                        .font(.caption)
                    Spacer()
                }
                Button("Retry Last Message") {
                    guard let lastUserMessage = messages.last(where: {
                        $0.role == .user
                    }) else { return }
                    Task {
                        await manager.send(lastUserMessage.text)
                        // Only append a reply if the retry succeeded
                        guard manager.lastError == nil else { return }
                        let reply = AssistantMessage(
                            role: .assistant,
                            text: manager.streamingResponse
                        )
                        messages.append(reply)
                    }
                }
                .font(.caption.bold())
                .foregroundStyle(.red)
            }
            .padding(.horizontal)
            .padding(.vertical, 10)
            .background(Color.red.opacity(0.08))
        }
    }
}

Checking Thermal State Before Long Requests

On-device inference is compute-intensive. Submitting a large request when the device is already throttling can produce slow or degraded responses. Add a thermal check to the send method:

func send(_ prompt: String) async {
    guard !isGenerating else { return }

    // Warn if the device is running hot
    let thermalState = ProcessInfo.processInfo.thermalState
    if thermalState == .critical {
        // Surface a warning rather than submit the request
        streamingResponse = "Device is too hot for on-device inference. " +
                            "Let it cool down and try again."
        return
    }

    isGenerating = true
    // ... rest of the method unchanged
}

ProcessInfo.thermalState returns one of four values: .nominal, .fair, .serious, or .critical. Blocking requests only at .critical is a reasonable threshold — .serious will slow inference but is usually tolerable.
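If you want finer-grained behavior than a single .critical check, a small helper can map each thermal state to a policy. The enum and the messages here are illustrative choices, not framework API:

```swift
import Foundation

/// Maps thermal state to app behavior. Warning at .serious and
/// blocking at .critical mirror the thresholds discussed above.
enum InferencePolicy {
    case proceed
    case proceedWithWarning(String)
    case blocked(String)
}

func inferencePolicy(for state: ProcessInfo.ThermalState) -> InferencePolicy {
    switch state {
    case .nominal, .fair:
        return .proceed
    case .serious:
        return .proceedWithWarning("Responses may be slower than usual.")
    case .critical:
        return .blocked("Device is too hot for on-device inference. " +
                        "Let it cool down and try again.")
    @unknown default:
        return .proceed
    }
}
```

Call inferencePolicy(for: ProcessInfo.processInfo.thermalState) at the top of send and branch on the result instead of the inline if check.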

Empty State on First Launch

When the app first launches there are no messages yet. Add an empty-state view with suggestion chips that displays while messages is empty and no generation is in progress:

private var emptyStateView: some View {
    VStack(spacing: 20) {
        Image(systemName: "sparkles")
            .font(.system(size: 56))
            .foregroundStyle(Color.accentColor)

        Text("Pixar Story Assistant")
            .font(.title2.bold())

        Text("Ask me to brainstorm sequel ideas, summarize a pitch, or rephrase your writing. Everything runs on-device.")
            .font(.subheadline)
            .foregroundStyle(.secondary)
            .multilineTextAlignment(.center)
            .padding(.horizontal, 40)

        VStack(spacing: 10) {
            suggestionChip("What could Coco 2 be about?")
            suggestionChip("Summarize my story pitch")
            suggestionChip("Rephrase this in Buzz's voice")
        }
    }
    .padding()
}

private func suggestionChip(_ text: String) -> some View {
    Button {
        inputText = text
    } label: {
        Text(text)
            .font(.subheadline)
            .padding(.horizontal, 16)
            .padding(.vertical, 8)
            .background(Color(.secondarySystemBackground))
            .clipShape(Capsule())
    }
    .foregroundStyle(.primary)
}

In messageList, show emptyStateView instead of the empty LazyVStack when messages.isEmpty && !manager.isGenerating:

if messages.isEmpty && !manager.isGenerating {
    emptyStateView
        .frame(maxWidth: .infinity, maxHeight: .infinity)
} else {
    // existing LazyVStack implementation
}

Checkpoint: Build and run the complete app. You should see the empty state view with suggestion chips on launch. Tapping a chip populates the text field. Sending a message produces a streaming response. Triggering a context overflow (by filling the conversation) shows the error banner with a retry button. Switching personalities changes the assistant’s tone immediately. The brainstorm sheet produces structured sequel ideas as cards. The summarize and rephrase sheets work end-to-end with streaming output. Congratulations — the Pixar Story Assistant is complete.

Where to Go From Here?

Congratulations! You’ve built Pixar Story Assistant — a fully on-device AI writing assistant that brainstorms structured sequel ideas, summarizes long story pitches, and rephrases writing in three distinct tones, all using Apple’s Foundation Models framework without sending a single byte to an external server.

Here’s what you learned:

  • How to check SystemLanguageModel.default.availability and provide a graceful fallback UI for unsupported devices
  • How to create a LanguageModelSession with a custom system prompt and stream responses token by token using for await partialResponse in session.streamResponse(to:), accessing partialResponse.content for the accumulated text
  • How to define @Generable structs with @Guide annotations and call session.respond(to:generating:) for structured output that maps directly to your Swift types
  • How to manage conversation context by tracking message count, detect approaching limits, and clear the session cleanly
  • How to handle LanguageModelError cases (.modelUnavailable, .contextWindowExceeded, .guardrailViolation) and check ProcessInfo.thermalState before submitting compute-intensive requests
  • How to swap system prompts by rebuilding the session, producing dramatically different response personalities from the same underlying model

Ideas for extending this project:

  • Document import: Use SwiftUI’s fileImporter to let users import a PDF or text file directly into the summarize or rephrase sheets, extracting text with PDFDocument from PDFKit.
  • Export to Files: Use fileExporter to let users save brainstorm results and rephrased text as .txt or .md files to the Files app.
  • iCloud sync for conversation history: Encode your [AssistantMessage] array as JSON and store it in NSUbiquitousKeyValueStore or a CloudKit private database so the conversation history syncs across a user’s devices.
  • Home Screen Widget: Use WidgetKit to display a “Today’s Story Inspiration” widget that shows a randomly selected sequel concept from the user’s last brainstorm session.
  • Siri App Intent: Expose the brainstorm feature as an AppIntent so users can say “Hey Siri, brainstorm a Coco sequel” and get structured ideas returned as a Siri response card.