Swift 6.2 New Features: Default Main Actor, InlineArray, and More


Swift 6.0 introduced strict concurrency checking, and the community response was clear: the safety guarantees were right, but the migration pain was too high. Swift 6.2, shipping with Xcode 26, rewrites the developer experience around concurrency while adding stack-allocated collections, typed notifications, and WebAssembly support. This is the release where Swift’s concurrency story finally becomes approachable.

This post covers the six most impactful features in Swift 6.2 and how they change the way you write production code. We won’t rehash the fundamentals of actors or async/await — if you need a refresher, start with Migrating to Swift 6 and Approachable Concurrency.

The Problem

Swift 6.0’s strict concurrency model was correct but adversarial. A typical SwiftUI app — where the vast majority of code touches UI — required @MainActor annotations on almost everything. Engineers spent more time fixing isolation errors than writing features.

Consider a typical view model before Swift 6.2:

@MainActor // Required, or every UI update is an error
final class MovieListViewModel: ObservableObject {
    @Published private(set) var movies: [Movie] = []

    func loadMovies() async {
        let fetched = await MovieService.fetchAll()
        movies = fetched // Safe — we're on @MainActor
    }

    nonisolated func computeHash(for data: Data) -> String {
        // nonisolated lets any context call this synchronously,
        // without hopping to the main actor
        data.sha256()
    }
}

Every class, every view model, every coordinator needed that @MainActor annotation — or you’d fight a wall of compiler diagnostics. The ratio of boilerplate to intent was too high.

Default Main Actor Isolation

This is the headline change in Swift 6.2, announced during WWDC25 session “What’s New in Swift” (#245). When you enable default main actor isolation, all code defaults to running on the main actor unless you explicitly opt out.

Enable it in your Package.swift via the defaultIsolation setting (SE-0466):

// Package.swift
.executableTarget(
    name: "PixarStudio",
    swiftSettings: [
        .defaultIsolation(MainActor.self)
    ]
)

With this setting enabled, the previous view model becomes dramatically simpler:

// No @MainActor needed — it's the default
final class MovieListViewModel: ObservableObject {
    @Published private(set) var movies: [Movie] = []

    func loadMovies() async {
        let fetched = await MovieService.fetchAll()
        movies = fetched
    }
}

The compiler treats every function, class, and struct as @MainActor-isolated unless you explicitly mark it otherwise. This inverts the annotation burden: instead of annotating the 90% of code that touches UI, you only annotate the 10% that genuinely needs to run off the main thread.

How It Affects Existing Code

Default main actor isolation is an opt-in setting, not a breaking change. Your existing Swift 6.0/6.1 code continues to compile. When you enable it:

  • Functions without explicit isolation become @MainActor by default.
  • Protocols and their conformances inherit main actor isolation.
  • Global variables become main actor-isolated, eliminating a whole class of Sendable errors.
  • Code inside packages you depend on is not affected — the flag applies only to your module.

Warning: Enabling defaultIsolation(MainActor.self) can surface new warnings if you have functions that were previously nonisolated by default and are now being called from a concurrent context. Audit your call sites incrementally — enable the setting on one target at a time.
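Under default isolation, opting a single declaration back out is a one-keyword change. A minimal sketch, assuming the build setting above is enabled (the Checksum type and its djb2-style fold are hypothetical examples):

```swift
// With defaultIsolation(MainActor.self), this struct and its stored
// property are implicitly @MainActor-isolated, no annotation needed.
struct Checksum {
    var lastValue: UInt32 = 0

    // nonisolated opts this pure helper back out, so any context can
    // call it synchronously without hopping to the main actor
    nonisolated static func fold(_ bytes: [UInt8]) -> UInt32 {
        bytes.reduce(UInt32(5381)) { ($0 &* 33) &+ UInt32($1) }
    }
}

let key = Checksum.fold([1, 2, 3]) // safe from any isolation context
```

The pattern scales: isolated state stays unannotated, and only the handful of pure, thread-safe helpers carry a keyword.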

@concurrent for Explicit Background Work

With main actor as the default, you need a way to say “this function should run off the main thread.” That’s exactly what the new @concurrent attribute does.

final class RenderPipeline {
    // Runs on main actor by default (with defaultIsolation enabled)
    func displayFrame(_ frame: Frame) {
        renderView.image = frame.image
    }

    // Explicitly opts out — runs on the cooperative thread pool
    @concurrent
    func renderFrame(
        scene: Scene,
        at resolution: Resolution
    ) async -> Frame {
        let geometry = scene.tessellate(at: resolution)
        let lighting = await computeGlobalIllumination(for: geometry)
        return Frame(image: rasterize(geometry, lighting: lighting))
    }
}

@concurrent is the semantic inverse of @MainActor. Where Swift 6.0 made you annotate main actor code, Swift 6.2 makes you annotate background code. The key insight: most apps have far less background code than UI code, so this dramatically reduces total annotations.

@concurrent vs. nonisolated

These two are related but not identical:

final class AssetManager {
    // nonisolated: inherits caller's execution context
    nonisolated func cacheKey(for asset: Asset) -> String {
        asset.id.uuidString
    }

    // @concurrent: explicitly runs on the cooperative thread pool
    @concurrent
    func processAsset(_ asset: Asset) async -> ProcessedAsset {
        await asset.decompress().normalize()
    }
}

A nonisolated synchronous function runs wherever the caller runs. A @concurrent function always runs on the cooperative thread pool. Use nonisolated for lightweight, synchronous utilities that don’t need a specific executor. Use @concurrent for genuinely parallel, long-running work.

Tip: Think of @concurrent as documentation of intent: “This function is deliberately off the main actor because it does heavy work.”
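A side-by-side sketch makes the distinction concrete (the FrameHasher type and its methods are hypothetical names):

```swift
struct FrameHasher {
    // nonisolated + synchronous: executes inline, wherever the caller runs
    nonisolated func quickKey(_ id: Int) -> String {
        "frame-\(id)"
    }

    // @concurrent: always hops to the cooperative thread pool,
    // even when the caller is on the main actor
    @concurrent
    func checksum(_ samples: [Int]) async -> Int {
        samples.reduce(0, +)
    }
}

let hasher = FrameHasher()
let key = hasher.quickKey(7)                  // no suspension point
let sum = await hasher.checksum([1, 2, 3, 4]) // suspends; work runs off-actor
```

The await on checksum is the visible marker that execution may leave the current actor; quickKey has no such marker because it never changes executors.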

InlineArray<N, Element>

SE-0453 introduces InlineArray, a fixed-size, stack-allocated collection type. Unlike Array, which heap-allocates its buffer, InlineArray stores elements directly inline — no reference counting, no heap allocation.

// A fixed-size collection of 4 RGBA color channels
var pixel: InlineArray<4, UInt8> = [255, 128, 64, 255]
pixel[0] = 200

// Works with custom types too
struct AnimationKeyframe {
    let time: Double
    let position: SIMD3<Float>
}

// 30 keyframes, all on the stack
var keyframes = InlineArray<30, AnimationKeyframe>(repeating: .init(
    time: 0,
    position: .zero
))

InlineArray supports subscript access and index-based iteration out of the box. The size N is part of the type, enforced at compile time. Note that InlineArray does not conform to Sequence or Collection (a deliberate design choice to avoid implicit copies of the inline storage), so you loop over its indices rather than over the elements directly.
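Because there is no Sequence conformance, loops go through the indices range. A quick sketch extending the pixel example (requires the Swift 6.2 standard library):

```swift
var pixel: InlineArray<4, UInt8> = [255, 128, 64, 255]
pixel[0] = 200

// Iterate by index: InlineArray exposes `indices` and subscripting
var total = 0
for i in pixel.indices {
    total += Int(pixel[i])
}
// total == 200 + 128 + 64 + 255 == 647
```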

When InlineArray Outperforms Array

The performance advantage is most pronounced for small, fixed-size collections accessed in tight loops:

actor RenderFarm {
    // Each vertex has exactly 3 components — never more, never less
    struct Vertex {
        var position: InlineArray<3, Float>
        var normal: InlineArray<3, Float>
        var uv: InlineArray<2, Float>
    }

    // @concurrent requires a nonisolated async function; the transform
    // runs on the cooperative thread pool instead of the actor
    @concurrent
    nonisolated func transformVertices(
        _ vertices: [Vertex],
        matrix: float4x4
    ) async -> [Vertex] {
        var transformed = vertices
        for i in transformed.indices {
            // No heap allocation per vertex — stored inline
            transformed[i].position = applyTransform(
                transformed[i].position, matrix
            )
        }
        return transformed
    }
}

Because InlineArray is a value type stored inline, there’s no heap allocation and no ARC overhead. For a mesh with millions of vertices, this eliminates millions of retain/release calls.

Apple Docs: InlineArray — Swift Standard Library

Observations Async Sequence

The new Observations type bridges the @Observable macro with structured concurrency. Instead of using withObservationTracking and manually re-registering after each change, you get a proper AsyncSequence of state snapshots.

@Observable
class StudioDashboard {
    var activeRenders: Int = 0
    var completedFrames: Int = 0
    var currentScene: String = "Toy Story 5 — Woody's Workshop"
}

Consuming changes becomes a standard for await loop:

let dashboard = StudioDashboard()

// Observations tracks which properties you read in the closure
let renderCounts = Observations { dashboard.activeRenders }

for await activeCount in renderCounts {
    statusLabel.text = "Active renders: \(activeCount)"
}

This replaces the awkward recursive pattern that withObservationTracking required:

// Before Swift 6.2 — manual re-registration
func trackChanges() {
    withObservationTracking {
        _ = dashboard.activeRenders
    } onChange: {
        DispatchQueue.main.async {
            self.updateUI()
            self.trackChanges() // Must re-register manually
        }
    }
}

The Observations sequence automatically re-tracks after each yielded value, handles cancellation through structured concurrency, and delivers values on the caller’s isolation context.

Tip: Observations pairs naturally with SwiftUI’s task modifier. Attach a for await loop inside .task { } and SwiftUI handles cancellation when the view disappears.

Concrete Notification Types

Swift 6.2 replaces stringly-typed Notification.Name constants with concrete types that carry their payload in a type-safe way. No more casting userInfo dictionaries.

// Define a typed notification message — its payload is just stored properties
struct RenderCompleteMessage: NotificationCenter.AsyncMessage {
    let sceneID: UUID
    let frameCount: Int
    let duration: Duration
}

Posting and observing become fully type-safe:

// Posting — the message is a concrete type, not [AnyHashable: Any]
NotificationCenter.default.post(
    RenderCompleteMessage(
        sceneID: scene.id,
        frameCount: 2400,
        duration: .seconds(3600)
    )
)

// Observing — no casting, no string keys
for await message in NotificationCenter.default.messages(
    for: RenderCompleteMessage.self
) {
    print("Rendered \(message.frameCount) frames")
}

This eliminates an entire category of runtime crashes caused by misspelled notification names or incorrect userInfo key casts. The compiler verifies that the payload type matches at both the posting and observing sites.

Note: Apple’s own frameworks are adopting concrete notification types in iOS 26. Existing Notification.Name-based APIs continue to work — this is an additive change.

WebAssembly Support

Swift 6.2 adds official WebAssembly (Wasm) support, making Swift a viable language for browser-based applications. The Swift SDK for WebAssembly includes the full standard library, Foundation, and even Observation.

// A SwiftWasm module — compiles to .wasm
import Foundation

struct MovieDatabase {
    var movies: [Movie] = []

    mutating func add(_ movie: Movie) {
        movies.append(movie)
    }

    func search(title: String) -> [Movie] {
        movies.filter {
            $0.title.localizedCaseInsensitiveContains(title)
        }
    }
}

Build for Wasm using the Swift SDK:

swift build --swift-sdk wasm32-unknown-wasi

WebAssembly support opens the door to sharing Swift business logic between your iOS app and a web client, running Swift-based tools in the browser, and server-side Swift in edge computing environments that support Wasm runtimes.

Note: SwiftUI is not available on WebAssembly. Wasm targets are best suited for shared business logic, algorithms, and data processing — not UI rendering.

Performance Considerations

The features in Swift 6.2 have concrete performance implications worth understanding:

Default main actor isolation adds no runtime overhead for code that was already running on the main thread. If you were manually annotating @MainActor everywhere, this is a zero-cost abstraction — the behavior is identical, the annotation is just implicit. Code that was previously nonisolated by default and now becomes main actor-isolated may see thread hops where there were none before. Profile with Instruments’ Swift Concurrency template if you notice unexpected latency.

InlineArray delivers its performance advantage through elimination of heap allocation and reference counting. For a struct containing InlineArray<3, Float> versus Array<Float>, the inline version avoids a heap allocation per instance. In tight loops processing thousands of elements, this can mean a 2-5x throughput improvement. However, large InlineArray values (e.g., InlineArray<1000, SomeLargeStruct>) increase stack frame size and can cause stack overflows. Keep the element count reasonable for stack-allocated data.
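The layout difference is directly observable with MemoryLayout. A sketch (the two wrapper structs are hypothetical, the Array size assumes a 64-bit platform, and the Swift 6.2 standard library is required):

```swift
struct InlineVertex {
    var position: InlineArray<3, Float> // three Floats stored directly inline
}

struct HeapVertex {
    var position: [Float] // a single pointer; elements live on the heap
}

let inlineSize = MemoryLayout<InlineVertex>.size // 12: the Floats themselves
let heapSize = MemoryLayout<HeapVertex>.size     // 8: pointer only
```

Copying an InlineVertex copies twelve bytes; copying a HeapVertex copies a pointer and bumps a reference count, which is precisely the ARC traffic InlineArray eliminates.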

Observations is efficient because it only tracks the properties you actually read inside the closure. Reading dashboard.activeRenders means changes to dashboard.completedFrames won’t trigger a new value. This is more efficient than KVO or NotificationCenter patterns that fire for any change.

When to Use (and When Not To)

| Feature | Use When | Avoid When |
| --- | --- | --- |
| defaultIsolation | New project or UI-heavy app | Library code needing isolation-agnostic API |
| @concurrent | Heavy computation (rendering, parsing) | Short synchronous utilities |
| InlineArray<N, E> | Fixed-size, small, perf-critical paths | Dynamic or large (> ~64 elements) |
| Observations | @Observable tracking outside SwiftUI | Inside SwiftUI views (automatic) |
| Concrete notifications | New notification definitions | Wrapping existing Notification.Name APIs |
| WebAssembly | Sharing Swift logic with web clients | UI code (SwiftUI unavailable on Wasm) |

Summary

  • Default main actor isolation inverts the annotation burden: annotate the 10% of background code instead of the 90% of UI code. Enable it with .defaultIsolation(MainActor.self) in your Package.swift.
  • @concurrent is the explicit opt-out for background work, replacing the need for widespread nonisolated annotations.
  • InlineArray<N, Element> brings stack-allocated, fixed-size collections to Swift, eliminating heap overhead in performance-critical code paths.
  • Observations finally gives @Observable a proper AsyncSequence interface, replacing the manual withObservationTracking re-registration pattern.
  • Concrete notification types end the era of stringly-typed Notification.Name and unsafe userInfo casts.
  • WebAssembly support makes Swift a cross-platform language that reaches beyond Apple’s ecosystem.

Swift 6.2 is the release that makes Swift concurrency feel like it was designed for real-world apps, not just whitepapers. If you’ve been holding off on strict concurrency adoption, this is the version that makes the migration worthwhile. For a deeper dive into the concurrency model changes specifically, see Approachable Concurrency.