Reality Composer Pro: Building Spatial Experiences with Node-Based Tools


You have a RealityKit entity loaded in your visionOS app, but it looks like a grey blob. No iridescent Buzz Lightyear helmet, no glowing Luxo Jr. lamp filament, no swirling particle trail behind Wall-E’s fire extinguisher. The code-only path to custom materials in RealityKit is verbose, opaque, and nearly impossible to iterate on. That is exactly the gap Reality Composer Pro fills.

This post covers the four major subsystems of Reality Composer Pro: the Shader Graph material editor, the particle system editor, spatial audio configuration, and the animation timeline. We focus on how these tools integrate with RealityKit code in a visionOS app. We will not cover Reality Composer (the older iPad app) or basic USDZ viewing — those are separate tools for different workflows.

The Problem

Building custom materials entirely in code means constructing ShaderGraphMaterial from scratch, wiring up texture samplers, normal maps, and parameter bindings manually. Consider what it takes to create a simple glowing material for a Luxo Jr. lamp:

import RealityKit

// Code-only approach: verbose and hard to visualize
func createGlowMaterial() async throws -> ShaderGraphMaterial {
    var material = try await ShaderGraphMaterial(
        named: "/LuxoLamp/GlowMaterial",
        from: "LuxoScene.usda"
    )
    // What color is this? How bright? How does it fall off?
    // You cannot see the result until you build and run.
    try material.setParameter(
        name: "GlowColor",
        value: .color(CGColor(red: 1, green: 1, blue: 0, alpha: 1))
    )
    try material.setParameter(
        name: "GlowIntensity",
        value: .float(3.5)
    )
    try material.setParameter(
        name: "FalloffExponent",
        value: .float(2.0)
    )
    return material
}

Every parameter change requires a rebuild. There is no preview, no visual feedback, and no way to experiment with node connections interactively. Multiply this by every material, particle effect, and audio source in your scene, and you have a workflow that does not scale.

Reality Composer Pro solves this with a visual authoring environment that outputs .usda and .reality files directly consumable by RealityKit.

Shader Graph: Node-Based Materials

The Shader Graph editor is Reality Composer Pro’s most powerful subsystem. It uses the same node-and-wire paradigm as Blender’s shader nodes or Unreal’s material editor: you connect input nodes (textures, constants, procedural patterns) through processing nodes (math, color operations, UV manipulation) to output nodes (surface, geometry).

Opening Shader Graph

  1. Create a new Reality Composer Pro project from Xcode: File > New > File > Reality Composer Pro Project.
  2. In Reality Composer Pro, select an entity, then click its material in the Inspector.
  3. Click Open in Shader Graph to enter the node editor.

Alternatively, create a standalone material by right-clicking in the Project Navigator and choosing New > Shader Graph Material.

Building a Pixar-Themed Material

Let us build a material for a Monsters, Inc. scare floor tile — a surface that shifts color based on viewing angle (a Fresnel effect) with a procedural noise pattern.

In Shader Graph, the node chain looks like this:

  1. Noise2D node — Generates a procedural pattern. Set the scale to 8.0 for tile-sized variation.
  2. Fresnel node — Outputs a value from 0 to 1 based on the angle between the surface normal and the camera. Edges glow brighter than centers.
  3. Mix node — Blends two colors using the Fresnel output as the factor. Set Color A to Monsters, Inc. purple (#6B3FA0) and Color B to scare-floor green (#39FF14).
  4. Multiply node — Combines the noise pattern with the mixed color for variation.
  5. Connect the Multiply output to the Surface node’s Emissive Color input.
  6. Set the Surface node’s Base Color to a dark grey so the emissive color pops.

The result is a surface that glows green at glancing angles and purple head-on, with procedural noise breaking up the uniformity.
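To build intuition for what those nodes compute per pixel, here is a plain-Swift sketch of the Fresnel-driven color mix (illustrative math only, not shader code; the Noise2D output would then scale this result via the Multiply node):

```swift
import Foundation

// Fresnel node (sketch): 0 when the view direction is along the
// surface normal (head-on), rising toward 1 at glancing angles.
// cosTheta is the cosine of the angle between normal and view.
func fresnelFactor(cosTheta: Double, exponent: Double = 2.0) -> Double {
    pow(1 - min(max(cosTheta, 0), 1), exponent)
}

// Mix node (sketch): linearly blend two RGB colors by a factor.
func mix(_ a: (Double, Double, Double),
         _ b: (Double, Double, Double),
         factor f: Double) -> (Double, Double, Double) {
    (a.0 + (b.0 - a.0) * f,
     a.1 + (b.1 - a.1) * f,
     a.2 + (b.2 - a.2) * f)
}

let purple = (0.42, 0.25, 0.63)  // Color A, #6B3FA0
let green  = (0.22, 1.00, 0.08)  // Color B, #39FF14

// Head-on (cosTheta = 1): factor 0, pure purple.
// Glancing (cosTheta near 0): factor near 1, mostly green.
```

The Multiply node then scales this mixed color by the Noise2D output before it reaches the Surface node’s Emissive Color input.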

Exposing Parameters to Swift

The key to making Shader Graph materials dynamic is promoted parameters. Right-click any input on a node and select Promote to Input. This creates a named parameter you can set from Swift at runtime.

In the Shader Graph editor, promote the Mix node’s Color A and Color B inputs. Name them PrimaryColor and SecondaryColor.

Back in your Swift code:

import RealityKit

func loadScareFloorTile() async throws -> ModelEntity {
    let entity = try await Entity(
        named: "ScareFloorTile",
        in: monstersIncBundle
    )

    guard var material = entity.components[ModelComponent.self]?
        .materials.first as? ShaderGraphMaterial else {
        fatalError("Expected ShaderGraphMaterial on entity")
    }

    // Dynamically change the promoted parameters
    try material.setParameter(
        name: "PrimaryColor",
        value: .color(CGColor(red: 0.42, green: 0.25, blue: 0.63, alpha: 1.0))
    )
    try material.setParameter(
        name: "SecondaryColor",
        value: .color(CGColor(red: 0.22, green: 1.0, blue: 0.08, alpha: 1.0))
    )

    entity.components[ModelComponent.self]?.materials = [material]

    guard let model = entity as? ModelEntity else {
        fatalError("Expected the root entity to be a ModelEntity")
    }
    return model
}

Tip: Name your promoted parameters with PascalCase and descriptive names (EmissionIntensity, not param1). These names are your API contract between the visual editor and your Swift code. Renaming them later requires updating both the .usda file and every call site.

Custom Surface Shaders

For effects beyond what built-in nodes provide, Shader Graph supports Custom Function nodes backed by Metal shader code. This lets you write custom Metal functions and wire them into the graph:

// CustomShaders.metal — referenced by a Custom Function node
#include <metal_stdlib>
using namespace metal;

[[visible]]
float scareIntensity(float distance, float maxRange) {
    // Quadratic falloff: scare energy tapers smoothly to zero at maxRange
    float normalized = saturate(distance / maxRange);
    return 1.0 - (normalized * normalized);
}

In Shader Graph, add a Custom Function node, point it to scareIntensity, and connect its inputs and outputs to the rest of your graph.
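If you want to unit-test that falloff logic outside of Metal, the curve is easy to mirror in plain Swift (a sketch; the Metal version above remains the one the Custom Function node actually calls):

```swift
// Swift mirror of the Metal scareIntensity function: quadratic
// falloff from 1.0 at the source down to 0.0 at maxRange.
func scareIntensity(distance: Float, maxRange: Float) -> Float {
    let normalized = min(max(distance / maxRange, 0), 1)  // saturate
    return 1.0 - normalized * normalized
}
```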

Apple Docs: ShaderGraphMaterial — RealityKit

Particle Systems: Emitter Configuration

Reality Composer Pro’s particle editor lets you design effects visually — fire, sparks, dust, magical trails — and preview them in real time.

Creating a Particle System

  1. Select an entity in the scene hierarchy.
  2. In the Inspector, click Add Component > Particle Emitter.
  3. The particle editor opens inline with controls for emission rate, lifetime, velocity, color over lifetime, and more.

Designing Wall-E’s Fire Extinguisher Trail

A good particle system for Wall-E’s fire extinguisher blast needs a short-lived burst of white particles that fade to transparent, with slight turbulence.

Configure these properties in the particle editor:

Property             | Value                            | Rationale
Emission Rate        | 500/sec                          | Dense enough to look like gas
Lifetime             | 0.8 sec                          | Short-lived — gas dissipates fast
Initial Speed        | 2.0 m/s                          | Fast initial burst
Speed Damping        | 0.7                              | Particles slow down quickly
Color Over Lifetime  | White (100%) to White (0% alpha) | Fade out, do not change color
Size Over Lifetime   | 0.02 m to 0.08 m                 | Particles expand as gas disperses
Noise Strength       | 0.3                              | Slight turbulence for realism
Shape                | Cone (15 degree angle)           | Directed spray, not omnidirectional
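A useful sanity check when tuning these numbers: at steady state, the number of live particles is simply emission rate times lifetime. A quick sketch with the values above:

```swift
// Steady-state live particles = emission rate x particle lifetime.
func steadyStateParticles(emissionRate: Double, lifetime: Double) -> Double {
    emissionRate * lifetime
}

let extinguisherLoad = steadyStateParticles(emissionRate: 500, lifetime: 0.8)
// About 400 particles alive at once -- comfortably within budget
```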

Triggering Particles from Swift

Particle systems created in Reality Composer Pro are stored as components. You enable and disable them at runtime:

import RealityKit

func fireExtinguisher(on wallE: Entity) {
    guard let nozzle = wallE.findEntity(named: "ExtinguisherNozzle"),
          var particles = nozzle.components[ParticleEmitterComponent.self] else {
        return
    }

    // Enable the burst
    particles.isEmitting = true
    nozzle.components[ParticleEmitterComponent.self] = particles

    // Stop after 2 seconds. Re-read the component inside the task
    // rather than mutating a captured copy, which would not compile
    // under Swift concurrency.
    Task {
        try? await Task.sleep(for: .seconds(2))
        guard var particles = nozzle.components[ParticleEmitterComponent.self] else {
            return
        }
        particles.isEmitting = false
        nozzle.components[ParticleEmitterComponent.self] = particles
    }
}

You can also modify particle properties dynamically. For example, changing the emission color when Wall-E switches from the extinguisher to a different item:

func updateParticleColor(
    on entity: Entity,
    to color: SIMD4<Float>
) {
    guard var particles = entity
        .components[ParticleEmitterComponent.self] else {
        return
    }
    particles.mainEmitter.color = .constant(.single(
        .init(
            red: CGFloat(color.x),
            green: CGFloat(color.y),
            blue: CGFloat(color.z),
            alpha: CGFloat(color.w)
        )
    ))
    entity.components[ParticleEmitterComponent.self] = particles
}

Warning: High emission rates (>1000/sec) with long lifetimes can overwhelm the GPU, especially on Apple Vision Pro’s tiled rendering architecture. Profile with Instruments’ RealityKit trace template before shipping.

Spatial Audio: Positioning Sound in 3D

Spatial audio in visionOS is not stereo panning — it is physically modeled sound propagation. Reality Composer Pro lets you configure audio sources directly on entities.

Adding an Audio Source

  1. Select an entity (for example, Woody’s pull-string voice box).
  2. Add a Spatial Audio component from the Inspector.
  3. Assign an audio file (.wav, .m4a, .mp3).
  4. Configure the spatial properties:
Property             | Description
Gain                 | Volume in decibels. 0 dB is the file’s native volume.
Directivity          | How focused the sound is. Omnidirectional (0) to laser-focused (1).
Reverb Send          | How much the sound interacts with the room’s reverb.
Distance Attenuation | How quickly sound fades with distance. Offers linear, inverse, and custom curves.
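For intuition about the attenuation options, the standard inverse-distance law in decibels looks like this (a sketch of the mental model only; RealityKit’s built-in curves are configured in the Inspector, not computed by you):

```swift
import Foundation

// Inverse-distance attenuation: every doubling of distance beyond
// the reference distance costs roughly 6 dB of gain.
func inverseDistanceGain(distance: Double, referenceDistance: Double = 1.0) -> Double {
    guard distance > referenceDistance else { return 0 }  // no boost up close
    return -20 * log10(distance / referenceDistance)
}

// 2 m away: about -6 dB. 4 m away: about -12 dB.
```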

Playing Spatial Audio from Swift

import RealityKit

func playWoodyVoice(on woodyEntity: Entity) async {
    guard let audioSource = woodyEntity
        .findEntity(named: "VoiceBox") else {
        return
    }

    do {
        let resource = try await AudioFileResource(
            named: "theres_a_snake_in_my_boot.m4a",
            configuration: .init(
                shouldLoop: false,
                shouldRandomizeStartTime: false
            )
        )

        let controller = audioSource.playAudio(resource)

        // Fade in over 0.5 seconds: start near-silent, then ramp
        // up to -6 dB (slightly quieter than the file's native level)
        controller.gain = -60
        controller.fade(to: -6.0, duration: 0.5)
    } catch {
        print("Failed to load audio: \(error)")
    }
}

Spatial audio sources move with their parent entity. If Woody walks across the room, the sound follows him automatically. No manual position updates required.

Apple Docs: AudioFileResource — RealityKit

Animation Timeline and Behaviors

The animation timeline in Reality Composer Pro sequences entity transformations, material parameter changes, and audio triggers on a visual timeline — similar to After Effects or Keynote’s animation panel.

Timeline Basics

  1. Open the Timeline panel (View > Timeline).
  2. Select an entity to see its animation tracks.
  3. Add keyframes for position, rotation, scale, opacity, or any promoted material parameter.

For example, animating the Luxo Jr. lamp’s iconic hop:

  • 0.0s — Lamp at rest position, rotation (0, 0, 0).
  • 0.3s — Lamp compressed (scale Y: 0.8), rotation tilted forward 15 degrees.
  • 0.5s — Lamp at apex (position Y + 0.5m), rotation 0 degrees, scale (1, 1.1, 1).
  • 0.8s — Lamp lands (position Y: 0), slight overshoot on compression (scale Y: 0.9).
  • 1.0s — Lamp at rest, scale (1, 1, 1).

Each transition uses easing curves. The hop sequence uses ease-out on the upward motion and ease-in on the landing for physically plausible motion.
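Those easing curves are simple to express. Here is a sketch of quadratic ease-out applied to the upward leg of the hop, between the 0.3 s and 0.5 s keyframes above (Reality Composer Pro’s curve editor may use different interpolants internally):

```swift
// Quadratic easing: easeOut decelerates into the apex,
// easeIn accelerates into the landing.
func easeOut(_ t: Double) -> Double { 1 - (1 - t) * (1 - t) }
func easeIn(_ t: Double) -> Double { t * t }

// Lamp height between the 0.3 s keyframe (ground) and the
// 0.5 s keyframe (apex, +0.5 m), eased out on the way up.
func hopHeight(at time: Double) -> Double {
    let t = min(max((time - 0.3) / 0.2, 0), 1)  // normalize to 0...1
    return 0.5 * easeOut(t)
}
```

Halfway through the ascent the lamp is already at 0.375 m rather than 0.25 m, which is exactly the decelerating feel ease-out gives.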

Triggering Animations from Swift

Animations authored in Reality Composer Pro are stored as named animation resources. You play them through the entity’s animation system:

import RealityKit

func playLuxoHop(on lampEntity: Entity) {
    // Load the animation defined in Reality Composer Pro
    guard let animation = lampEntity.availableAnimations.first(
        where: { $0.name == "LuxoHop" }
    ) else {
        print("LuxoHop animation not found")
        return
    }

    // Play with a short cross-fade from the current pose
    lampEntity.playAnimation(
        animation,
        transitionDuration: 0.2,
        startsPaused: false
    )
}

Behaviors: Event-Driven Logic

Behaviors in Reality Composer Pro connect triggers to actions without code. Common patterns:

  • Tap trigger -> Play animation (the lamp hops when tapped).
  • Proximity trigger -> Enable particle emitter (sparkles appear when the user approaches).
  • Notification trigger -> Receive a named notification from Swift to start a sequence.

The notification trigger is the bridge between your app logic and Reality Composer Pro authored sequences:

import RealityKit

// Post the notification name that Reality Composer Pro behaviors
// listen for; the identifier must match the Notification trigger's
// name in the Behaviors inspector
func triggerMonsterEntrance(in rootEntity: Entity) {
    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": rootEntity.scene as Any,
            "RealityKit.NotificationTrigger.Identifier": "MonsterEntrance"
        ]
    )
}

Note: As of visionOS 2, the Behaviors system is limited to predefined trigger-action pairs. For complex branching logic, handle it in Swift and use Reality Composer Pro only for the authored content (materials, animations, audio).

Integrating Reality Composer Pro Content in Swift

A Reality Composer Pro project compiles into a Swift package that Xcode adds to your project automatically. The generated package exposes your scenes and entities as typed references.

import SwiftUI
import RealityKit
import MonstersIncScene // ← Generated from your .rkassets bundle

struct ImmersiveScareFloor: View {
    var body: some View {
        RealityView { content in
            // Load the scene authored in Reality Composer Pro.
            // RealityView's make closure is async but does not throw,
            // so handle load failure with try? instead of try.
            guard let scareFloor = try? await Entity(
                named: "ScareFloor",
                in: monstersIncSceneBundle
            ) else { return }
            content.add(scareFloor)

            // Access specific entities by name
            if let sulley = scareFloor.findEntity(named: "Sulley") {
                sulley.position = [0, 0, -2]
            }

            // Access and modify materials
            if let door = scareFloor
                .findEntity(named: "MonsterDoor"),
               var material = door
                   .components[ModelComponent.self]?
                   .materials.first as? ShaderGraphMaterial {
                try? material.setParameter(
                    name: "DoorColor",
                    value: .color(CGColor(red: 0, green: 0, blue: 1, alpha: 1))
                )
                door.components[ModelComponent.self]?
                    .materials = [material]
            }
        }
    }
}

Project Organization

A well-structured Reality Composer Pro project mirrors your scene hierarchy:

MonstersIncScene.rkassets/
  ScareFloor.usda              -- Main scene file
  Materials/
    ScareFloorTile.usda        -- Shader Graph material
    MonsterDoor.usda
  ParticleSystems/
    ScreamCanister.usda
    DoorPortal.usda
  Audio/
    ambient_factory.m4a
    door_open.wav
  Animations/
    SulleyWalk.usda
    DoorOpen.usda

Tip: Keep each material, particle system, and animation in its own .usda file. This makes version control manageable and allows multiple team members to work on different assets without merge conflicts.

Performance Considerations

Shader complexity directly impacts frame time. Apple Vision Pro targets 90 FPS, leaving roughly 11.1 ms per frame to render both eye views. Every node in your Shader Graph adds GPU cost. Profile with the GPU Timeline in Instruments and aim for materials under 0.5 ms per draw call.

Particle count vs. visual quality. The particle editor makes it easy to crank emission rates to 10,000/sec. On Vision Pro, keep the total number of live particles in the scene under 50,000 at any time, and monitor particle counts with the RealityKit trace template in Instruments during development.

Texture memory. Shader Graph materials with multiple texture nodes can consume significant GPU memory. Use these guidelines:

Texture Type         | Resolution | Format
Base Color / Albedo  | 1024x1024  | Compressed (ASTC)
Normal Map           | 1024x1024  | BC5/ASTC
Roughness / Metallic | 512x512    | Single channel, compressed
Emissive             | 512x512    | Compressed (ASTC)
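The savings from those guidelines are easy to quantify. A quick sketch (mip chains and the exact ASTC block size chosen will shift the precise numbers):

```swift
// Approximate GPU footprint of a single 2D texture, ignoring mips.
func textureBytes(width: Int, height: Int, bitsPerPixel: Int) -> Int {
    width * height * bitsPerPixel / 8
}

let uncompressed = textureBytes(width: 1024, height: 1024, bitsPerPixel: 32) // RGBA8
let astc4x4 = textureBytes(width: 1024, height: 1024, bitsPerPixel: 8)       // ASTC 4x4
// 4 MB vs 1 MB per texture -- a 4x saving before mip levels
```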

Audio source limits. visionOS supports up to 32 simultaneous spatial audio sources. Prioritize based on distance and relevance. Use audio pooling for effects that fire frequently (footsteps, UI sounds).
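A minimal distance-based prioritizer for staying under that 32-source limit might look like this (a sketch; AudioSourceInfo is a hypothetical type, and real prioritization would also weigh relevance, not just distance):

```swift
struct AudioSourceInfo: Equatable {
    let name: String
    let distance: Double  // meters from the listener
}

// Keep only the `limit` closest sources audible; mute the rest
// until a slot frees up.
func activeSources(_ sources: [AudioSourceInfo], limit: Int = 32) -> [AudioSourceInfo] {
    Array(sources.sorted { $0.distance < $1.distance }.prefix(limit))
}
```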

Animation blending. Playing multiple animations simultaneously on the same entity requires blend weights. Without explicit weights, animations fight each other and produce jittery results. Use transitionDuration on playAnimation to cross-fade between clips.
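The cross-fade that transitionDuration drives amounts to complementary blend weights, sketched below (the engine’s actual blend curve is not documented, so treat the linear ramp as illustrative):

```swift
// Complementary blend weights during a cross-fade of `duration`
// seconds: the outgoing clip ramps 1 -> 0 while the incoming clip
// ramps 0 -> 1, so the blended pose never pops.
func crossFadeWeights(elapsed: Double, duration: Double) -> (outgoing: Double, incoming: Double) {
    let t = min(max(elapsed / duration, 0), 1)
    return (outgoing: 1 - t, incoming: t)
}
```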

When to Use (and When Not To)

Scenario                                | Recommendation
Custom materials with visual iteration  | Shader Graph is the right tool. Code-only is not practical for complex surfaces.
Procedural content (terrain, L-systems) | Generate geometry in code, apply Shader Graph materials for the final look.
Simple solid-color materials            | Skip Reality Composer Pro. Use SimpleMaterial or UnlitMaterial in code.
Particle effects (fire, dust, sparks)   | Reality Composer Pro's particle editor gives real-time preview. Strongly preferred.
Spatial audio placement                 | Configure in Reality Composer Pro for scene-bound sources. Use code for dynamic sources.
Complex animation sequences             | Use the timeline for authored sequences. Use RealityKit APIs for procedural motion.
Team collaboration with 3D artists      | Essential. Artists iterate on materials without touching Swift code.
Quick prototyping of spatial UI         | Useful for positioning and testing spatial layouts before implementing in SwiftUI.

Summary

  • Shader Graph is a node-based material editor that exports to ShaderGraphMaterial. Promote parameters to create a runtime API between your visual assets and Swift code.
  • Particle systems are configured visually and controlled via ParticleEmitterComponent. Keep emission rates reasonable for Vision Pro’s GPU constraints.
  • Spatial audio sources attach to entities and move with them automatically. Configure attenuation curves and directivity in Reality Composer Pro for physically plausible soundscapes.
  • Animation timelines sequence keyframed transformations and material changes. Trigger them from Swift via playAnimation or from Reality Composer Pro behaviors via notification triggers.
  • The compiled Reality Composer Pro project integrates as a Swift package with typed entity references, making asset loading type-safe and discoverable.

For a deeper look at the RealityKit entity-component architecture that powers these features, see RealityKit: AR and Spatial Computing. To explore ARKit’s scene understanding and hand tracking that pairs with Reality Composer Pro content, continue to ARKit: From Face Tracking to Room Plans.