Build a Podcast Player: Audio Playback, Background Audio, and Now Playing Integration
Picture this: your users lock their phones, slip them into their pockets, and the Pixar Story Time episode keeps playing — artwork on the lock screen, scrubber responding to their AirPods, skip buttons working through CarPlay. That seamless experience requires wiring together four distinct Apple frameworks, and getting even one wrong means silent failures that are nearly impossible to debug at 11 pm before a release.
In this tutorial, you’ll build Pixar Story Time — a fully functional podcast player that streams episodes from an
RSS feed, runs audio in the background, displays Now Playing info on the lock screen, and persists a mini-player overlay
across the entire navigation hierarchy. Along the way, you’ll learn how to parse XML feeds with XMLParser, manage
audio sessions with AVAudioSession, register remote control commands with MPRemoteCommandCenter, and compose a
persistent overlay UI with SwiftUI’s .overlay modifier.
Prerequisites
- Xcode 16+ with an iOS 18 deployment target
- Familiarity with networking and URLSession
- Familiarity with SwiftUI state management
- Familiarity with app lifecycle and scenes
Contents
- Getting Started
- Step 1: Parsing the RSS Feed
- Step 2: The Audio Player Manager
- Step 3: Building the Episode List View
- Step 4: Building the Player Controls
- Step 5: Configuring the Background Audio Session
- Step 6: Now Playing Info and Remote Controls
- Step 7: Adding Playback Speed Control
- Step 8: The Mini-Player Overlay
- Where to Go From Here?
Getting Started
Open Xcode and create a new project using the App template.
- Set the product name to PixarStoryTime.
- Ensure the interface is SwiftUI and the language is Swift.
- Set the minimum deployment target to iOS 18.0.
Adding Background Modes Capability
Background audio does not work without a capability declaration. Without it, iOS silently suspends your app the moment the user presses Home or locks the screen.
- Select the PixarStoryTime target in the project navigator.
- Click the Signing & Capabilities tab.
- Click + Capability and add Background Modes.
- Check Audio, AirPlay, and Picture in Picture.
Xcode writes the corresponding UIBackgroundModes key into your Info.plist automatically. You can verify by opening
the Info.plist source and confirming audio appears in the array.
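If everything is set up correctly, the relevant entry looks roughly like this (key and value names are exact; the surrounding plist structure is abbreviated):

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```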
Project Structure
Create the following groups inside the PixarStoryTime folder. Right-click the folder in the Project Navigator and
choose New Group:
PixarStoryTime/
├── Models/
│   └── PodcastEpisode.swift
├── Networking/
│   └── RSSParser.swift
├── Audio/
│   └── AudioPlayerManager.swift
├── Views/
│   ├── EpisodeListView.swift
│   ├── PlayerView.swift
│   └── MiniPlayerView.swift
└── PixarStoryTimeApp.swift
This separation keeps concerns clean: networking and parsing live away from UI, and the audio manager is its own isolated layer that views observe without owning.
Step 1: Parsing the RSS Feed
Podcast RSS feeds are XML documents. Each <item> element represents an episode with a title, description, publication
date, audio enclosure URL, and artwork. Foundation’s
XMLParser is a SAX-style parser — it fires delegate
callbacks as it encounters elements rather than loading the entire document into memory, making it well-suited for
potentially large feed files.
Apple Docs: XMLParser — Foundation
Defining the Data Model
Create Models/PodcastEpisode.swift and add the following:
import Foundation
struct PodcastEpisode: Identifiable, Sendable {
let id: UUID
let title: String
let description: String
let audioURL: URL
let duration: TimeInterval // seconds; parsed from iTunes duration tag
let pubDate: Date
let artworkURL: URL?
init(
id: UUID = UUID(),
title: String,
description: String,
audioURL: URL,
duration: TimeInterval,
pubDate: Date,
artworkURL: URL?
) {
self.id = id
self.title = title
self.description = description
self.audioURL = audioURL
self.duration = duration
self.pubDate = pubDate
self.artworkURL = artworkURL
}
}
extension PodcastEpisode {
/// Formats a raw TimeInterval into a human-readable string like "1h 23m" or "42m".
var formattedDuration: String {
let total = Int(duration)
let hours = total / 3600
let minutes = (total % 3600) / 60
if hours > 0 {
return "\(hours)h \(minutes)m"
}
return "\(minutes)m"
}
}
Sendable conformance is required because PodcastEpisode values will cross actor boundaries between the parsing task
and the @MainActor-isolated views. All stored properties are value types, so conformance is trivially safe.
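To convince yourself the formatting math is right, here's a quick standalone mirror of formattedDuration you can run in a playground (the free function is just for this check, not part of the app):

```swift
import Foundation

// Standalone mirror of PodcastEpisode.formattedDuration for a quick sanity check.
func formattedDuration(_ duration: TimeInterval) -> String {
    let total = Int(duration)
    let hours = total / 3600
    let minutes = (total % 3600) / 60
    return hours > 0 ? "\(hours)h \(minutes)m" : "\(minutes)m"
}

print(formattedDuration(3754)) // prints 1h 2m  (1:02:34)
print(formattedDuration(2532)) // prints 42m    (42:12)
```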
Writing the RSS Parser
Create Networking/RSSParser.swift. The parser accumulates character data between tags into currentValue, and at the
end of each <item> it assembles a PodcastEpisode:
import Foundation
final class RSSParser: NSObject, XMLParserDelegate, @unchecked Sendable {
private var episodes: [PodcastEpisode] = []
// Scratch values accumulated during parsing of a single <item>
private var currentTitle = ""
private var currentDescription = ""
private var currentAudioURLString = ""
private var currentDuration: TimeInterval = 0
private var currentPubDateString = ""
private var currentArtworkURLString = ""
private var currentValue = ""
private var insideItem = false
func parse(data: Data) -> [PodcastEpisode] {
let parser = XMLParser(data: data)
parser.delegate = self
parser.parse()
return episodes
}
// MARK: - XMLParserDelegate
func parser(
_ parser: XMLParser,
didStartElement elementName: String,
namespaceURI: String?,
qualifiedName qName: String?,
attributes attributeDict: [String: String] = [:]
) {
currentValue = ""
switch elementName {
case "item":
insideItem = true
case "enclosure" where insideItem:
// Audio URL lives as an attribute, not character data
if let urlString = attributeDict["url"] {
currentAudioURLString = urlString
}
case "itunes:image" where insideItem:
if let href = attributeDict["href"] {
currentArtworkURLString = href
}
default:
break
}
}
func parser(
_ parser: XMLParser,
foundCharacters string: String
) {
currentValue += string
}
func parser(
_ parser: XMLParser,
didEndElement elementName: String,
namespaceURI: String?,
qualifiedName qName: String?
) {
guard insideItem else { return }
switch elementName {
case "title":
currentTitle = currentValue.trimmingCharacters(in: .whitespacesAndNewlines)
case "description":
currentDescription = currentValue.trimmingCharacters(in: .whitespacesAndNewlines)
case "itunes:duration":
currentDuration = Self.parseDuration(currentValue.trimmingCharacters(in: .whitespaces))
case "pubDate":
currentPubDateString = currentValue.trimmingCharacters(in: .whitespacesAndNewlines)
case "item":
flushCurrentItem()
insideItem = false
default:
break
}
}
// MARK: - Helpers
private func flushCurrentItem() {
guard
let audioURL = URL(string: currentAudioURLString),
!currentTitle.isEmpty
else { return }
let pubDate = Self.parseRFC822Date(currentPubDateString) ?? Date()
let artworkURL = URL(string: currentArtworkURLString)
let episode = PodcastEpisode(
title: currentTitle,
description: currentDescription,
audioURL: audioURL,
duration: currentDuration,
pubDate: pubDate,
artworkURL: artworkURL
)
episodes.append(episode)
// Reset scratch values for the next item
currentTitle = ""
currentDescription = ""
currentAudioURLString = ""
currentDuration = 0
currentPubDateString = ""
currentArtworkURLString = ""
}
/// Parses iTunes duration strings: "HH:MM:SS", "MM:SS", or raw seconds.
private static func parseDuration(_ raw: String) -> TimeInterval {
let components = raw.split(separator: ":").compactMap { Double($0) }
switch components.count {
case 3: return components[0] * 3600 + components[1] * 60 + components[2]
case 2: return components[0] * 60 + components[1]
case 1: return components[0]
default: return 0
}
}
private static func parseRFC822Date(_ raw: String) -> Date? {
    let formatter = DateFormatter()
    formatter.locale = Locale(identifier: "en_US_POSIX")
    // Feeds use either numeric offsets ("+0000") or zone names ("GMT"); try both.
    for format in ["EEE, dd MMM yyyy HH:mm:ss Z", "EEE, dd MMM yyyy HH:mm:ss zzz"] {
        formatter.dateFormat = format
        if let date = formatter.date(from: raw) {
            return date
        }
    }
    return nil
}
}
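Before moving on, it's worth sanity-checking the duration parsing rules. This standalone mirror of parseDuration (renamed to a free function only for the check) handles all three iTunes formats:

```swift
import Foundation

// Standalone mirror of RSSParser.parseDuration: "HH:MM:SS", "MM:SS", or raw seconds.
func parseDuration(_ raw: String) -> TimeInterval {
    let components = raw.split(separator: ":").compactMap { Double($0) }
    switch components.count {
    case 3: return components[0] * 3600 + components[1] * 60 + components[2]
    case 2: return components[0] * 60 + components[1]
    case 1: return components[0]
    default: return 0
    }
}

print(parseDuration("1:02:34")) // prints 3754.0
print(parseDuration("54:12"))   // prints 3252.0
print(parseDuration("2700"))    // prints 2700.0
```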
Fetching the Feed
For this tutorial, you'll feed the parser a mock Pixar-themed RSS document (defined below). In a production app, the feed URL would come from your backend or a podcast directory API. First, add a feed-fetching function alongside the parser:
// Add to Networking/RSSParser.swift, outside the class
enum FeedError: Error {
case invalidURL
case networkFailure(Error)
case parsingFailure
}
func fetchEpisodes(from urlString: String) async throws -> [PodcastEpisode] {
guard let url = URL(string: urlString) else {
throw FeedError.invalidURL
}
let (data, _) = try await URLSession.shared.data(from: url)
let parser = RSSParser()
let episodes = parser.parse(data: data)
guard !episodes.isEmpty else {
throw FeedError.parsingFailure
}
return episodes
}
Because URLSession.data(from:) is already async, you get structured concurrency for free. The caller will await
this on a background task and publish results to @MainActor state.
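A sketch of that call pattern (the feed URL below is a placeholder, and the function assumes fetchEpisodes(from:) and FeedError as defined above):

```swift
import Foundation

// Hypothetical call site: await the fetch, handle both feed and network errors.
func loadFeed() async {
    do {
        let episodes = try await fetchEpisodes(from: "https://example.com/feed.xml")
        print("Loaded \(episodes.count) episodes")
    } catch let error as FeedError {
        print("Feed error: \(error)")
    } catch {
        print("Network error: \(error)")
    }
}
```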
For development and testing without a live server, add a mock feed constant at the top of RSSParser.swift:
let mockRSSXML = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
<channel>
<title>Pixar Story Time</title>
<item>
<title>Finding Nemo: The Deep Ocean Adventure</title>
<description>Join Marlin and Dory as they cross the ocean.</description>
<enclosure url="https://example.com/audio/finding-nemo-ep1.mp3"
type="audio/mpeg"/>
<itunes:duration>1:02:34</itunes:duration>
<pubDate>Mon, 01 Jul 2026 09:00:00 +0000</pubDate>
<itunes:image href="https://example.com/artwork/finding-nemo.jpg"/>
</item>
<item>
<title>WALL-E: A Love Story in Space</title>
<description>WALL-E compacts rubbish, finds a plant, and falls in love.</description>
<enclosure url="https://example.com/audio/wall-e-ep2.mp3"
type="audio/mpeg"/>
<itunes:duration>54:12</itunes:duration>
<pubDate>Mon, 08 Jul 2026 09:00:00 +0000</pubDate>
<itunes:image href="https://example.com/artwork/wall-e.jpg"/>
</item>
<item>
<title>Up: Carl and Russell's Grand Journey</title>
<description>An old widower ties balloons to his house and floats to Paradise Falls.</description>
<enclosure url="https://example.com/audio/up-ep3.mp3"
type="audio/mpeg"/>
<itunes:duration>47:55</itunes:duration>
<pubDate>Mon, 15 Jul 2026 09:00:00 +0000</pubDate>
<itunes:image href="https://example.com/artwork/up.jpg"/>
</item>
</channel>
</rss>
"""
You’ll use this mock data to load episodes without hitting a network during development. When you’re ready to wire in a
real feed, swap mockRSSXML.data(using: .utf8)! for URLSession.shared.data(from:).
Checkpoint: In a temporary test file or Swift Playground, instantiate RSSParser, call parser.parse(data:) with mockRSSXML.data(using: .utf8)!, and print the resulting array. You should see three PodcastEpisode values — “Finding Nemo: The Deep Ocean Adventure,” “WALL-E: A Love Story in Space,” and “Up: Carl and Russell’s Grand Journey” — each with a valid audioURL and parsed duration.
Step 2: The Audio Player Manager
All audio state lives in a single @Observable class marked @MainActor. This design means every property change
automatically triggers SwiftUI view updates without explicit @Published or DispatchQueue.main.async calls — a Swift
6 pattern that eliminates the entire class of “updating UI from background thread” crashes.
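The shape of that pattern, reduced to its essentials (Counter and CounterView are illustrative names, not part of the app):

```swift
import Observation
import SwiftUI

// Minimal sketch: @Observable + @MainActor means every property write happens on
// the main actor and automatically invalidates any SwiftUI view that read it.
@Observable @MainActor
final class Counter {
    private(set) var value = 0
    func increment() { value += 1 }
}

struct CounterView: View {
    @State private var counter = Counter()
    var body: some View {
        // Reading counter.value here registers this view as an observer.
        Button("Taps: \(counter.value)") { counter.increment() }
    }
}
```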
AVPlayer is AVFoundation’s general-purpose media
player. It wraps an AVPlayerItem, which
represents a single piece of media to play — in our case, a podcast audio stream. AVPlayer handles buffering, HTTP
Live Streaming, network interruptions, and audio route changes (like plugging in headphones mid-episode) automatically.
Your job is to configure the session correctly, set a rate, and observe state changes.
Apple Docs:
AVPlayer — AVFoundation
AVPlayerItem — AVFoundation
Create Audio/AudioPlayerManager.swift:
import AVFoundation
import Combine // for the KVO publisher used in observeStatus(of:)
import Foundation
import Observation
@Observable
@MainActor
final class AudioPlayerManager {
// MARK: - Published State
private(set) var currentEpisode: PodcastEpisode?
private(set) var isPlaying = false
private(set) var currentTime: TimeInterval = 0
private(set) var duration: TimeInterval = 0
private(set) var playbackRate: Float = 1.0
// MARK: - Private
private var player: AVPlayer?
private var timeObserverToken: Any?
private var statusObservationTask: Task<Void, Never>?
// MARK: - Playback Control
func play(episode: PodcastEpisode) {
// If tapping the already-playing episode, just resume
if currentEpisode?.id == episode.id {
resume()
return
}
stopObservers()
let item = AVPlayerItem(url: episode.audioURL)
player = AVPlayer(playerItem: item)
currentEpisode = episode
currentTime = 0
duration = episode.duration // Use RSS duration as initial estimate
observeStatus(of: item)
startTimeObserver()
player?.rate = playbackRate
isPlaying = true
}
func resume() {
player?.rate = playbackRate
isPlaying = true
}
func pause() {
player?.pause()
isPlaying = false
}
func togglePlayPause() {
isPlaying ? pause() : resume()
}
func seek(to time: TimeInterval) {
let cmTime = CMTime(seconds: time, preferredTimescale: 600)
player?.seek(to: cmTime, toleranceBefore: .zero, toleranceAfter: .zero)
currentTime = time
}
func skip(by seconds: TimeInterval) {
let target = min(max(currentTime + seconds, 0), duration)
seek(to: target)
}
func setRate(_ rate: Float) {
playbackRate = rate
if isPlaying {
player?.rate = rate
}
}
// MARK: - Observers
private func startTimeObserver() {
let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
timeObserverToken = player?.addPeriodicTimeObserver(
forInterval: interval,
queue: .main
) { [weak self] time in
guard let self else { return }
self.currentTime = time.seconds
}
}
private func observeStatus(of item: AVPlayerItem) {
statusObservationTask = Task { [weak self] in
for await status in item.publisher(for: \.status).values {
guard let self else { return }
if status == .readyToPlay,
item.duration.isNumeric {
self.duration = item.duration.seconds
}
}
}
}
private func stopObservers() {
if let token = timeObserverToken {
player?.removeTimeObserver(token)
timeObserverToken = nil
}
statusObservationTask?.cancel()
statusObservationTask = nil
}
}
A few design decisions worth noting:

- player?.rate = playbackRate instead of player?.play() — setting rate directly respects the user’s chosen speed from the first frame. Calling play() always starts at 1x, requiring a second call to restore the rate.
- Tolerance zero on seek — passing .zero tolerances gives frame-accurate seeking at the cost of a slightly longer seek operation. For audio this is imperceptible and provides a better scrubber feel.
- AVPlayerItem.publisher(for: \.status) — this Combine-bridged KVO publisher is the modern way to observe item readiness in Swift 6 without NSKeyValueObservation retain cycles.
Checkpoint: Add @State private var manager = AudioPlayerManager() to your ContentView, call manager.play(episode: someEpisode) in .onAppear, and confirm you hear audio start playing when the app launches. Use one of the mock episode URLs for now.
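One possible shape for that throwaway check (the episode values below are placeholders lifted from the mock feed; delete this view once Step 3 is in place):

```swift
import SwiftUI

struct ContentView: View {
    @State private var manager = AudioPlayerManager()

    var body: some View {
        Text("Audio smoke test")
            .onAppear {
                // Placeholder episode using one of the mock feed's URLs.
                let episode = PodcastEpisode(
                    title: "Finding Nemo: The Deep Ocean Adventure",
                    description: "Smoke test",
                    audioURL: URL(string: "https://example.com/audio/finding-nemo-ep1.mp3")!,
                    duration: 3754,
                    pubDate: .now,
                    artworkURL: nil
                )
                manager.play(episode: episode)
            }
    }
}
```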
Step 3: Building the Episode List View
The episode list is the home screen — a NavigationStack containing a scrollable list of episodes. Each row shows the
episode artwork, title, and formatted duration. The currently playing episode gets a visual highlight so listeners
always know what’s active.
Create Views/EpisodeListView.swift:
import SwiftUI
struct EpisodeListView: View {
let episodes: [PodcastEpisode]
let currentEpisode: PodcastEpisode?
let onSelect: (PodcastEpisode) -> Void
var body: some View {
List(episodes) { episode in
EpisodeRowView(
episode: episode,
isPlaying: currentEpisode?.id == episode.id
)
.contentShape(Rectangle())
.onTapGesture {
onSelect(episode)
}
.listRowBackground(
currentEpisode?.id == episode.id
? Color.accentColor.opacity(0.1)
: Color.clear
)
}
.listStyle(.plain)
.navigationTitle("Pixar Story Time")
}
}
The row view handles layout and the “now playing” indicator independently:
struct EpisodeRowView: View {
let episode: PodcastEpisode
let isPlaying: Bool
var body: some View {
HStack(spacing: 12) {
// Artwork thumbnail
AsyncImage(url: episode.artworkURL) { image in
image
.resizable()
.aspectRatio(contentMode: .fill)
} placeholder: {
RoundedRectangle(cornerRadius: 8)
.fill(Color.secondary.opacity(0.2))
.overlay {
Image(systemName: "film.fill")
.foregroundStyle(.secondary)
}
}
.frame(width: 64, height: 64)
.clipShape(RoundedRectangle(cornerRadius: 8))
VStack(alignment: .leading, spacing: 4) {
Text(episode.title)
.font(.body)
.fontWeight(isPlaying ? .semibold : .regular)
.lineLimit(2)
HStack(spacing: 6) {
if isPlaying {
// Animated waveform indicator for the playing episode
Image(systemName: "waveform")
.font(.caption)
.foregroundStyle(.tint)
.symbolEffect(.variableColor.iterative)
}
Text(episode.formattedDuration)
.font(.caption)
.foregroundStyle(.secondary)
}
}
Spacer()
}
.padding(.vertical, 4)
}
}
AsyncImage loads remote images using the shared URLSession, so responses are cached only according to standard HTTP cache semantics — fine for a tutorial, though production apps often add a dedicated image cache. The placeholder uses an SF Symbol so the layout
doesn’t jump when artwork loads. The symbolEffect(.variableColor.iterative) on the waveform icon produces an animated
equalizer effect on the currently playing row — a small detail that makes the UI feel polished.
Wiring It Into the App
Update PixarStoryTimeApp.swift to own the manager at the environment level:
import SwiftUI
@main
struct PixarStoryTimeApp: App {
@State private var audioManager = AudioPlayerManager()
var body: some Scene {
WindowGroup {
RootView()
.environment(audioManager)
}
}
}
Create Views/RootView.swift to hold the navigation stack and load episodes:
import SwiftUI
struct RootView: View {
@Environment(AudioPlayerManager.self) private var audioManager
@State private var episodes: [PodcastEpisode] = []
@State private var loadError: Error?
var body: some View {
NavigationStack {
EpisodeListView(
episodes: episodes,
currentEpisode: audioManager.currentEpisode,
onSelect: { episode in
audioManager.play(episode: episode)
}
)
}
.task {
loadEpisodes()
}
}
private func loadEpisodes() {
// Parse the mock RSS feed for development
guard let data = mockRSSXML.data(using: .utf8) else { return }
let parser = RSSParser()
episodes = parser.parse(data: data)
}
}
Note: @Environment(AudioPlayerManager.self) works because AudioPlayerManager is @Observable. No ObservableObject or @EnvironmentObject needed — this is the Swift 5.9+ observation pattern.
Build and run to confirm the list is wired up correctly.
Checkpoint: You should see the Pixar Story Time navigation title and three episode rows — each showing a placeholder image (the real artwork URLs are mock), a title, and a formatted duration. Tap “Finding Nemo: The Deep Ocean Adventure” — it should highlight with the animated waveform indicator. If rows are missing, check that loadEpisodes() is called from .task in RootView and that mockRSSXML is visible in scope.
Step 4: Building the Player Controls
The player view is a bottom sheet that expands from the mini-player. It contains the artwork, title, a scrubber, time labels, and transport controls.
Create Views/PlayerView.swift:
import SwiftUI
struct PlayerView: View {
@Environment(AudioPlayerManager.self) private var audioManager
@Binding var isPresented: Bool
var body: some View {
VStack(spacing: 24) {
// Drag indicator
Capsule()
.fill(Color.secondary.opacity(0.4))
.frame(width: 36, height: 4)
.padding(.top, 8)
// Artwork
AsyncImage(url: audioManager.currentEpisode?.artworkURL) { image in
image
.resizable()
.aspectRatio(contentMode: .fill)
} placeholder: {
RoundedRectangle(cornerRadius: 16)
.fill(Color.secondary.opacity(0.15))
.overlay {
Image(systemName: "film.fill")
.font(.system(size: 48))
.foregroundStyle(.secondary)
}
}
.frame(width: 260, height: 260)
.clipShape(RoundedRectangle(cornerRadius: 16))
.shadow(radius: 12)
// Title
VStack(spacing: 4) {
Text(audioManager.currentEpisode?.title ?? "No Episode Selected")
.font(.title3)
.fontWeight(.semibold)
.multilineTextAlignment(.center)
.lineLimit(2)
Text("Pixar Story Time")
.font(.subheadline)
.foregroundStyle(.secondary)
}
.padding(.horizontal, 24)
ScrubberView()
TransportControlsView()
Spacer()
}
.padding(.horizontal, 20)
}
}
Break the scrubber and controls into sub-views to keep each view focused:
struct ScrubberView: View {
@Environment(AudioPlayerManager.self) private var audioManager
@State private var isScrubbing = false
@State private var scrubTime: TimeInterval = 0
var displayTime: TimeInterval {
isScrubbing ? scrubTime : audioManager.currentTime
}
var body: some View {
VStack(spacing: 4) {
Slider(
value: Binding(
get: { displayTime },
set: { newValue in
isScrubbing = true
scrubTime = newValue
}
),
in: 0...max(audioManager.duration, 1)
)
.simultaneousGesture(
DragGesture(minimumDistance: 0)
.onEnded { _ in
audioManager.seek(to: scrubTime)
isScrubbing = false
}
)
HStack {
Text(formatTime(displayTime))
.font(.caption.monospacedDigit())
.foregroundStyle(.secondary)
Spacer()
Text("-\(formatTime(max(audioManager.duration - displayTime, 0)))")
.font(.caption.monospacedDigit())
.foregroundStyle(.secondary)
}
}
.padding(.horizontal, 4)
}
private func formatTime(_ seconds: TimeInterval) -> String {
let total = Int(max(seconds, 0))
let h = total / 3600
let m = (total % 3600) / 60
let s = total % 60
if h > 0 {
return String(format: "%d:%02d:%02d", h, m, s)
}
return String(format: "%d:%02d", m, s)
}
}
The isScrubbing flag prevents the live currentTime from snapping the thumb back while the user is dragging — a
subtle but important UX detail.
struct TransportControlsView: View {
@Environment(AudioPlayerManager.self) private var audioManager
var body: some View {
HStack(spacing: 40) {
// Skip back 15 seconds
Button {
audioManager.skip(by: -15)
} label: {
Image(systemName: "gobackward.15")
.font(.title)
.foregroundStyle(.primary)
}
// Play / Pause
Button {
audioManager.togglePlayPause()
} label: {
Image(systemName: audioManager.isPlaying ? "pause.circle.fill" : "play.circle.fill")
.font(.system(size: 64))
.foregroundStyle(.tint)
.contentTransition(.symbolEffect(.replace))
}
// Skip forward 15 seconds
Button {
audioManager.skip(by: 15)
} label: {
Image(systemName: "goforward.15")
.font(.title)
.foregroundStyle(.primary)
}
}
}
}
The .contentTransition(.symbolEffect(.replace)) on the play/pause button animates the SF Symbol swap without any
additional withAnimation call — the transition fires automatically when isPlaying changes.
Checkpoint: Present PlayerView as a sheet from EpisodeListView when the user taps an episode. Play an episode and verify the scrubber advances, the time labels update, and the skip buttons jump 15 seconds forward and back in the audio.
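If you want a concrete starting point for that checkpoint, a sketch like this works (PlayerSheetHost and its isPlayerPresented state are temporary assumptions; Step 8 moves this state into RootView properly):

```swift
import SwiftUI

// Temporary wiring: present PlayerView as a sheet. Assumes AudioPlayerManager
// is already in the environment, as set up in Step 3.
struct PlayerSheetHost: View {
    @State private var isPlayerPresented = false

    var body: some View {
        Button("Open Player") { isPlayerPresented = true }
            .sheet(isPresented: $isPlayerPresented) {
                PlayerView(isPresented: $isPlayerPresented)
            }
    }
}
```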
Step 5: Configuring the Background Audio Session
Without a properly configured AVAudioSession, iOS
pauses your audio the moment the app enters the background. iOS uses audio sessions to arbitrate between competing audio
requests from different apps — music players, video players, games, and voice assistants all declare what they need, and
the system decides who gets priority. The .playback category tells the system this app is a first-class audio citizen
— it should keep playing when the screen locks, it takes exclusive audio focus (without .mixWithOthers, other apps’ audio is interrupted rather than ducked), and playback continues across route changes like plugging in headphones.
Think of it like WALL-E’s solar power panel: without explicitly opening it, WALL-E stays dormant no matter how much
sunlight there is. Without .playback, iOS keeps your audio closed.
Apple Docs:
AVAudioSession — AVFAudio
AVAudioSession.Category — AVFAudio
Add audio session setup to AudioPlayerManager. Insert this method and call it from init:
// Add inside AudioPlayerManager
init() {
configureAudioSession()
}
private func configureAudioSession() {
do {
try AVAudioSession.sharedInstance().setCategory(
.playback,
mode: .default,
options: [] // No mixing — take over audio focus like a podcast app
)
try AVAudioSession.sharedInstance().setActive(true)
} catch {
print("AudioPlayerManager: Failed to configure audio session: \(error)")
}
}
The .playback category with no options means your app will interrupt music from other apps (like Spotify) when
playback begins, which is exactly the behavior users expect from a podcast player. If you wanted to mix with background
music, you would add .mixWithOthers to the options set — but podcast apps typically should not do this.
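For contrast, here is a sketch of the mixing variant, which you would only use for something like an ambient-sound app, not this player:

```swift
import AVFoundation

// Alternative configuration (not used in PixarStoryTime): audio mixes under
// other apps' playback instead of taking exclusive focus.
func configureMixingAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
    try session.setActive(true)
}
```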
Warning: AVAudioSession configuration must happen before any AVPlayer starts playing. If you call setCategory after playback begins, the system may not apply the new category until the next audio route change.
Testing Background Playback
- Build and run on a physical device (the Simulator does not support background audio).
- Start playing an episode.
- Press the Home button (or swipe up on Face ID devices).
- The audio should continue uninterrupted.
- Play an episode, then lock the screen — audio should keep playing.
If audio stops when you background the app, double-check that Audio, AirPlay, and Picture in Picture is checked
under Background Modes in Signing & Capabilities, and that the Info.plist contains audio in UIBackgroundModes.
Step 6: Now Playing Info and Remote Controls
Background audio playing silently with no lock screen UI feels broken to users — like watching a Pixar film with the screen blacked out. The MediaPlayer framework provides two complementary classes to fix this:
- MPNowPlayingInfoCenter populates the lock screen media card, Control Center widget, and CarPlay screen with the episode title, artist name, artwork, elapsed time, and duration. You write a dictionary of well-known keys and the system handles rendering.
- MPRemoteCommandCenter registers handlers for hardware control events — the play/pause button on AirPods, the skip buttons in CarPlay, the scrubber on the lock screen — so those physical interactions call into your AudioPlayerManager.
The two classes work together: MPNowPlayingInfoCenter tells the system what is playing, MPRemoteCommandCenter
tells the system what to do when the user interacts.
Apple Docs:
MPNowPlayingInfoCenter — MediaPlayer
MPRemoteCommandCenter — MediaPlayer
Add Now Playing support to AudioPlayerManager. First, add the import at the top of the file:
import MediaPlayer
Then add two new methods:
// Add inside AudioPlayerManager
func updateNowPlayingInfo() {
guard let episode = currentEpisode else {
MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
return
}
var info: [String: Any] = [
MPMediaItemPropertyTitle: episode.title,
MPMediaItemPropertyArtist: "Pixar Story Time",
MPNowPlayingInfoPropertyElapsedPlaybackTime: currentTime,
MPMediaItemPropertyPlaybackDuration: duration,
MPNowPlayingInfoPropertyPlaybackRate: isPlaying ? Double(playbackRate) : 0.0,
MPNowPlayingInfoPropertyDefaultPlaybackRate: 1.0,
]
// Load artwork asynchronously and update again once loaded
if let artworkURL = episode.artworkURL {
Task {
if let (data, _) = try? await URLSession.shared.data(from: artworkURL),
let uiImage = UIImage(data: data) {
let artwork = MPMediaItemArtwork(boundsSize: uiImage.size) { _ in uiImage }
var updatedInfo = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
updatedInfo[MPMediaItemPropertyArtwork] = artwork
MPNowPlayingInfoCenter.default().nowPlayingInfo = updatedInfo
}
}
}
MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
func setupRemoteCommands() {
let commandCenter = MPRemoteCommandCenter.shared()
commandCenter.playCommand.addTarget { [weak self] _ in
self?.resume()
self?.updateNowPlayingInfo()
return .success
}
commandCenter.pauseCommand.addTarget { [weak self] _ in
self?.pause()
self?.updateNowPlayingInfo()
return .success
}
commandCenter.skipForwardCommand.preferredIntervals = [15]
commandCenter.skipForwardCommand.addTarget { [weak self] event in
guard let self,
let skipEvent = event as? MPSkipIntervalCommandEvent else {
return .commandFailed
}
self.skip(by: skipEvent.interval)
self.updateNowPlayingInfo()
return .success
}
commandCenter.skipBackwardCommand.preferredIntervals = [15]
commandCenter.skipBackwardCommand.addTarget { [weak self] event in
guard let self,
let skipEvent = event as? MPSkipIntervalCommandEvent else {
return .commandFailed
}
self.skip(by: -skipEvent.interval)
self.updateNowPlayingInfo()
return .success
}
commandCenter.changePlaybackPositionCommand.addTarget { [weak self] event in
guard let self,
let positionEvent = event as? MPChangePlaybackPositionCommandEvent else {
return .commandFailed
}
self.seek(to: positionEvent.positionTime)
self.updateNowPlayingInfo()
return .success
}
}
Call both setup methods from init:
init() {
configureAudioSession()
setupRemoteCommands()
}
Call updateNowPlayingInfo() whenever playback state changes. Update play(), resume(), pause(), and seek(to:):
func play(episode: PodcastEpisode) {
// ... existing implementation ...
updateNowPlayingInfo() // ← Add at the end
}
func resume() {
player?.rate = playbackRate
isPlaying = true
updateNowPlayingInfo() // ← Add
}
func pause() {
player?.pause()
isPlaying = false
updateNowPlayingInfo() // ← Add
}
func seek(to time: TimeInterval) {
let cmTime = CMTime(seconds: time, preferredTimescale: 600)
player?.seek(to: cmTime, toleranceBefore: .zero, toleranceAfter: .zero)
currentTime = time
updateNowPlayingInfo() // ← Add
}
Also call updateNowPlayingInfo() from the periodic time observer so the elapsed time stays current on the lock screen:
private func startTimeObserver() {
let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
timeObserverToken = player?.addPeriodicTimeObserver(
forInterval: interval,
queue: .main
) { [weak self] time in
guard let self else { return }
self.currentTime = time.seconds
self.updateNowPlayingInfo() // ← Keep lock screen in sync
}
}
Checkpoint: Build and run on a real device. Start playing an episode, then lock the screen. You should see the episode title “Finding Nemo: The Deep Ocean Adventure,” the artist “Pixar Story Time,” and working play/pause, 15-second skip, and scrubber controls on the lock screen. Test the AirPods controls if available.
Step 7: Adding Playback Speed Control
Podcast listeners commonly speed up playback to 1.5x or 2x. AVPlayer.rate accepts fractional values — setting it to
1.5 plays audio at 1.5x speed while maintaining pitch (iOS applies pitch correction automatically via the player item’s
time-pitch algorithm).
Add a SpeedPickerView component. Create it inline inside PlayerView.swift:
struct SpeedPickerView: View {
@Environment(AudioPlayerManager.self) private var audioManager
private let speeds: [Float] = [0.5, 1.0, 1.5, 2.0]
var body: some View {
Menu {
ForEach(speeds, id: \.self) { speed in
Button {
audioManager.setRate(speed)
} label: {
HStack {
Text(formattedSpeed(speed))
if audioManager.playbackRate == speed {
Image(systemName: "checkmark")
}
}
}
}
} label: {
Text(formattedSpeed(audioManager.playbackRate))
.font(.callout.weight(.semibold))
.foregroundStyle(.tint)
.padding(.horizontal, 12)
.padding(.vertical, 6)
.background(Color.accentColor.opacity(0.1), in: Capsule())
}
}
private func formattedSpeed(_ speed: Float) -> String {
speed == 1.0 ? "1x" : String(format: "%.1fx", speed)
}
}
Add SpeedPickerView() to PlayerView’s VStack, between the TransportControlsView and the Spacer:
// Inside PlayerView body VStack
TransportControlsView()
SpeedPickerView() // ← Add this line
Spacer()
setRate(_:) doesn’t call updateNowPlayingInfo() itself, but the periodic time observer invokes it every half second, so the lock screen picks up the new rate almost immediately. For an instantaneous update, add an updateNowPlayingInfo() call at the end of setRate(_:). Either way, the lock screen scrubber advances at the correct rate because MPNowPlayingInfoPropertyPlaybackRate is set accurately on each updateNowPlayingInfo() call.
Tip: AVPlayer preserves audio pitch when you change the rate. This behavior comes from the time-pitch algorithm applied to the AVPlayerItem (exposed as AVPlayerItem.audioTimePitchAlgorithm). You don’t need to configure anything extra for speech content — it just works.
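If you ever want to control the algorithm explicitly, for example to trade CPU for quality, it lives on the player item. A sketch (the URL is a placeholder):

```swift
import AVFoundation

// Optional: explicitly choose a time-pitch algorithm. .spectral is highest
// quality; .timeDomain is cheaper and well-suited to speech.
let item = AVPlayerItem(url: URL(string: "https://example.com/audio/episode.mp3")!)
item.audioTimePitchAlgorithm = .spectral
```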
Now build and run to verify speed control end to end.
Checkpoint: Open the full player sheet and tap the speed menu. Select 1.5x — the episode should immediately speed up and the speed button should update to “1.5x.” Lock the screen and verify the lock screen scrubber advances at the faster rate. Select 1x to return to normal speed before moving on.
Step 8: The Mini-Player Overlay
The mini-player is a thin bar pinned to the bottom of every screen in the app. It shows the episode artwork thumbnail,
the title truncated to one line, and a play/pause button. Tapping it anywhere expands to the full PlayerView sheet.
Create Views/MiniPlayerView.swift:
import SwiftUI
struct MiniPlayerView: View {
    @Environment(AudioPlayerManager.self) private var audioManager
    @Binding var isPlayerPresented: Bool

    var body: some View {
        HStack(spacing: 12) {
            // Artwork thumbnail
            AsyncImage(url: audioManager.currentEpisode?.artworkURL) { image in
                image
                    .resizable()
                    .aspectRatio(contentMode: .fill)
            } placeholder: {
                RoundedRectangle(cornerRadius: 6)
                    .fill(Color.secondary.opacity(0.2))
            }
            .frame(width: 44, height: 44)
            .clipShape(RoundedRectangle(cornerRadius: 6))

            // Title
            Text(audioManager.currentEpisode?.title ?? "")
                .font(.subheadline)
                .fontWeight(.medium)
                .lineLimit(1)
                .frame(maxWidth: .infinity, alignment: .leading)

            // Play / Pause button
            Button {
                audioManager.togglePlayPause()
            } label: {
                Image(systemName: audioManager.isPlaying ? "pause.fill" : "play.fill")
                    .font(.title3)
                    .foregroundStyle(.primary)
                    .frame(width: 44, height: 44)
            }
            .buttonStyle(.plain)
        }
        .padding(.horizontal, 16)
        .padding(.vertical, 10)
        .background(.regularMaterial)
        .clipShape(RoundedRectangle(cornerRadius: 14))
        .shadow(color: .black.opacity(0.12), radius: 8, y: 4)
        .onTapGesture {
            isPlayerPresented = true
        }
        .padding(.horizontal, 12)
        .padding(.bottom, 8)
    }
}
.regularMaterial gives the mini-player a frosted glass appearance that adapts correctly to both light and dark mode.
Attaching the Overlay to the Navigation Root
Update RootView.swift to own the player presentation state and overlay the mini-player:
import SwiftUI
struct RootView: View {
    @Environment(AudioPlayerManager.self) private var audioManager
    @State private var episodes: [PodcastEpisode] = []
    @State private var isPlayerPresented = false // ← New

    var body: some View {
        NavigationStack {
            EpisodeListView(
                episodes: episodes,
                currentEpisode: audioManager.currentEpisode,
                onSelect: { episode in
                    audioManager.play(episode: episode)
                }
            )
        }
        // Mini-player overlay pinned to the bottom of the entire navigation hierarchy
        .overlay(alignment: .bottom) {
            if audioManager.currentEpisode != nil {
                MiniPlayerView(isPlayerPresented: $isPlayerPresented)
                    .transition(.move(edge: .bottom).combined(with: .opacity))
            }
        }
        .animation(.spring(duration: 0.35), value: audioManager.currentEpisode?.id)
        // Full player as a sheet
        .sheet(isPresented: $isPlayerPresented) {
            PlayerView(isPresented: $isPlayerPresented)
                .presentationDetents([.large])
                .presentationDragIndicator(.hidden) // We draw our own
        }
        .task {
            loadEpisodes()
        }
    }

    private func loadEpisodes() {
        guard let data = mockRSSXML.data(using: .utf8) else { return }
        let parser = RSSParser()
        episodes = parser.parse(data: data)
    }
}
The key architectural choice here is placing .overlay(alignment: .bottom) on the NavigationStack rather than inside
any individual view. This means the mini-player renders above all navigation destinations — if you push a detail view or
present another sheet, the mini-player persists. It only disappears when currentEpisode becomes nil, which is
animated with a spring transition.
A common mistake is placing the overlay inside EpisodeListView instead. That approach breaks the moment you push a
detail screen — SwiftUI re-renders the list view behind the navigation destination and the overlay disappears. Keeping
the overlay at the NavigationStack level means it lives in the view hierarchy above all navigation destinations, just
like how Pixar’s score keeps playing whether you’re watching Buzz or Woody on screen.
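This pattern relies on a single shared AudioPlayerManager injected into the environment at the app's root, so that the mini-player, the full player sheet, and the episode list all observe the same instance. A minimal sketch of what that entry point might look like — the app type name follows the project setup, but the exact wiring here is an assumption:

```swift
import SwiftUI

@main
struct PixarStoryTimeApp: App {
    // One shared player for the whole app. Every view that reads
    // @Environment(AudioPlayerManager.self) gets this instance.
    @State private var audioManager = AudioPlayerManager()

    var body: some Scene {
        WindowGroup {
            RootView()
                .environment(audioManager)
        }
    }
}
```

Because @Observable types track access at the property level, views re-render only when a property they actually read (such as currentEpisode or isPlaying) changes.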
Avoiding the Safe Area Overlap
The mini-player sits above the home indicator on newer iPhones. The NavigationStack’s content will scroll behind the mini-player and be obscured unless you account for it. The cleanest solution is a safeAreaInset modifier on the list:
// Inside EpisodeListView body, wrap the List:
List(episodes) { episode in
    // ... row content
}
.listStyle(.plain)
.navigationTitle("Pixar Story Time")
.safeAreaInset(edge: .bottom) {
    // Reserve space equal to the mini-player height, but only
    // while an episode is active
    if currentEpisode != nil {
        Color.clear.frame(height: 80)
    }
}
safeAreaInset insets the scrollable content without affecting the background, so the list rows are never obscured by the mini-player. Gating the spacer on currentEpisode matters: an unconditional 80-point inset would leave a dead gap at the bottom of the list before any episode starts playing. With the check in place, the inset appears and disappears together with the mini-player, and EpisodeListView already receives currentEpisode as a parameter from RootView.
Checkpoint: Build and run. Tap any episode in the list. The mini-player should animate up from the bottom. Tap a second episode — the mini-player updates without dismissing. Tap the mini-player to open the full player sheet. Dismiss the sheet — the mini-player remains. Navigate away and back — the mini-player persists the entire time.
Animating the Play/Pause Button
One final polish detail: the play/pause button in TransportControlsView uses
.contentTransition(.symbolEffect(.replace)), but the mini-player uses a different SF Symbol (pause.fill vs
pause.circle.fill). Make them consistent by adding the same content transition to MiniPlayerView:
// Inside MiniPlayerView, update the Button label:
Image(systemName: audioManager.isPlaying ? "pause.fill" : "play.fill")
.font(.title3)
.foregroundStyle(.primary)
.frame(width: 44, height: 44)
.contentTransition(.symbolEffect(.replace)) // ← Add
Both buttons now animate their SF Symbol transitions, making the UI feel cohesive.
Where to Go From Here?
Congratulations! You’ve built Pixar Story Time — a full-featured podcast player with streaming audio, a scrollable episode list, background audio that survives screen locks and Home button presses, lock screen Now Playing integration with fully working remote controls, playback speed selection, and a persistent mini-player overlay that lives above your entire navigation hierarchy.
Here’s what you learned:
- Parsing podcast RSS feeds with XMLParser’s SAX-style delegate callbacks
- Centralizing all audio state in a @Observable @MainActor class backed by AVPlayer
- Configuring AVAudioSession with the .playback category to enable background audio
- Populating the lock screen media widget with MPNowPlayingInfoCenter and keeping it in sync with a periodic time observer
- Registering play, pause, skip, and scrubber commands with MPRemoteCommandCenter for AirPods and CarPlay support
- Controlling playback speed through AVPlayer.rate with automatic pitch correction
- Building a persistent mini-player overlay using .overlay(alignment: .bottom) on a NavigationStack
Ideas for extending this project:
- Offline downloads — Use URLSession.downloadTask to save episodes to the app’s Caches directory and play from local files when offline. Track download progress with URLSessionDownloadDelegate.
- Sleep timer — Add a Task.sleep countdown that calls pause() after a user-specified interval. Present the remaining time in the mini-player.
- CarPlay integration — Adopt the CPNowPlayingTemplate from the CarPlay framework to give Pixar Story Time a first-class in-car experience.
- Chapter markers — Parse <psc:chapter> elements from the RSS feed and display a chapter list in the player view. Jump to a chapter with seek(to:).
- Playback history — Use SwiftData to persist lastPlayedTime per episode so listeners can resume exactly where they left off.
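As a taste of the sleep-timer idea, here is a minimal sketch. It assumes you add a stored property var sleepTimerTask: Task<Void, Never>? to AudioPlayerManager (extensions cannot add stored properties), and that pause() exists from Step 2:

```swift
extension AudioPlayerManager {
    /// Pause playback after `minutes`. Assumes the manager declares
    /// `var sleepTimerTask: Task<Void, Never>?` as a stored property.
    func startSleepTimer(minutes: Int) {
        // Restarting the timer cancels any previous countdown.
        sleepTimerTask?.cancel()
        sleepTimerTask = Task { [weak self] in
            try? await Task.sleep(for: .seconds(minutes * 60))
            // A cancelled sleep returns early; don't pause in that case.
            guard !Task.isCancelled else { return }
            self?.pause()
        }
    }

    func cancelSleepTimer() {
        sleepTimerTask?.cancel()
        sleepTimerTask = nil
    }
}
```

Surfacing the remaining time in the mini-player is then a matter of storing the timer's end date and rendering it with a periodically updating Text view.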
Related Posts
- The Observation Framework — Understand how @Observable and @Environment work under the hood, and when to reach for them over ObservableObject.
- SwiftUI Animations — Go deeper on symbolEffect, contentTransition, spring animations, and the new Animation API used throughout this tutorial.