Build a Photo Gallery App: Camera Capture, PhotosPicker, and Grid Display
Every photo Boo ever took of Kitty deserved a proper home. By the end of this tutorial, you’ll have built one — a full-featured photo gallery app where users can shoot new photos with the camera, pull existing ones from their library, browse them in a smooth adaptive grid, zoom in with a pinch, and even apply nostalgic film filters. All metadata persists across launches thanks to SwiftData.
In this tutorial, you’ll build Boo’s Monster World Photo Album — an Instagram-style gallery app inspired by
Monsters, Inc. Along the way, you’ll learn how to integrate
PhotosPicker, wrap a UIImagePickerController for
live camera capture, build a LazyVGrid thumbnail browser, implement pinch-to-zoom with
MagnifyGesture, and apply
CIFilter effects.
Prerequisites
- Xcode 16+ with an iOS 18 deployment target
- Familiarity with SwiftUI images and media
- Familiarity with grids and lazy layouts
- Familiarity with SwiftData fundamentals
- A physical device is strongly recommended for camera testing (the Simulator supports PhotosPicker but not the camera)
Contents
- Getting Started
- Step 1: Defining the Photo Data Model
- Step 2: Integrating PhotosPicker
- Step 3: Building the Photo Grid
- Step 4: Adding Camera Capture
- Step 5: Building the Full-Screen Photo Viewer
- Step 6: Applying Core Image Filters
- Step 7: Editing Photo Details and Captions
- Where to Go From Here?
Getting Started
Open Xcode and create a new project using the App template.
- Set the product name to BooPhotoAlbum.
- Ensure the interface is SwiftUI and the language is Swift.
- Set the deployment target to iOS 18.0.
- No extra package setup is needed; SwiftData ships with the iOS SDK.
Configuring Info.plist Permissions
The app needs two privacy descriptions before iOS will present its permission dialogs. Without them, the app will crash the moment you request camera or photo library access.
Open Info.plist (or the target’s Info tab in Xcode’s project settings) and add these two keys:
| Key | Value |
|---|---|
| NSCameraUsageDescription | Boo wants to photograph all the monsters she finds! |
| NSPhotoLibraryUsageDescription | Boo needs access to her photo album. |
If you’re editing the raw Info.plist XML directly, the entries look like this:
<key>NSCameraUsageDescription</key>
<string>Boo wants to photograph all the monsters she finds!</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Boo needs access to her photo album.</string>
Note: Without these keys, iOS terminates the app the moment your code tries to access the camera or photo library. App Review also rejects apps that request permissions without a usage description.
Project Structure
Create the following Swift files in your project. You’ll fill each one in over the course of the tutorial:
BooPhotoAlbum/
├── BooPhotoAlbumApp.swift (already exists)
├── ContentView.swift (already exists)
├── Models/
│ └── GalleryPhoto.swift
├── Views/
│ ├── PhotoGridView.swift
│ ├── PhotoDetailView.swift
│ ├── CameraPickerView.swift
│ └── FilterPickerView.swift
└── Helpers/
└── ImageFilterService.swift
Step 1: Defining the Photo Data Model
Every photo in Boo’s album needs a persistent home. SwiftData gives us a type-safe, lightweight persistence layer that replaces the boilerplate of Core Data while feeling natural in SwiftUI.
Create Models/GalleryPhoto.swift and define the model:
import Foundation
import SwiftData
@Model
final class GalleryPhoto {
var id: UUID
var imageData: Data // Raw JPEG bytes for the (possibly filtered) photo
var thumbnailData: Data // Downscaled JPEG for grid performance
var caption: String
var dateAdded: Date
var appliedFilter: PhotoFilter
init(
imageData: Data,
thumbnailData: Data,
caption: String = "",
dateAdded: Date = .now,
appliedFilter: PhotoFilter = .none
) {
self.id = UUID()
self.imageData = imageData
self.thumbnailData = thumbnailData
self.caption = caption
self.dateAdded = dateAdded
self.appliedFilter = appliedFilter
}
}
We store both a full-resolution imageData blob and a downscaled thumbnailData blob. That separation is important:
the grid shows thumbnails so it can load dozens of cells without materializing full-resolution images in memory all at
once. The full image is only decoded when the user opens the detail view.
Now define the PhotoFilter enum in the same file. Because it’s a raw-value enum backed by String, SwiftData can
persist it as a plain string column without any custom coding:
enum PhotoFilter: String, CaseIterable, Codable {
case none = "None"
case sepia = "Sepia"
case noir = "Noir"
case vivid = "Vivid"
var displayName: String { rawValue }
}
Finally, wire SwiftData into the app entry point. Open BooPhotoAlbumApp.swift and update it:
import SwiftUI
import SwiftData
@main
struct BooPhotoAlbumApp: App {
var body: some Scene {
WindowGroup {
ContentView()
}
.modelContainer(for: GalleryPhoto.self)
}
}
The .modelContainer(for:) modifier creates the SQLite backing store on first launch and injects a ModelContext into
the SwiftUI environment, making it available to every view in the hierarchy.
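The same modifier also supports an in-memory store, which is convenient for SwiftUI previews so sample data never touches disk. A minimal sketch using the standard inMemory: parameter:

```swift
import SwiftUI
import SwiftData

// Sketch: an in-memory container for previews. Sample photos are
// discarded when the preview reloads instead of persisting to disk.
#Preview {
    ContentView()
        .modelContainer(for: GalleryPhoto.self, inMemory: true)
}
```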
Checkpoint: Build the project. There’s nothing to see yet, but it should compile without errors. If you see a SwiftData import error, make sure your deployment target is iOS 17 or later and that SwiftData is linked in the target’s frameworks.
Step 2: Integrating PhotosPicker
PhotosPicker is a SwiftUI view that presents the
system photo library picker — the same one used by Messages, Notes, and every other first-party app. It handles all
permissions UI automatically, which means you don’t need to call PHPhotoLibrary.requestAuthorization yourself when
using it.
Create Views/PhotoGridView.swift. This will be the main view of the app, and we’ll build it incrementally. Start with
the PhotosPicker integration:
import SwiftUI
import SwiftData
import PhotosUI
struct PhotoGridView: View {
@Environment(\.modelContext) private var modelContext
@Query(sort: \GalleryPhoto.dateAdded, order: .reverse) private var photos: [GalleryPhoto]
// PhotosPickerItem is the lightweight token returned by the picker.
// We hold an array because we allow multi-select.
@State private var selectedItems: [PhotosPickerItem] = []
var body: some View {
NavigationStack {
Text("Boo's Album") // Placeholder — replaced in Step 3
.toolbar {
ToolbarItem(placement: .topBarTrailing) {
PhotosPicker(
selection: $selectedItems,
maxSelectionCount: 10,
matching: .images
) {
Label("Add Photos", systemImage: "photo.badge.plus")
}
.onChange(of: selectedItems) { _, newItems in
Task { await importPhotos(newItems) }
}
}
}
}
}
}
PhotosPickerItem is a lightweight token — it
doesn’t carry image data itself. To get the raw bytes, you call loadTransferable(type:), which performs the actual
asset fetch asynchronously. That’s why we dispatch into a Task inside onChange.
Add the import function below the body:
extension PhotoGridView {
private func importPhotos(_ items: [PhotosPickerItem]) async {
for item in items {
// loadTransferable fetches the asset data from the photo library.
// Requesting Data gives us raw JPEG/HEIC bytes with no format conversion.
guard let data = try? await item.loadTransferable(type: Data.self) else { continue }
guard let uiImage = UIImage(data: data) else { continue }
let thumbnail = makeThumbnail(from: uiImage, size: CGSize(width: 240, height: 240))
guard let thumbData = thumbnail.jpegData(compressionQuality: 0.7) else { continue }
guard let fullData = uiImage.jpegData(compressionQuality: 0.9) else { continue }
let photo = GalleryPhoto(imageData: fullData, thumbnailData: thumbData)
modelContext.insert(photo)
}
// Clear the selection so the user can pick again immediately.
selectedItems = []
}
private func makeThumbnail(from image: UIImage, size: CGSize) -> UIImage {
    // Scale to fill the target size while preserving aspect ratio,
    // then center the result so the overflow crops evenly on each side.
    let scale = max(size.width / image.size.width,
                    size.height / image.size.height)
    let scaledSize = CGSize(width: image.size.width * scale,
                            height: image.size.height * scale)
    let origin = CGPoint(x: (size.width - scaledSize.width) / 2,
                         y: (size.height - scaledSize.height) / 2)
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: origin, size: scaledSize))
    }
}
}
The makeThumbnail(from:size:) helper uses UIGraphicsImageRenderer to downscale the full image synchronously. For a
production app you’d want to run this on a background actor to avoid blocking the main thread when importing many photos
at once — we’ll keep it simple here to focus on the Photos integration itself.
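If you do want to move the work off the main thread, UIKit ships an async thumbnailing call, byPreparingThumbnail(ofSize:) (iOS 15+), that decodes and downscales away from the caller. A sketch of a drop-in alternative (note its scaling behavior differs slightly from the renderer-based helper):

```swift
import UIKit

// Sketch: asynchronous thumbnail generation. The decode and downscale
// run off the calling thread, so importing many photos won't stall the UI.
func makeThumbnailAsync(from image: UIImage) async -> UIImage? {
    await image.byPreparingThumbnail(ofSize: CGSize(width: 240, height: 240))
}
```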
Now update ContentView.swift to show PhotoGridView:
import SwiftUI
struct ContentView: View {
var body: some View {
PhotoGridView()
}
}
Checkpoint: Build and run on device or Simulator. Tap the Add Photos button in the top-right corner. The system photo library picker should appear. Select a few images, dismiss the picker, and confirm the app doesn’t crash. The
photos array in PhotoGridView is now being populated; we just haven’t rendered the grid yet.
Step 3: Building the Photo Grid
A flat List would work, but a LazyVGrid with an adaptive column layout gives us a natural gallery feel: iOS
calculates how many columns fit the current screen width, which means it works correctly on iPhone, iPad, and in Split
View without any extra code.
Replace the Text("Boo's Album") placeholder in PhotoGridView with the full grid implementation, adding the columns property just above body:
// Inside PhotoGridView.body, replace the Text placeholder:
private let columns = [
GridItem(.adaptive(minimum: 120), spacing: 2)
]
var body: some View {
NavigationStack {
ScrollView {
LazyVGrid(columns: columns, spacing: 2) {
ForEach(photos) { photo in
NavigationLink(value: photo) {
ThumbnailCell(photo: photo)
}
}
}
.padding(.horizontal, 2)
}
.navigationTitle("Boo's Album")
.navigationDestination(for: GalleryPhoto.self) { photo in
PhotoDetailView(photo: photo)
}
.toolbar {
// ... toolbar items from Step 2
}
}
}
Now create a dedicated ThumbnailCell view that decodes the thumbnail data and draws it as a square cell:
struct ThumbnailCell: View {
let photo: GalleryPhoto
var body: some View {
if let uiImage = UIImage(data: photo.thumbnailData) {
Image(uiImage: uiImage)
.resizable()
.scaledToFill()
.frame(minWidth: 0, maxWidth: .infinity, minHeight: 120)
.clipped()
.contentShape(Rectangle())
} else {
Rectangle()
.fill(Color.secondary.opacity(0.2))
.frame(height: 120)
.overlay {
Image(systemName: "photo")
.foregroundStyle(.secondary)
}
}
}
}
The .scaledToFill() + .clipped() combination is the standard recipe for square image crops. scaledToFill scales
the image up until both dimensions meet or exceed the frame, and clipped cuts off the overflow. Without clipped, the
image would bleed outside its cell and overlap neighbors.
Note: LazyVGrid with .adaptive(minimum: 120) creates cells that are at least 120 pt wide. On a 390 pt iPhone, you’ll get 3 columns (3 × 120 = 360 pt plus two 2 pt gutters leaves no room for a fourth column). On an iPad, you’ll automatically get more columns, with no breakpoints needed.

Checkpoint: The project won’t compile yet, because navigationDestination references PhotoDetailView, which you’ll build in Step 5. Add a minimal placeholder PhotoDetailView for now, or temporarily swap the NavigationLink for a bare ThumbnailCell. Then build and run, import a few photos using the PhotosPicker button from Step 2, and you should see them arranged in a tight square-cell grid resembling Instagram’s profile view.
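A throwaway stand-in is enough to satisfy the compiler until Step 5. One possible sketch, to be deleted once you build the real detail view:

```swift
import SwiftUI

// Temporary placeholder so the NavigationLink destination compiles.
// Replaced by the full implementation in Step 5.
struct PhotoDetailView: View {
    let photo: GalleryPhoto

    var body: some View {
        if let uiImage = UIImage(data: photo.imageData) {
            Image(uiImage: uiImage)
                .resizable()
                .scaledToFit()
        }
    }
}
```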
Step 4: Adding Camera Capture
UIImagePickerController remains the
straightforward way to present the camera in a SwiftUI app. While AVFoundation gives you a fully custom camera UI,
UIImagePickerController covers the common case — a sheet that lets the user frame, shoot, and confirm a photo — in far
fewer lines. Wrap it in a UIViewControllerRepresentable to use it from SwiftUI.
Create Views/CameraPickerView.swift:
import SwiftUI
import UIKit
struct CameraPickerView: UIViewControllerRepresentable {
// Called with the captured UIImage when the user taps "Use Photo".
var onCapture: (UIImage) -> Void
@Environment(\.dismiss) private var dismiss
func makeUIViewController(context: Context) -> UIImagePickerController {
let picker = UIImagePickerController()
picker.sourceType = .camera // Use the live camera feed
picker.allowsEditing = false // Boo can edit in the app instead
picker.delegate = context.coordinator
return picker
}
func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}
func makeCoordinator() -> Coordinator {
Coordinator(parent: self)
}
final class Coordinator: NSObject,
UIImagePickerControllerDelegate,
UINavigationControllerDelegate {
let parent: CameraPickerView
init(parent: CameraPickerView) {
self.parent = parent
}
func imagePickerController(
_ picker: UIImagePickerController,
didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]
) {
if let image = info[.originalImage] as? UIImage {
parent.onCapture(image)
}
parent.dismiss()
}
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
parent.dismiss()
}
}
}
Now add a camera button to PhotoGridView’s toolbar and a sheet presentation to show it. Add these state properties to
PhotoGridView:
@State private var showingCamera = false
Then add a second ToolbarItem alongside the PhotosPicker button:
ToolbarItem(placement: .topBarLeading) {
Button {
showingCamera = true
} label: {
Label("Camera", systemImage: "camera")
}
// Disable the camera button when running in Simulator
.disabled(!UIImagePickerController.isSourceTypeAvailable(.camera))
}
Add the sheet modifier to the NavigationStack:
.sheet(isPresented: $showingCamera) {
CameraPickerView { capturedImage in
Task { await saveCapture(capturedImage) }
}
.ignoresSafeArea()
}
And add the save helper to the PhotoGridView extension:
private func saveCapture(_ image: UIImage) async {
let thumbnail = makeThumbnail(from: image, size: CGSize(width: 240, height: 240))
guard let thumbData = thumbnail.jpegData(compressionQuality: 0.7),
let fullData = image.jpegData(compressionQuality: 0.9) else { return }
let photo = GalleryPhoto(imageData: fullData, thumbnailData: thumbData)
modelContext.insert(photo)
}
Warning: UIImagePickerController with sourceType: .camera is not available in the Simulator. The .disabled(!UIImagePickerController.isSourceTypeAvailable(.camera)) modifier prevents a crash by graying out the button when no camera is available. Always test camera functionality on a physical device.

Checkpoint: Build and run on a physical device. Tap the camera button in the top-left corner. The system camera UI should appear after you grant permission. Take a photo, tap Use Photo, and confirm that the new photo appears in the grid. If capture fails on device, verify that the NSCameraUsageDescription key exists in Info.plist and that you’ve granted camera permission in Settings.
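When diagnosing permission trouble, it can help to log the current authorization state. A small sketch using AVFoundation (purely a debugging aid, not required by the tutorial):

```swift
import AVFoundation

// Sketch: print the camera authorization state to the console.
func logCameraAuthorization() {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .notDetermined: print("Camera permission not requested yet")
    case .denied:        print("Camera denied; enable it in Settings")
    case .restricted:    print("Camera restricted by device policy")
    case .authorized:    print("Camera authorized")
    @unknown default:    print("Unknown authorization state")
    }
}
```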
Step 5: Building the Full-Screen Photo Viewer
When Boo taps a photo, she should be able to see it at full resolution and swipe to browse neighboring photos. A
TabView with .tabViewStyle(.page) handles the swipe-between-photos mechanic with essentially no additional code,
because TabView provides horizontal paging natively.
Create Views/PhotoDetailView.swift:
import SwiftUI
import SwiftData
struct PhotoDetailView: View {
let photo: GalleryPhoto
@Environment(\.modelContext) private var modelContext
@Query(sort: \GalleryPhoto.dateAdded, order: .reverse) private var allPhotos: [GalleryPhoto]
@State private var currentIndex: Int = 0
@State private var showingEditSheet = false
var body: some View {
TabView(selection: $currentIndex) {
ForEach(Array(allPhotos.enumerated()), id: \.offset) { index, p in
ZoomablePhoto(photo: p)
.tag(index)
}
}
.tabViewStyle(.page(indexDisplayMode: .never))
.background(Color.black)
.ignoresSafeArea()
.navigationBarTitleDisplayMode(.inline)
.toolbar {
ToolbarItem(placement: .topBarTrailing) {
Button {
showingEditSheet = true
} label: {
Image(systemName: "pencil")
}
}
}
.sheet(isPresented: $showingEditSheet) {
if let current = allPhotos[safe: currentIndex] {
PhotoEditSheet(photo: current)
}
}
.onAppear {
// Scroll TabView to the tapped photo on first appear.
if let index = allPhotos.firstIndex(where: { $0.id == photo.id }) {
currentIndex = index
}
}
}
}
The allPhotos @Query fetches all photos in the same sort order as the grid so the swipe sequence matches what the
user sees. onAppear sets currentIndex to the photo that was tapped, so the TabView opens on the right page.
Note: The [safe:] subscript is not part of the Swift standard library. Add this small extension to your project; it returns nil instead of crashing on out-of-bounds access:

extension Array {
    subscript(safe index: Index) -> Element? {
        indices.contains(index) ? self[index] : nil
    }
}
Now create the ZoomablePhoto subview that implements pinch-to-zoom using
MagnifyGesture:
struct ZoomablePhoto: View {
let photo: GalleryPhoto
@State private var scale: CGFloat = 1.0
@State private var lastScale: CGFloat = 1.0
@State private var offset: CGSize = .zero
@State private var lastOffset: CGSize = .zero
var body: some View {
if let uiImage = UIImage(data: photo.imageData) {
Image(uiImage: uiImage)
.resizable()
.scaledToFit()
.scaleEffect(scale)
.offset(offset)
.gesture(
MagnifyGesture()
.onChanged { value in
// value.magnification is the cumulative magnification factor.
// We multiply by lastScale so zooming in, releasing,
// then zooming in again doesn't reset the zoom level.
scale = lastScale * value.magnification
}
.onEnded { value in
// Clamp to a sensible range and snap back to 1× if below it.
scale = max(1.0, min(scale, 5.0))
lastScale = scale
if scale == 1.0 { offset = .zero; lastOffset = .zero }
}
)
.simultaneousGesture(
DragGesture()
.onChanged { value in
// Only allow panning when zoomed in.
guard scale > 1.0 else { return }
offset = CGSize(
width: lastOffset.width + value.translation.width,
height: lastOffset.height + value.translation.height
)
}
.onEnded { _ in
lastOffset = offset
}
)
.animation(.interactiveSpring(), value: scale)
}
}
}
The MagnifyGesture reports a cumulative scale factor during the gesture. Multiplying by lastScale (updated in
onEnded) lets us preserve zoom across multiple pinch gestures. The DragGesture is added simultaneously so the user
can pan while zoomed in — the guard scale > 1.0 condition means drag gestures at 1× are passed through to the
TabView for page swiping.
Checkpoint: Before building, note that the edit sheet references PhotoEditSheet, which you won’t create until Step 7; give it a bare placeholder (a view whose body is just Text("Edit") is enough) so the project compiles, and leave the pencil button alone for now. Then build and run. Tap any photo in the grid to open the detail view. Swipe left and right to browse the full album, just like Boo flipping through her photo book. Pinch to zoom into a photo, then drag to pan around it. When you release the pinch below 1× zoom, the photo should snap back to fit the screen.
Step 6: Applying Core Image Filters
Core Image is Apple’s GPU-accelerated image processing framework. We’ll expose three filters: Sepia (vintage warmth), Noir (high-contrast black-and-white), and Vivid (boosted saturation) — perfect for making Boo’s monster photos look extra dramatic.
The processing happens in a dedicated service to keep the view code clean. Create Helpers/ImageFilterService.swift:
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit
struct ImageFilterService {
private static let context = CIContext()
/// Applies a PhotoFilter to a UIImage and returns the filtered result.
/// Returns the original image unchanged if the filter is .none.
static func apply(_ filter: PhotoFilter, to image: UIImage) -> UIImage? {
guard filter != .none else { return image }
guard let ciImage = CIImage(image: image) else { return nil }
let filtered: CIImage?
switch filter {
case .sepia:
let f = CIFilter.sepiaTone()
f.inputImage = ciImage
f.intensity = 0.85
filtered = f.outputImage
case .noir:
// CIPhotoEffectNoir is a preset filter — no parameters to tune.
let f = CIFilter(name: "CIPhotoEffectNoir")
f?.setValue(ciImage, forKey: kCIInputImageKey)
filtered = f?.outputImage
case .vivid:
let f = CIFilter.vibrance()
f.inputImage = ciImage
f.amount = 1.0 // +1.0 is maximum vibrance boost
filtered = f.outputImage
case .none:
return image
}
guard let outputImage = filtered,
let cgImage = context.createCGImage(outputImage, from: outputImage.extent)
else { return nil }
return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}
}
We create a single shared CIContext as a static property. CIContext is expensive to initialise (it allocates a GPU pipeline), so creating one per filter call would tank performance. CIFilter.sepiaTone() and CIFilter.vibrance() use the type-safe filter builder API introduced in iOS 13; CIPhotoEffectNoir is created with the older string-based API here to show both styles, though a type-safe CIFilter.photoEffectNoir() builder exists as well.
Now create Views/FilterPickerView.swift — a bottom sheet that shows the three filter options and applies the selected
one:
import SwiftUI
struct FilterPickerView: View {
@Bindable var photo: GalleryPhoto
@Environment(\.dismiss) private var dismiss
var body: some View {
VStack(spacing: 20) {
Text("Choose a Filter")
.font(.headline)
.padding(.top)
HStack(spacing: 16) {
ForEach(PhotoFilter.allCases, id: \.self) { filter in
FilterThumbnail(filter: filter, photo: photo, onSelect: {
applyFilter(filter)
})
}
}
.padding(.horizontal)
Button("Done") { dismiss() }
.buttonStyle(.borderedProminent)
.padding(.bottom)
}
.presentationDetents([.height(220)])
}
private func applyFilter(_ filter: PhotoFilter) {
guard let originalImage = UIImage(data: photo.imageData),
let filtered = ImageFilterService.apply(filter, to: originalImage),
let newData = filtered.jpegData(compressionQuality: 0.9),
let newThumb = ImageFilterService.apply(
filter,
to: UIImage(data: photo.thumbnailData) ?? originalImage
),
let thumbData = newThumb.jpegData(compressionQuality: 0.7)
else { return }
photo.imageData = newData
photo.thumbnailData = thumbData
photo.appliedFilter = filter
}
}
struct FilterThumbnail: View {
let filter: PhotoFilter
let photo: GalleryPhoto
let onSelect: () -> Void
var previewImage: UIImage? {
guard let base = UIImage(data: photo.thumbnailData) else { return nil }
return ImageFilterService.apply(filter, to: base)
}
var body: some View {
VStack(spacing: 6) {
if let img = previewImage {
Image(uiImage: img)
.resizable()
.scaledToFill()
.frame(width: 72, height: 72)
.clipped()
.cornerRadius(8)
.overlay(
RoundedRectangle(cornerRadius: 8)
.stroke(
photo.appliedFilter == filter ? Color.blue : Color.clear,
lineWidth: 2
)
)
}
Text(filter.displayName)
.font(.caption)
.foregroundStyle(.secondary)
}
.onTapGesture { onSelect() }
}
}
Add a filter button to PhotoDetailView’s toolbar and present the FilterPickerView as a sheet:
// Add to PhotoDetailView state:
@State private var showingFilters = false
// Add to toolbar:
ToolbarItem(placement: .bottomBar) {
Button {
showingFilters = true
} label: {
Label("Filters", systemImage: "wand.and.stars")
}
}
// Add sheet modifier:
.sheet(isPresented: $showingFilters) {
if let current = allPhotos[safe: currentIndex] {
FilterPickerView(photo: current)
}
}
Warning: The applyFilter function in FilterPickerView replaces photo.imageData with the filtered version. This is a destructive operation: the original is lost. In a production app, you’d want to store the original separately and apply the filter non-destructively at render time. For this tutorial, we keep it simple to focus on the Core Image integration.

Checkpoint: Build and run. Open a photo in the detail view and tap the wand button in the bottom toolbar. A filter picker sheet should rise from the bottom. Tap Sepia and watch the preview thumbnails update in real time. Tap Done to confirm, then check that the grid thumbnail also reflects the filter. Boo’s monster photos now have that classic scrapbook aesthetic.
Step 7: Editing Photo Details and Captions
The last piece is a details sheet where Boo can add a caption to each photo — important context like “Kitty being scary (not really)” or “The Scare Floor, first day”.
Create Views/PhotoEditSheet.swift (or add this view to an existing file). Since photo is a SwiftData @Model
object, use @Bindable to get two-way bindings to its properties:
import SwiftUI
import SwiftData
struct PhotoEditSheet: View {
@Bindable var photo: GalleryPhoto
@Environment(\.dismiss) private var dismiss
@Environment(\.modelContext) private var modelContext
var body: some View {
NavigationStack {
Form {
Section("Caption") {
TextField("Add a caption…", text: $photo.caption, axis: .vertical)
.lineLimit(3...6)
}
Section("Details") {
LabeledContent("Added") {
Text(photo.dateAdded.formatted(date: .abbreviated, time: .shortened))
.foregroundStyle(.secondary)
}
LabeledContent("Filter") {
Text(photo.appliedFilter.displayName)
.foregroundStyle(.secondary)
}
}
Section {
Button("Delete Photo", role: .destructive) {
modelContext.delete(photo)
dismiss()
}
}
}
.navigationTitle("Photo Details")
.navigationBarTitleDisplayMode(.inline)
.toolbar {
ToolbarItem(placement: .topBarTrailing) {
Button("Done") { dismiss() }
}
}
}
}
}
@Bindable creates two-way bindings to the properties of an Observable object, and SwiftData @Model classes are Observable. When you write to $photo.caption, SwiftData marks the context as dirty and persists the change the next time the context autosaves; autosaving is enabled by default on the context that .modelContainer(for:) provides, so you don’t need to call modelContext.save() manually.
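If you ever need the change on disk immediately rather than at the next autosave, you can still trigger a save explicitly; a sketch:

```swift
// Sketch: forcing an immediate save instead of waiting for autosave.
do {
    try modelContext.save()
} catch {
    print("Failed to save photo edits: \(error)")
}
```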
The Delete Photo button calls modelContext.delete(photo), which removes the record from the persistent store. The dismiss() call closes the sheet, and the @Query in PhotoDetailView refreshes so the TabView no longer shows the deleted photo.
Checkpoint: Build and run. Tap a photo to open the detail view, then tap the pencil button. The edit sheet should present with a text field for the caption and a read-only details section showing the date and applied filter. Type a caption, tap Done, and then open the sheet again to verify the caption persisted. Kill the app and relaunch — the caption should still be there, saved by SwiftData. Boo approves.
Where to Go From Here?
Congratulations! You’ve built Boo’s Monster World Photo Album — a fully functional photo gallery app with library picking, live camera capture, an adaptive grid, pinch-to-zoom, Core Image filters, and SwiftData persistence.
Here’s what you learned:
- How to use PhotosPicker and PhotosPickerItem.loadTransferable(type:) to import photos from the library
- How to generate thumbnails with UIGraphicsImageRenderer for grid performance
- How to wrap UIImagePickerController in UIViewControllerRepresentable for camera access
- How to build an adaptive LazyVGrid photo gallery
- How to combine MagnifyGesture and DragGesture for pinch-to-zoom with panning
- How to apply CIFilter effects using a shared CIContext
- How to persist photo data and captions with SwiftData’s @Model and @Bindable
Ideas for extending this project:
- iCloud Photos sync: Add the CloudKit capability and configure your ModelContainer with a ModelConfiguration that enables iCloud syncing so Boo’s album follows her across devices.
- Face detection with Vision: Use VNDetectFaceRectanglesRequest to find faces in photos and automatically tag them, perfect for identifying which monster showed up in which shot.
- Photo albums and collections: Add an Album model with a relationship to GalleryPhoto and build a collection view so Boo can organise her Monsters, Inc. trip photos separately from her Monsters University reunion shots.
- Non-destructive filter pipeline: Store the original image data alongside the filtered version so the user can revert to the original at any time, similar to how Photos.app handles edits.
- Sharing: Add a share button that wraps the current photo in a ShareLink and presents the system share sheet.
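As a taste of that last idea, here is a sketch of a share button for a single photo. SwiftUI’s Image is Transferable, so ShareLink can hand it to the system share sheet directly:

```swift
import SwiftUI

// Sketch: a share button for one photo. Image conforms to Transferable,
// so ShareLink presents the system share sheet with it directly.
struct PhotoShareButton: View {
    let photo: GalleryPhoto

    var body: some View {
        if let uiImage = UIImage(data: photo.imageData) {
            let image = Image(uiImage: uiImage)
            ShareLink(item: image,
                      preview: SharePreview(photo.caption, image: image))
        }
    }
}
```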