Supporting AirPlay playback with AVFoundation

Ever encountered this error?

The operation couldn’t be completed. (OSStatus error -50.)

I was working on an audio recording and playback feature for my app, MomentSnap.

The AudioRecorder I am using is quite basic and currently supports only mono audio.

//
//  AudioRecorder.swift
//  MomentSnap
//
//  Created and copyrighted by Yann Berton on 22.12.25.
//

import AVFAudio
import Observation
import UniformTypeIdentifiers

@Observable
public class AudioRecorder {
    private let audioSession: AVAudioSession = AVAudioSession.sharedInstance()
    private var recorder: AVAudioRecorder?

    public var isRecording = false

    private var tempAudioPath: URL?

    enum Errors: LocalizedError {
        case permissionDenied
        case unknownPermission

        case setupFailure
        case recordStartFailure
        case noPathForAudio
        case noRecorder

        case fileReadFailure
    }

    public init() {}

    private func checkPermission() async throws {
        let permission = AVAudioApplication.shared.recordPermission

        switch permission {
        case .undetermined:
            let result = await AVAudioApplication.requestRecordPermission()
            guard result else { throw Errors.permissionDenied }
            return
        case .denied:
            throw Errors.permissionDenied
        case .granted:
            return
        @unknown default:
            throw Errors.unknownPermission
        }
    }

    public func configureAudioSession() async throws {
        try await checkPermission()
        do {
            try audioSession
                .setCategory(
                    .record,
                    mode: .default,
                    options: [
                        .allowBluetoothHFP, .bluetoothHighQualityRecording,
                    ]
                )
            try audioSession.setActive(true)
        } catch {
            throw Errors.setupFailure
        }
    }

    public func startRecording() throws {
        let tempDir = FileManager.default.temporaryDirectory
        let fileName = "temp_recording_\(UUID().uuidString).m4a"
        tempAudioPath = tempDir.appendingPathComponent(fileName)

        let audioSettings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 1,  // mono only, as mentioned above
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue,
        ]

        guard let tempAudioPath else { throw Errors.noPathForAudio }

        recorder = try AVAudioRecorder(
            url: tempAudioPath,
            settings: audioSettings
        )

        guard let recorder else { throw Errors.noRecorder }

        let result = recorder.record()

        guard result else { throw Errors.recordStartFailure }
        isRecording = true
    }

    public func stopRecording() throws {
        self.recorder?.stop()
        self.recorder = nil
        self.isRecording = false
    }

    public func getAudioDataAndCleanup() throws -> Data {
        guard let url = tempAudioPath else { throw Errors.noPathForAudio }

        do {
            let data = try Data(contentsOf: url)
            try FileManager.default.removeItem(at: url)
            self.tempAudioPath = nil
            return data
        } catch {
            throw Errors.fileReadFailure
        }
    }
}

As you can see, specifically in this call:

try audioSession.setCategory(
    .record,
    mode: .default,
    options: [.allowBluetoothHFP, .bluetoothHighQualityRecording]
)

I set up the AVAudioSession to support high-quality recording via Bluetooth and set the session category to record only.
There is also the .playAndRecord category, but in my case the user will not immediately hear the recorded audio, so the session does not need to be prepared for playback as well.
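
For context, a typical call site for this recorder might look like the following sketch. The surrounding Task and the error handling are my own illustration, not part of MomentSnap:

// Hypothetical call site showing the intended AudioRecorder lifecycle.
let recorder = AudioRecorder()

Task {
    do {
        try await recorder.configureAudioSession() // checks the mic permission first
        try recorder.startRecording()
        // ... the user stops the recording at some later point ...
        try recorder.stopRecording()
        let audioData = try recorder.getAudioDataAndCleanup()
        // hand audioData to persistence, an upload, or the player below
    } catch {
        print("Recording failed: \(error)")
    }
}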

This gave me the freedom to work on a separate Observable for audio playback only.

//
//  AudioPlayer.swift
//  MomentSnap
//
//  Created and copyrighted by Yann Berton on 22.12.25.
//

import AVFAudio
import AVFoundation
import Observation

@Observable
public class AudioPlayer: NSObject, AVAudioPlayerDelegate {

    private var player: AVAudioPlayer?

    private var tempUrl: URL?

    public enum PlaybackState {
        case stopped, paused, playing
    }

    public enum Errors: LocalizedError {
        case noAudio
        case failureToPlay
        case failureToConvertToTemp
        case sessionError
    }

    public var playbackState: PlaybackState = .stopped

    public override init() {
        super.init()
    }

    public func loadAudio(data: Data) throws {
        stop()

        guard !data.isEmpty else { return }
        let tempDir = FileManager.default.temporaryDirectory
        let fileUrl = tempDir.appendingPathComponent("playback_temp.m4a")

        do {
            try data.write(to: fileUrl)
            self.tempUrl = fileUrl
            player = try AVAudioPlayer(contentsOf: fileUrl)
            player?.delegate = self
            player?.prepareToPlay()
        } catch {
            throw Errors.failureToConvertToTemp
        }
    }

    public func play() throws {
        if playbackState == .paused, let player = player {
            player.play()
            playbackState = .playing
            return
        }

        guard let player = player else { throw Errors.noAudio }

        let session = AVAudioSession.sharedInstance()
        do {
            try? session.setActive(false)

            try session
                .setCategory(
                    .playback,
                    mode: .default,
                    options: [.duckOthers]
                )
            try session.setActive(true)
        } catch {
            throw Errors.sessionError
        }

        // Start playback outside the do/catch so Errors.failureToPlay
        // is not swallowed by the catch and rethrown as sessionError.
        guard player.play() else { throw Errors.failureToPlay }
        playbackState = .playing
    }

    public func pause() {
        player?.pause()
        playbackState = .paused
    }

    private func stop() {
        player?.stop()
        player = nil
        playbackState = .stopped

        if let url = tempUrl {
            try? FileManager.default.removeItem(at: url)
            tempUrl = nil
        }
    }

    public func audioPlayerDidFinishPlaying(
        _ player: AVAudioPlayer,
        successfully flag: Bool
    ) {
        self.stop()
    }
}

As you can see, this, too, is very straightforward. This setup already allows AirPlay and other system features out of the box.
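
For completeness, here is a hypothetical call site wiring the recorder's output into the player; audioData stands for the value returned by getAudioDataAndCleanup() above:

// Hypothetical glue code: feed the recorder's output to the player.
let audioPlayer = AudioPlayer()
do {
    try audioPlayer.loadAudio(data: audioData) // audioData from the recorder above
    try audioPlayer.play()
} catch {
    print("Playback failed: \(error)")
}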

I initially thought I needed to enable AirPlay and Bluetooth playback explicitly. Why?
Because it is possible to write the following code:

.setCategory(
    .playback,
    mode: .default,
    options: [.duckOthers, .allowAirPlay, .allowBluetoothA2DP]
)

This is very sneaky, though: writing this code makes setCategory throw a runtime error that is very non-telling in nature:

The operation couldn’t be completed. (OSStatus error -50.)

When I first hit this error, I immediately checked the byte size of my Data to see whether it was 0 or where else the problem arose from. I then narrowed it down to exactly the call above.
The reason: .allowAirPlay and .allowBluetoothA2DP may only be set explicitly when the session category is .playAndRecord. (OSStatus -50 is the classic paramErr, i.e. an invalid parameter was passed.)
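
To see the failure up close, here is a small sketch that reproduces it and inspects the thrown NSError; treat the exact print output as an assumption on my side:

import AVFAudio

// Sketch: reproduce the -50 failure and inspect the underlying NSError.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(
        .playback,
        mode: .default,
        options: [.duckOthers, .allowAirPlay, .allowBluetoothA2DP] // invalid with .playback
    )
} catch {
    let nsError = error as NSError
    print(nsError.domain, nsError.code) // the code is the OSStatus, -50 here
}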

The Apple Developer Documentation explains this here:
https://developer.apple.com/documentation/avfaudio/avaudiosession/categoryoptions-swift.struct/allowairplay

Sadly, it took me a long time to find this sentence on the page, because I didn’t initially expect an error to be thrown for this reason.
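
If you genuinely need to set these options explicitly, for example for a feature that records and plays audio at the same time, they are valid together with the .playAndRecord category. A minimal sketch, assuming such a use case:

import AVFAudio

// Sketch: .allowAirPlay and .allowBluetoothA2DP are accepted here
// because the category is .playAndRecord, not .playback.
func configurePlayAndRecordSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(
        .playAndRecord,
        mode: .default,
        options: [.allowAirPlay, .allowBluetoothA2DP]
    )
    try session.setActive(true)
}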

So? Main takeaways of this short read:
The .playback category already supports AirPlay and other system features like it implicitly. Do NOT try to set these options on it, or setCategory will throw "The operation couldn’t be completed. (OSStatus error -50.)"

Thanks for reading, hope it helps someone :)


PS:
If anyone wonders why AudioPlayer inherits from NSObject and conforms to AVAudioPlayerDelegate: none of AVAudioPlayer's properties are observable, so if you want the UI to react to playback, you need a delegate that responds to AVAudioPlayer's callbacks. AVAudioPlayerDelegate also requires the conforming class to be an NSObject, which is why the class inherits from it. You can see an example of reacting to an AVAudioPlayer signal in my audioPlayerDidFinishPlaying implementation.
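
To illustrate, here is a hypothetical SwiftUI view (PlaybackButton is my name for it, it is not part of MomentSnap) that reacts to playbackState:

import SwiftUI

// Hypothetical view: because AudioPlayer is @Observable, reading
// playbackState in the body re-renders the button whenever the
// delegate callback (audioPlayerDidFinishPlaying) resets the state.
struct PlaybackButton: View {
    let audioPlayer: AudioPlayer

    var body: some View {
        Button(audioPlayer.playbackState == .playing ? "Pause" : "Play") {
            if audioPlayer.playbackState == .playing {
                audioPlayer.pause()
            } else {
                try? audioPlayer.play()
            }
        }
    }
}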


Update History

Added a PS

Reading the article again, I thought it remained unclear why conformance to AVAudioPlayerDelegate was added, so I adjusted the article ending to explain it.

Created new article titled 'Supporting AirPlay playback with AVFoundation'