How to Record a .wav File with AVAudioEngine (Acknowledgement part 7)

11 Mar 2022
4 minute read

Integrating the AVAudioEngine API into the rest of my app meant retooling some basic functionality, like saving .wav files.

Now that I have Acknowledgement working, there’s still work to do with the rest of my app’s audio code. I have to move everything from the old AVAudioPlayer and AVAudioRecorder APIs to the new AVAudioEngine. If I have some parts of the app using one API and other parts using a different audio API, they can step on each other and it gets messy.

When you record a clip in Reiterate, it gets saved as a .wav file. Under the old API, this was fairly easy:

import AVFoundation

let recordedFileURL = URL(fileURLWithPath: "recorded.wav", isDirectory: false, relativeTo: URL(fileURLWithPath: NSTemporaryDirectory()))
let settings = [
    AVFormatIDKey: Int(kAudioFormatLinearPCM),
    AVSampleRateKey: 12000.0,
    AVNumberOfChannelsKey: 1 as NSNumber,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ] as [String : Any]
var recorder: AVAudioRecorder?

func startRecording() {
  do {
      recorder = try AVAudioRecorder(url: recordedFileURL, settings: settings)
      recorder?.record()
  } catch { print("\(error)") }
}

func endRecording() {
  recorder?.stop()
}
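In practice those two calls get wrapped in a microphone-permission check. A minimal sketch of how the old flow gets driven (the permission handling here is my own illustration, not Reiterate's actual code):

```swift
AVAudioSession.sharedInstance().requestRecordPermission { granted in
    guard granted else { return }
    startRecording()
    // ... later, when the user taps "stop" ...
    endRecording()
}
```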

Under the AVAudioEngine API, it’s not quite so straightforward.

I wish the API were better documented. Getting this to work involved a lot of trial and error, with the app throwing opaque exceptions that didn’t tell me very much. After piecing together three different Stack Overflow answers, I finally managed to get something working.

The main problem is that the microphone and the .wav file use different sampling rates, and you’re not allowed to change the sampling rate on the mic. So you need an AVAudioConverter to bridge them, which adds an extra step: you record to a temporary file at the mic’s native rate, using an input tap, and then convert that to your .wav file. Here’s the code.
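You can see the mismatch for yourself by printing the mic’s native format next to the format you actually want on disk (the 12000 Hz mono target and the variable names here are just for illustration):

```swift
import AVFoundation

let engine = AVAudioEngine()
let micFormat = engine.inputNode.outputFormat(forBus: 0)
// On an iPhone this is typically 44100 or 48000 Hz -- and it's read-only.
print("mic: \(micFormat.sampleRate) Hz, \(micFormat.channelCount) ch")

// The format I actually want for the saved .wav file.
let wavFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                              sampleRate: 12000.0,
                              channels: 1,
                              interleaved: true)!
print("wav: \(wavFormat.sampleRate) Hz, \(wavFormat.channelCount) ch")

// Since the rates differ, an AVAudioConverter has to bridge them.
let converter = AVAudioConverter(from: micFormat, to: wavFormat)
```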

First, we record the input audio to a .caf file via an input tap:

private var tmpRecordingURL = URL(fileURLWithPath: "inputclip.caf", isDirectory: false, relativeTo: URL(fileURLWithPath: NSTemporaryDirectory()))
private var recordedFile: AVAudioFile?
private var avAudioEngine = AVAudioEngine()  // non-optional, so the tap calls below don't need unwrapping
// The mic's native format; the tap and the temp file both use this.
private var inputFormat: AVAudioFormat { avAudioEngine.inputNode.outputFormat(forBus: 0) }

func installInputTap(block: @escaping AVAudioNodeTapBlock) {
    let inputNode = avAudioEngine.inputNode
    let micRecordingFormat = inputNode.outputFormat(forBus: 0)
    inputNode.installTap(onBus: 0, bufferSize: 1024, format: micRecordingFormat, block: block)
}

func startRecording() throws {
    recordedFile = try AVAudioFile(forWriting: tmpRecordingURL, settings: inputFormat.settings)
    installInputTap { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
        do {
            try self.recordedFile?.write(from: buffer)
        } catch {
            print("Could not write buffer: \(error)")
        }
    }
}

Then, when we’re done, we re-read the temp file into a new buffer, and convert it to the final .wav:

func endRecording(saveTo fileURL: URL) throws {
    avAudioEngine.inputNode.removeTap(onBus: 0)
    recordedFile = nil

    guard let inputBuffer = ACAudioEngine.getBuffer(fileURL: tmpRecordingURL) else { throw ACAudioEngineError.bufferRetrieveError }

    // Same LinearPCM settings as under the old API.
    let wavFormatSettings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 12000.0,
        AVNumberOfChannelsKey: 1
    ]

    // Adapted from https://stackoverflow.com/a/60802378/7662528
    let wavFile = try AVAudioFile(forWriting: fileURL,
                                  settings: wavFormatSettings,
                                  commonFormat: inputFormat.commonFormat,
                                  interleaved: true)
    let converter = AVAudioConverter(from: inputFormat, to: wavFile.processingFormat)

    // Adapted from https://stackoverflow.com/a/44848090/7662528
    // Size the output buffer according to the ratio of the two sample rates.
    let sampleRateConversionRatio = wavFile.processingFormat.sampleRate / inputFormat.sampleRate
    let outputCapacity = AVAudioFrameCount(ceil( Double(inputBuffer.frameCapacity) * sampleRateConversionRatio ))
    guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: wavFile.processingFormat, frameCapacity: outputCapacity) else {
        throw ACAudioEngineError.bufferRetrieveError
    }

    // Adapted from https://stackoverflow.com/a/62471324/7662528
    // The input block feeds the converter our single buffer, then signals end of stream.
    var gotData = false
    var error: NSError?
    converter?.convert(to: outputBuffer, error: &error) { (numPackets, status) in
        if gotData {
            status.pointee = .endOfStream
            return nil
        }
        gotData = true
        status.pointee = .haveData
        return inputBuffer
    }
    if let error = error { throw error }
    try wavFile.write(from: outputBuffer)
}
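Putting it together, the call sequence from the rest of the app looks something like this (the destination URL and error handling here are placeholders of my own):

```swift
let wavURL = URL(fileURLWithPath: "clip.wav", isDirectory: false,
                 relativeTo: URL(fileURLWithPath: NSTemporaryDirectory()))
do {
    try avAudioEngine.start()   // the engine must be running before the tap sees audio
    try startRecording()
    // ... user records their clip ...
    try endRecording(saveTo: wavURL)
} catch {
    print("Recording failed: \(error)")
}
```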

This seems to work. It produces .wav files I can play again, and I can even extract them from my app container and play them like any other audio file.

I’m not showing the code to create the avAudioEngine, since that’s fairly boilerplate. The only other function here is getBuffer(fileURL:), but that’s from the Apple sample code AVEchoTouch.
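For reference, a helper like getBuffer(fileURL:) can be sketched roughly as below — this is my own minimal version, not the actual AVEchoTouch code, so treat it as an approximation:

```swift
// Read an entire audio file into a single PCM buffer; nil on any failure.
static func getBuffer(fileURL: URL) -> AVAudioPCMBuffer? {
    let file: AVAudioFile
    do {
        file = try AVAudioFile(forReading: fileURL)
    } catch {
        print("Could not open \(fileURL): \(error)")
        return nil
    }
    file.framePosition = 0
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length)) else {
        return nil
    }
    do {
        try file.read(into: buffer)
    } catch {
        print("Could not read \(fileURL): \(error)")
        return nil
    }
    return buffer
}
```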

Acknowledgement Complete

With this last piece I was able to send a beta version, and it’s gotten really positive feedback. At first glance, it looks like Acknowledgement increases mindfulness by a factor of ten!

There are still a few minor bugs to be worked out, but I should soon have this new version up and running on the App Store.

This post is licensed under CC BY 4.0