I'm building a metronome using AVAudioEngine, AVAudioPlayerNode, and AVAudioPCMBuffer. The buffer is created like this:
/// URL of the sound file
let soundURL = Bundle.main.url(forResource: <filename>, withExtension: "wav")!
/// Create audio file
let audioFile = try! AVAudioFile(forReading: soundURL)
let audioFormat = audioFile.processingFormat
/// Create the buffer - what value to put for frameCapacity?
if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: ???) {
    buffer.frameLength = audioFrameCount
    try? audioFile.read(into: buffer)
    return buffer
}
What value should I pass for frameCapacity in the AVAudioPCMBuffer initializer? The documentation says frameCapacity is "the capacity of the buffer in PCM sample frames." What does that mean? Is it a static value, or is it derived from the audio file?
frameCapacity is the maximum number of frames an AVAudioPCMBuffer can hold. You don't have to use all of them; consumers of an AVAudioPCMBuffer should only consult the first frameLength frames, where frameLength <= frameCapacity. A capacity that differs from the length can be useful if, say, you're processing audio in chunks of N frames and for whatever reason you get a short read:
let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: N)
while readChunk(chunk) {
    let frameLength = min(buffer.frameCapacity, chunk.lengthInFrames)
    // copy frameLength frames into buffer.floatChannelData![0] or similar
    buffer.frameLength = frameLength // could be less than N
}
But if you're only ever going to store audioFrameCount frames in the buffer (the length of the file?), then just set frameCapacity to exactly that:

let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)
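Putting the pieces together, here is a minimal sketch of a helper that loads a whole file into a buffer, deriving the frame count from audioFile.length instead of a hard-coded value. The helper name makeBuffer and the guard-based error handling are illustrative assumptions, not from the original question:

```swift
import AVFoundation

/// Load an entire audio file into a PCM buffer.
/// Sketch only: all failures are collapsed into returning nil.
func makeBuffer(forResource name: String) -> AVAudioPCMBuffer? {
    guard let url = Bundle.main.url(forResource: name, withExtension: "wav"),
          let audioFile = try? AVAudioFile(forReading: url) else { return nil }

    // audioFile.length is an AVAudioFramePosition (Int64), while
    // frameCapacity expects an AVAudioFrameCount (UInt32), so convert.
    let frameCount = AVAudioFrameCount(audioFile.length)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                        frameCapacity: frameCount) else { return nil }

    // read(into:) sets buffer.frameLength to the number of frames actually read,
    // so there is no need to assign frameLength manually here.
    try? audioFile.read(into: buffer)
    return buffer
}
```

Note that because read(into:) updates frameLength itself, sizing frameCapacity to the file's length means capacity and length end up equal for a full-file read.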