I'm trying to write an application that does digital signal processing, and I want it to be as lightweight as possible. One thing that has puzzled me for a while is what the default values on various devices might be, so that I can avoid unnecessary format conversions before receiving data from the buffers. I came across the following link http://club15cc.com/code-snippets/ios-2/get-the-default-output-stream-format-for-an-audio-unit-in-ios which set me on what I believe is the right path.
I have extended the code from that link to create and activate an AVAudioSession before reading the contents of the ASBD (AudioStreamBasicDescription); the AudioSession can then be used to request various "preferred" settings to see what effect they have (a sketch of this appears after the listing below). I have also combined Apple's code for listing the ASBD values with the code from the link above.
The code below goes into the ViewController.m file generated by choosing the Single View Application template. Note that you need to add AudioToolbox.framework and CoreAudio.framework to the project's Linked Frameworks and Libraries.
#import "ViewController.h"
@import AVFoundation;
@import AudioUnit;
@interface ViewController ()
@end
@implementation ViewController
- (void) printASBD:(AudioStreamBasicDescription) asbd {
    // Convert the four-character format ID into a printable, NUL-terminated C string
    char formatIDString[5];
    UInt32 formatID = CFSwapInt32HostToBig (asbd.mFormatID);
    bcopy (&formatID, formatIDString, 4);
    formatIDString[4] = '\0';

    NSLog (@" Sample Rate: %10.0f", asbd.mSampleRate);
    NSLog (@" Format ID: %10s", formatIDString);
    NSLog (@" Format Flags: %10X", (unsigned int)asbd.mFormatFlags);
    NSLog (@" Bytes per Packet: %10d", (unsigned int)asbd.mBytesPerPacket);
    NSLog (@" Frames per Packet: %10d", (unsigned int)asbd.mFramesPerPacket);
    NSLog (@" Bytes per Frame: %10d", (unsigned int)asbd.mBytesPerFrame);
    NSLog (@" Channels per Frame: %10d", (unsigned int)asbd.mChannelsPerFrame);
    NSLog (@" Bits per Channel: %10d", (unsigned int)asbd.mBitsPerChannel);
}
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Get a reference to the shared AudioSession, set a category and activate it
    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [audioSession setActive:YES error:&error];

    // Then get the RemoteIO AudioUnit and use it to read the default AudioStreamBasicDescription
    AudioUnit remoteIOUnit;
    AudioComponentDescription audioComponentDesc = {0};
    audioComponentDesc.componentType = kAudioUnitType_Output;
    audioComponentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioComponentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Find and instantiate the component
    AudioComponent audioComponent = AudioComponentFindNext(NULL, &audioComponentDesc);
    AudioComponentInstanceNew(audioComponent, &remoteIOUnit);

    // Read the stream format (AudioUnitGetProperty expects a UInt32 size in/out parameter)
    UInt32 asbdSize = sizeof(AudioStreamBasicDescription);
    AudioStreamBasicDescription asbd = {0};
    AudioUnitGetProperty(remoteIOUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         0,
                         (void *)&asbd,
                         &asbdSize);
    [self printASBD:asbd];
}
@end
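As mentioned above, once the session is active you can ask for "preferred" values and then re-read the ASBD to see whether the hardware honoured them. A minimal sketch that could be appended to viewDidLoad above (it reuses the audioSession, error, remoteIOUnit, asbd and asbdSize variables from that listing; the 48 kHz sample rate and 5 ms buffer duration are only example values, not recommendations):

// Request "preferred" hardware settings and re-activate the session;
// preferred values are normally set before (re)activation.
[audioSession setActive:NO error:&error];
[audioSession setPreferredSampleRate:48000.0 error:&error];
[audioSession setPreferredIOBufferDuration:0.005 error:&error];
[audioSession setActive:YES error:&error];

// Log what the session actually granted...
NSLog(@"Actual sample rate: %f", audioSession.sampleRate);
NSLog(@"Actual IO buffer duration: %f", audioSession.IOBufferDuration);

// ...and re-read the stream format to see whether the defaults changed.
asbdSize = sizeof(AudioStreamBasicDescription);
AudioUnitGetProperty(remoteIOUnit,
                     kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output,
                     0,
                     (void *)&asbd,
                     &asbdSize);
[self printASBD:asbd];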
I would be very interested to know the results people get on other actual hardware. Note that the code was built and deployed to iOS 7.1.
The format flags are as follows; a small helper for decoding a raw mFormatFlags value is sketched after the list:
kAudioFormatFlagIsFloat = (1 << 0), // 0x1
kAudioFormatFlagIsBigEndian = (1 << 1), // 0x2
kAudioFormatFlagIsSignedInteger = (1 << 2), // 0x4
kAudioFormatFlagIsPacked = (1 << 3), // 0x8
kAudioFormatFlagIsAlignedHigh = (1 << 4), // 0x10
kAudioFormatFlagIsNonInterleaved = (1 << 5), // 0x20
kAudioFormatFlagIsNonMixable = (1 << 6), // 0x40
kAudioFormatFlagsAreAllClear = (1 << 31), // 0x80000000
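To make sense of a raw mFormatFlags value such as the 0x29 reported below, a small helper along these lines can print the individual flags. This is only a sketch covering the linear-PCM flags listed above; PrintFormatFlags is just an illustrative name:

// Log which of the basic linear-PCM flags are set in an mFormatFlags value.
static void PrintFormatFlags(AudioFormatFlags flags) {
    if (flags & kAudioFormatFlagIsFloat)          NSLog(@"  IsFloat");
    if (flags & kAudioFormatFlagIsBigEndian)      NSLog(@"  IsBigEndian");
    if (flags & kAudioFormatFlagIsSignedInteger)  NSLog(@"  IsSignedInteger");
    if (flags & kAudioFormatFlagIsPacked)         NSLog(@"  IsPacked");
    if (flags & kAudioFormatFlagIsAlignedHigh)    NSLog(@"  IsAlignedHigh");
    if (flags & kAudioFormatFlagIsNonInterleaved) NSLog(@"  IsNonInterleaved");
    if (flags & kAudioFormatFlagIsNonMixable)     NSLog(@"  IsNonMixable");
}

// Example: PrintFormatFlags(0x29) logs IsFloat, IsPacked and IsNonInterleaved.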
The results I get for an iPad 4 are as follows:
Sample Rate: 0
Format ID: lpcm
Format Flags: 29
Bytes per Packet: 4
Frames per Packet: 1
Bytes per Frame: 4
Channels per Frame: 2
Bits per Channel: 32
I guess lpcm (Linear Pulse Code Modulation) comes as no surprise. The format flags of 0x29 decode to kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved,
and since the float flag is set, 32 bits per channel looks like 32-bit float samples rather than the 8.24 "fixed-point" format I had expected.
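If the point of interest is whether the samples are 32-bit float or the older 8.24 fixed-point AudioUnit format, the flags can also be checked directly. kLinearPCMFormatFlagsSampleFractionShift and kLinearPCMFormatFlagsSampleFractionMask are the standard CoreAudio constants that carry the fixed-point fraction width; the helper itself (DescribeSampleFormat) is just an illustrative sketch:

// Rough check of the sample layout an ASBD describes: float, 8.24-style
// fixed-point (non-zero fraction bits), or plain integer samples.
static void DescribeSampleFormat(AudioStreamBasicDescription asbd) {
    UInt32 fractionBits = (asbd.mFormatFlags & kLinearPCMFormatFlagsSampleFractionMask)
                              >> kLinearPCMFormatFlagsSampleFractionShift;
    if (asbd.mFormatFlags & kAudioFormatFlagIsFloat) {
        NSLog(@"%u-bit float samples", (unsigned int)asbd.mBitsPerChannel);
    } else if (fractionBits > 0) {
        NSLog(@"%u.%u fixed-point samples",
              (unsigned int)(asbd.mBitsPerChannel - fractionBits),
              (unsigned int)fractionBits);
    } else {
        NSLog(@"%u-bit integer samples", (unsigned int)asbd.mBitsPerChannel);
    }
}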