This section contains code samples illustrating a variety of common Audiobus-related tasks.
More sample code is available within the "Samples" folder of the SDK distribution.
Create a sender port and send audio manually
This code snippet demonstrates how to create a sender port, and then send audio through it manually, without using ABAudioSenderPort's audio unit initialiser. Note that the audio unit method is recommended as it's much simpler, but there may be circumstances under which more control is needed, such as when you are publishing multiple sender ports.
As of iOS 13, an app launched into the background is only permitted to begin recording if the Remote IO node is currently being hosted via Inter-App Audio. If your app requires recording abilities at launch and you are using the method below, you should tick the box labelled "Launch Manually" on the Audiobus app registration page. Audiobus will then require a tap to launch your app into the foreground, where it is allowed to begin recording, rather than launching it automatically into the background.
The code below also demonstrates how to use the result of ABAudioSenderPortIsMuted to determine when to mute output.
@interface MyAudioEngine ()
@property (nonatomic, strong) ABAudioSenderPort *sender;
@end
@implementation MyAudioEngine
-(id)init {
    ...
    self.sender = [[ABAudioSenderPort alloc] initWithName:@"audio"
                                                    title:NSLocalizedString(@"Main App Output", @"")
                                audioComponentDescription:(AudioComponentDescription) {
                                    .componentType = kAudioUnitType_RemoteGenerator,
                                    .componentSubType = 'subt', // Note single quotes
                                    .componentManufacturer = 'manu' }];
[self.audiobusController addAudioSenderPort:_sender];
...
}
...
static OSStatus audioUnitRenderCallback(void *inRefCon,
                                        AudioUnitRenderActionFlags *ioActionFlags,
                                        const AudioTimeStamp *inTimeStamp,
                                        UInt32 inBusNumber,
                                        UInt32 inNumberFrames,
                                        AudioBufferList *ioData) {
    __unsafe_unretained MyAudioEngine *self = (__bridge MyAudioEngine*)inRefCon;
    ...
    // Send audio to Audiobus
    ABAudioSenderPortSend(self->_sender, ioData, inNumberFrames, inTimeStamp);
    // Mute the local output, if appropriate
    if ( ABAudioSenderPortIsMuted(self->_sender) ) {
        for ( int i=0; i<ioData->mNumberBuffers; i++ ) {
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        }
        *ioActionFlags |= kAudioUnitRenderAction_OutputIsSilence;
    }
    return noErr;
}
Create a filter port with a process block
This demonstrates how to create and implement a filter port with a process block. Using a process block is more complex than using ABAudioFilterPort's audio unit initialiser, but may provide more flexibility under certain circumstances, such as when you are publishing multiple filter ports.
The code creates a filter port, providing a process block which is invoked whenever audio arrives on the input side of the filter. Your app processes the audio in place within the block; once the block returns, Audiobus automatically sends the processed audio onwards.
The code also demonstrates how to mute your audio system when the filter port is connected.
@interface MyAudioEngine ()
@property (nonatomic, strong) ABAudioFilterPort *filter;
@end
@implementation MyAudioEngine
-(id)init {
    ...
    self.filter = [[ABAudioFilterPort alloc] initWithName:@"main"
                                                    title:@"Main Effect"
                                audioComponentDescription:(AudioComponentDescription) {
                                    .componentType = kAudioUnitType_RemoteEffect,
                                    .componentSubType = 'myfx',
                                    .componentManufacturer = 'you!' }
                                             processBlock:^(AudioBufferList *audio, UInt32 frames, AudioTimeStamp *timestamp) {
                                                 processAudio(audio);
                                             } processBlockSize:0];
[self.audiobusController addFilterPort:_filter];
...
}
...
static OSStatus audioUnitRenderCallback(void *inRefCon,
                                        AudioUnitRenderActionFlags *ioActionFlags,
                                        const AudioTimeStamp *inTimeStamp,
                                        UInt32 inBusNumber,
                                        UInt32 inNumberFrames,
                                        AudioBufferList *ioData) {
    __unsafe_unretained MyAudioEngine *self = (__bridge MyAudioEngine*)inRefCon;
    // Mute and exit early when the filter is connected; Audiobus handles the audio
    if ( ABAudioFilterPortIsConnected(self->_filter) ) {
        for ( int i=0; i<ioData->mNumberBuffers; i++ ) {
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        }
        *ioActionFlags |= kAudioUnitRenderAction_OutputIsSilence;
        return noErr;
    }
    ...
    return noErr;
}
Create a receiver port and receive audio
This code illustrates the typical method of receiving audio from Audiobus.
The code creates a single receiver port, assigns an AudioStreamBasicDescription describing the audio format to use, then uses the port to receive audio from within a Remote IO input callback.
@interface MyAudioEngine ()
@property (nonatomic, strong) ABAudioReceiverPort *receiver;
@end
@implementation MyAudioEngine
-(id)init {
    ...
    self.receiver = [[ABAudioReceiverPort alloc] initWithName:@"Main" title:NSLocalizedString(@"Main Input", @"")];
    _receiver.clientFormat = [MyAudioEngine myAudioDescription];
    [self.audiobusController addReceiverPort:_receiver];
    ...
}
...
static OSStatus audioUnitRenderCallback(void *inRefCon,
                                        AudioUnitRenderActionFlags *ioActionFlags,
                                        const AudioTimeStamp *inTimeStamp,
                                        UInt32 inBusNumber,
                                        UInt32 inNumberFrames,
                                        AudioBufferList *ioData) {
    __unsafe_unretained MyAudioEngine *self = (__bridge MyAudioEngine*)inRefCon;
    AudioTimeStamp timestamp = *inTimeStamp;
    if ( ABAudioReceiverPortIsConnected(self->_receiver) ) {
        // Receive audio from Audiobus, if connected
        ABAudioReceiverPortReceive(self->_receiver, nil, ioData, inNumberFrames, &timestamp);
    } else {
        // Receive audio from the system input
        AudioUnitRender(self->_audioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData);
    }
    return noErr;
}
Create a trigger
This demonstrates how to create a trigger, which can be invoked remotely to perform some action within your app.
The sample creates a trigger, passing in a block that toggles the recording state of a fictional transport controller.
It also observes the controller's recording state and updates the trigger's state when it changes, so that the user interface element corresponding to the trigger in remote apps updates its appearance appropriately.
static void * kTransportControllerRecordingStateChanged = &kTransportControllerRecordingStateChanged;
...
self.recordTrigger = [ABTrigger triggerWithSystemType:ABTriggerTypeRecordToggle block:^(ABTrigger *trigger, NSSet *ports) {
    if ( self.transportController.recording ) {
        [self.transportController endRecording];
    } else {
        [self.transportController beginRecording];
    }
}];
[self.audiobusController addTrigger:self.recordTrigger];
[self.transportController addObserver:self forKeyPath:@"recording" options:0 context:kTransportControllerRecordingStateChanged];
...
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ( context == kTransportControllerRecordingStateChanged ) {
        // Keep the trigger's state in sync with the recording state
        self.recordTrigger.state = self.transportController.recording ? ABTriggerStateSelected : ABTriggerStateNormal;
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
Manage application life-cycle
This example demonstrates the recommended way to manage your application's life-cycle.
The example assumes the app in question has been registered at developer.audiob.us/register, and is therefore able to be connected and launched from the Audiobus app.
As soon as your app is connected via Audiobus, it must have a running and active audio system. This means you must either only instantiate the Audiobus controller at the same time you start your audio system, or you must watch for ABConnectionsChangedNotification and start your audio system when the notification is observed.
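A minimal sketch of the notification-based approach might look like the following. Note that `startAudioSystem` and the `running` property here are hypothetical members of your own audio engine, not part of the Audiobus SDK:

```objc
// Start the audio system whenever an Audiobus connection appears.
// startAudioSystem and running are hypothetical members of your engine.
[[NSNotificationCenter defaultCenter] addObserverForName:ABConnectionsChangedNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    if ( _audiobusController.connected && !_audioEngine.running ) {
        [_audioEngine startAudioSystem];
    }
}];
```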
Once your app is connected via Audiobus (or IAA), it should not under any circumstances suspend its audio system when moving into the background. When moving to the background, the app can check the connected property of the Audiobus controller, and only stop the audio system if the value is NO:
if ( !_audiobusController.connected ) {
    [ABAudioUnitFader fadeOutAudioUnit:_audioEngine.audioUnit completionBlock:^{ [_audioEngine stop]; }];
}
The below example uses ABAudioUnitFader to provide smooth fade-in and fade-out transitions, to avoid hard clicks when starting or stopping the audio system.
static void * kAudiobusConnectedChanged = &kAudiobusConnectedChanged;
-(BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    ...
    [self.audiobusController addObserver:self
                              forKeyPath:@"connected"
                                 options:0
                                 context:kAudiobusConnectedChanged];
    ...
    return YES;
}
-(void)dealloc {
[_audiobusController removeObserver:self forKeyPath:@"connected"];
}
-(void)observeValueForKeyPath:(NSString *)keyPath
                     ofObject:(id)object
                       change:(NSDictionary *)change
                      context:(void *)context {
    if ( context == kAudiobusConnectedChanged ) {
        // Disconnected while in the background: stop the audio system
        if ( [UIApplication sharedApplication].applicationState == UIApplicationStateBackground
             && !_audiobusController.connected ) {
            [_audioEngine stop];
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
-(void)applicationDidEnterBackground:(NSNotification *)notification {
    if ( !_audiobusController.connected ) {
        [ABAudioUnitFader fadeOutAudioUnit:_audioEngine.audioUnit completionBlock:^{ [_audioEngine stop]; }];
    }
}
-(void)applicationWillEnterForeground:(NSNotification *)notification {
    [ABAudioUnitFader fadeInAudioUnit:_audioEngine.audioUnit beginBlock:^{ [_audioEngine start]; } completionBlock:nil];
}
Determine if app is connected via Audiobus
The following code demonstrates one way to monitor and determine whether any Audiobus ports are currently connected.
You can also:
- Observe (via KVO) the 'connected' property of ABAudiobusController or any of the port classes, or any of the 'sources'/'destinations' properties of the port classes.
- Watch for ABAudioReceiverPortConnectionsChangedNotification, ABAudioReceiverPortPortAddedNotification, ABAudioReceiverPortPortRemovedNotification, ABAudioSenderPortConnectionsChangedNotification, or ABAudioFilterPortConnectionsChangedNotification.
- Use ABAudioReceiverPortIsConnected, ABAudioSenderPortIsConnected, and ABAudioFilterPortIsConnected from a Core Audio thread.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(connectionsChanged:)
                                             name:ABConnectionsChangedNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] removeObserver:self
                                                name:ABConnectionsChangedNotification
                                              object:nil];
-(void)connectionsChanged:(NSNotification*)notification {
    if ( _audiobusController.connected ) {
        // Connected
    } else {
        // Disconnected
    }
}
Enumerate apps connected to a port
This illustrates how to inspect each individual source or destination of a port. Sender ports can have only destinations, receiver ports only sources; filter ports can have both.
The way you obtain access to sources has changed in Audiobus 3, which inserts intermediate routings. Consequently, the sources obtained via ABPort::sources are not necessarily those you see in the Audiobus UI: ABPort::sources and ABPort::destinations return the physically connected sources and destinations. To represent sources and destinations in your app's user interface, we recommend using the new ABPort::sourcesRecursive and ABPort::destinationsRecursive properties.
To get the physically connected sources, iterate the sources property of your port:
for ( ABPort *connectedPort in _receiverPort.sources ) {
    NSLog(@"Source port '%@' of app '%@' is connected", connectedPort.displayName, connectedPort.peer.displayName);
}
To get the logically connected sources, iterate the sourcesRecursive property of your port. This property returns not only direct sources but indirect ones too:
for ( ABPort *connectedPort in _receiverPort.sourcesRecursive ) {
    NSLog(@"Source port '%@' of app '%@' is connected", connectedPort.displayName, connectedPort.peer.displayName);
}
The same is possible with destinations:
for ( ABPort *connectedPort in _senderPort.destinations ) {
    NSLog(@"Destination port '%@' of app '%@' is connected", connectedPort.displayName, connectedPort.peer.displayName);
}
And the logically connected destinations can be obtained via the destinationsRecursive property:
for ( ABPort *connectedPort in _senderPort.destinationsRecursive ) {
    NSLog(@"Destination port '%@' of app '%@' is connected", connectedPort.displayName, connectedPort.peer.displayName);
}
Show icons and titles for sources and destinations
To show the titles and icons of sources connected to a port, use the new properties sourcesIcon and sourcesTitle, as well as destinationsIcon and destinationsTitle:
UIImage *sourcesIcon = _filterPort.sourcesIcon;
NSString *sourcesTitle = _filterPort.sourcesTitle;
UIImage *destinationsIcon = _filterPort.destinationsIcon;
NSString *destinationsTitle = _filterPort.destinationsTitle;
These properties will return a summarized icon and title representing all sources and destinations connected to a port.
If you need access to the icons and names of individual sources, you can iterate the sources and use the peer.icon and peer.name properties:
for ( ABPort *sourcePort in _filterPort.sources ) {
    NSString *sourcePeerName = sourcePort.peer.name;
    UIImage *sourcePeerIcon = sourcePort.peer.icon;
}
The same can be done with destinations.
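For instance, a sketch of the equivalent enumeration over the destination side, assuming a sender port in `_senderPort`:

```objc
for ( ABPort *destinationPort in _senderPort.destinations ) {
    NSString *destinationPeerName = destinationPort.peer.name;
    UIImage *destinationPeerIcon = destinationPort.peer.icon;
    // Use the peer name and icon in your interface here
}
```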
Get all sources of the current Audiobus session
This example demonstrates how to obtain a list of all source ports of the current session; that is, all ports that correspond to the 'Inputs' position in the Audiobus app. Note that this is a different list of ports than the ones enumerated in the prior sample, as this is a list of all inputs, not just those directly connected to a given port.
NSArray *allSessionSources = [_audiobusController.connectedPorts filteredArrayUsingPredicate:
    [NSPredicate predicateWithFormat:@"type = %d", ABPortTypeAudioSender]];
Note: similarly, you can obtain a list of all filters by replacing the ABPortTypeAudioSender identifier with ABPortTypeAudioFilter, and a list of all receivers with ABPortTypeAudioReceiver.
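For example, a sketch of the filter variant of this query:

```objc
// All filter ports in the current session
NSArray *allSessionFilters = [_audiobusController.connectedPorts filteredArrayUsingPredicate:
    [NSPredicate predicateWithFormat:@"type = %d", ABPortTypeAudioFilter]];
```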
Receive audio as separate streams
This example demonstrates how to use ABAudioReceiverPort's separate-stream receive mode (receiveMixedAudio = NO) to receive each audio stream from each connected app separately, rather than as a single mixed-down audio stream.
The code below maintains a C array of currently-connected sources, in order to be able to enumerate them within a Core Audio thread without calling any Objective-C methods (note that Objective-C methods should never be called on a Core Audio thread due to the risk of priority inversion, resulting in stuttering audio).
The sample code monitors connection changes, then updates the C array accordingly.
Then within the audio unit render callback, the code iterates through this array to receive each audio stream.
static const int kMaxSources = 30;
static void * kReceiverSourcesChanged = &kReceiverSourcesChanged;
struct port_entry_t { void *port; BOOL pendingRemoval; };
@interface MyAudioEngine () {
    struct port_entry_t _portTable[kMaxSources];
}
@property (nonatomic, strong) ABAudioReceiverPort *receiver;
@end
@implementation MyAudioEngine
-(id)init {
...
self.receiver = [[ABAudioReceiverPort alloc] initWithName:@"Main" title:NSLocalizedString(@"Main Input", @"")];
_receiver.clientFormat = [MyAudioEngine myAudioDescription];
_receiver.receiveMixedAudio = NO;
[self.audiobusController addReceiverPort:_receiver];
[_receiver addObserver:self forKeyPath:@"sources" options:0 context:kReceiverSourcesChanged];
}
-(void)dealloc {
[_receiver removeObserver:self forKeyPath:@"sources"];
}
-(struct port_entry_t*)entryForPort:(ABPort*)port {
    for ( int i=0; i<kMaxSources; i++ ) {
        if ( _portTable[i].port == (__bridge void*)port ) {
            return &_portTable[i];
        }
    }
    return NULL;
}
-(void)observeValueForKeyPath:(NSString *)keyPath
                     ofObject:(id)object
                       change:(NSDictionary *)change
                      context:(void *)context {
    if ( context == kReceiverSourcesChanged ) {
        // Add new sources to the table
        for ( ABPort *source in _receiver.sources ) {
            if ( ![self entryForPort:source] ) {
                struct port_entry_t *emptySlot = [self entryForPort:nil];
                if ( emptySlot ) {
                    emptySlot->port = (__bridge void*)source;
                }
            }
        }
        // Mark removed sources for cleanup on the Core Audio thread
        for ( int i=0; i<kMaxSources; i++ ) {
            if ( _portTable[i].port && ![_receiver.sources containsObject:(__bridge ABPort*)_portTable[i].port] ) {
                _portTable[i].pendingRemoval = YES;
            }
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
...
static OSStatus audioUnitRenderCallback(void *inRefCon,
                                        AudioUnitRenderActionFlags *ioActionFlags,
                                        const AudioTimeStamp *inTimeStamp,
                                        UInt32 inBusNumber,
                                        UInt32 inNumberFrames,
                                        AudioBufferList *ioData) {
    __unsafe_unretained MyAudioEngine *self = (__bridge MyAudioEngine*)inRefCon;
    // Remove sources pending removal (this must happen on the audio thread)
    for ( int i=0; i<kMaxSources; i++ ) {
        if ( self->_portTable[i].port && self->_portTable[i].pendingRemoval ) {
            self->_portTable[i].pendingRemoval = NO;
            self->_portTable[i].port = NULL;
        }
    }
    if ( ABAudioReceiverPortIsConnected(self->_receiver) ) {
        // Receive each source's audio stream separately
        for ( int i=0; i<kMaxSources; i++ ) {
            if ( self->_portTable[i].port ) {
                AudioTimeStamp timestamp;
                ABAudioReceiverPortReceive(self->_receiver, (__bridge ABPort*)self->_portTable[i].port, ioData, inNumberFrames, &timestamp);
                // Process this source's audio here
            }
        }
        // Mark the end of this receive time interval
        ABAudioReceiverPortEndReceiveTimeInterval(self->_receiver);
    } else {
        // Receive audio from the system input
        AudioUnitRender(self->_audioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData);
    }
    return noErr;
}
Use Audiobus input in an Audio Queue
This example demonstrates the Audio Queue versions of the receiver port receive functions, which take an AudioQueueBufferRef argument instead of an AudioBufferList.
Illustrated is an input callback which replaces the incoming microphone audio with audio from Audiobus, which represents a quick and easy way to implement receiver ports in an app that uses Audio Queues and microphone input.
static void MyAQInputCallback(void *inUserData,
                              AudioQueueRef inQueue,
                              AudioQueueBufferRef inBuffer,
                              const AudioTimeStamp *inStartTime,
                              UInt32 inNumPackets,
                              const AudioStreamPacketDescription *inPacketDesc) {
    __unsafe_unretained MyController *self = (__bridge MyController*)inUserData;
    // Replace the incoming microphone audio with audio from Audiobus
    AudioTimeStamp timestamp = *inStartTime;
    ABAudioReceiverPortReceiveAQ(self->_audiobusReceiverPort,
                                 nil,
                                 inBuffer,
                                 &inNumPackets,
                                 &timestamp,
                                 NULL);
}