Integration Guide

Please read this guide carefully, as there are some important things you need to know to effectively support Audiobus in your app.

Particularly if you intend to receive audio from Audiobus, set aside ten minutes to read through to make sure you have a clear picture of how it all works.

Many app developers will be able to implement Audiobus in just thirty minutes or so.

This quick-start guide assumes your app uses the Core Audio C API or AVAudioEngine. If this is not the case, most of it will still be relevant, but you'll need to do some additional integration work which is beyond the scope of this documentation.

General Design Principles

We've worked hard to make Audiobus as close as possible to an "it just works" experience for users. We think music on iOS should be easy and open to everyone, not just those technical enough to understand convoluted settings.

That means you should add no switches to enable/disable Audiobus, no settings that users need to configure to enable your app to run in the background while connected to Audiobus.

If you're a sender app or a filter app (i.e. you have an ABAudioSenderPort, ABAudioFilterPort, ABMIDISenderPort or ABMIDIFilterPort, and only send to other apps or filter audio/MIDI from other apps), you shouldn't need to ever add any Audiobus-specific UI. Audiobus takes care of all session management for you. If you're a receiver app (you have an ABAudioReceiverPort or an ABMIDIReceiverPort) then unless you're doing nifty things with multitrack recording, you shouldn't need to add Audiobus-specific UI either.

Additionally, you should not offer Audiobus support as an in-app purchase, as this violates the "just works" principle. We would be unable to list such apps in our Compatible Applications directory due to the customer frustration and support requests this would generate.

We reserve the option to remove apps offering Audiobus support as an in-app purchase from the Audiobus Compatible Apps directory, or to ban them from Audiobus entirely.

Audiobus' audio sender port is extremely lightweight when not connected: the send function ABAudioSenderPortSend will consume a negligible amount of CPU, so you can use it even while not connected to Audiobus, for convenience.

If you find yourself implementing stuff that seems like it should've been in Audiobus, tell us. It's probably already in there. If it's not, we'd be happy to consider putting it in ourselves so you, and those who come after you, don't have to.

In short: whenever possible, keep it simple. Your users will thank you, and you'll have more development time to devote to the things you care about.

1. Determine if your app will work with Audiobus

Audiobus relies heavily on multitasking, and it is vital that apps working together can perform adequately alongside other apps in a low-latency Audiobus environment.

The primary factor affecting whether your app will work with Audiobus is whether your app can perform properly with a hardware IO buffer duration of 5ms (256 frames at 44.1kHz, 128 frames at 22kHz, etc) while other apps are running.

Your app must be prepared to handle a buffer length of 5ms (256 frames), when running alongside other apps, without glitching, on the iPad 3 and above, or iPhone 5 and above. You can test this prior to beginning implementation of Audiobus support by opening the Audiobus app, with your app closed, then opening your app afterwards, which should force your app to a 5ms buffer duration. Push your app hard, and listen for glitches in the audio output. Ideally, you should also test while running additional audio apps in the background.
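
If you'd like to force a short buffer duration from within your own test builds, you can request one via AVAudioSession. Here's a minimal sketch; note that the system may grant a slightly different duration than the one requested:

NSError *error = nil;
// Request a ~5ms hardware IO buffer (256 frames at 44.1kHz)
if ( ![[AVAudioSession sharedInstance] setPreferredIOBufferDuration:0.005 error:&error] ) {
    NSLog(@"Couldn't set preferred IO buffer duration: %@", error);
}
// Check what was actually granted
NSLog(@"IO buffer duration: %f", [AVAudioSession sharedInstance].IOBufferDuration);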

If your app does not support a hardware buffer duration of 5ms without demonstrating performance problems on the iPad 3 and up, or the iPhone 5 and up, then we reserve the option to not list it in the Audiobus-compatible app listing on our website and within the Audiobus app, or to ban it from Audiobus entirely.

There is a sample project, audioIO, which can be used as a starting point for audio apps. Note that this code has a problem in its AudioUnitPropertyChangeDispatcher function, where it calls [audio addAudioUnitPropertyListener]. This modifies the audio unit property change notification dispatch table mid-dispatch, corrupting it so that other registered notify callbacks are never called. This causes problems within the Audiobus library, which relies on these notifications - in particular, it results in silent audio in sender and filter ports, among other things.

Removing this [audio addAudioUnitPropertyListener] line addresses the problem.

2. Add the Audiobus SDK to Your Project

Audiobus is distributed as an XCFramework, or as a static library plus the associated header files.

The easiest way to add Audiobus to your project is using CocoaPods:

  1. If you don't have a Podfile at the top level of your project, create a file called "Podfile".
  2. Open your Podfile and add the following code, replacing testTarget with the name of your target:
    target 'testTarget' do
    pod 'Audiobus'
    end
  3. In the terminal and in the same folder, type:
    pod install
    In the future when you're updating your app, use pod outdated to check for available updates, and pod update to apply those updates.

Alternatively, if you aren't using CocoaPods:

  1. If you wish to use the XCFramework, copy Audiobus.xcframework into an appropriate place within your project directory, then drag it into your Xcode project and make sure your app target is selected beneath "Add to targets". Then open the General tab for your app target, and under "Frameworks and Libraries" ensure that "Embed & Sign" is selected for Audiobus.xcframework.
  2. If you wish to use the static library, copy libAudiobus.a and the associated header files into an appropriate place within your project directory, then drag both the header files and libAudiobus.a into your project. In the sheet that appears, make sure your app target is selected beneath "Add to targets". Note that this will modify your app's "Header Search Paths" and "Library Search Paths" build settings.
  3. Ensure the following frameworks are added to your build process (to add frameworks, select your app target's "Link Binary With Libraries" build phase, and click the "+" button):
    • AVFoundation
    • CoreGraphics
    • Accelerate
    • AudioToolbox
    • QuartzCore
    • Security
    • libz.tbd

Note that for technical reasons the Audiobus SDK supports iOS 8.0 and up only.

3. Configure your project

Enable Background Audio and Inter-App Audio

If you haven't already done so, you must enable background audio and Inter-App Audio in your app – even if you plan to create a MIDI-only app.

To enable these:

  1. Open your app target screen within Xcode by selecting your project entry at the top of Xcode's Project Navigator, and selecting your app from under the "TARGETS" heading.
  2. Select the "Capabilities" tab.
  3. Underneath the "Background Modes" section, make sure you have "Audio and AirPlay" ticked.
  4. To the right of the "Inter-App Audio" title, turn the switch to the "ON" position – this will cause Xcode to update your App ID with Apple's "Certificates, Identifiers & Profiles" portal, and create or update an Entitlements file.

Managing Your App's Life-Cycle

Your app will only continue to run in the background if you have an active, running audio system. This means that if you stop your audio system while your app is in the background or moving to the background, your app will cease to run and will become unresponsive to Audiobus.

Consequently, care must be taken to ensure your app is running and available when it needs to be.

Firstly, you must ensure you have a running and active audio session once your app is connected via Audiobus, regardless of the state of your app (even if you get an audio session interruption). There are three parts to this:

  1. Make sure you only instantiate the Audiobus controller (Step 6) once your audio system is running.
  2. Register to receive ABConnectionsChangedNotification notifications (or observe ABAudiobusController's connected property), and start your audio engine if the Audiobus controller is connected (see the sketch after this list).
  3. If you get an audio session interruption, and the Audiobus controller is connected, restart your audio system.
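
For example, here's a minimal sketch of the notification approach; the audioEngine property and its running/start methods are stand-ins for your own audio system:

__weak typeof(self) weakSelf = self;
[[NSNotificationCenter defaultCenter] addObserverForName:ABConnectionsChangedNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *notification) {
    // Start the audio system if we've just been connected and it isn't running yet
    if ( weakSelf.audiobusController.connected && !weakSelf.audioEngine.running ) {
        [weakSelf.audioEngine start];
    }
}];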

If you do not do this correctly, your app may suspend in the background before an Audiobus connection has been completed, rendering it unable to work with Audiobus.

Secondly, you should suspend your app (by stopping your audio system) when it becomes disconnected from Audiobus – even if your app has a "Run in Background" setting. This helps to avoid 'zombie' apps sticking around while being invisible in the multitasking menu.

You must not under any circumstances suspend your app if the connected property of the Audiobus controller is YES, even if you receive an audio session interruption. If you do, then Audiobus will cease to function properly with your app, and your users will hear an unpleasant buzz sound.

The following describes the background policy we strongly recommend for use with Audiobus.

  1. When your app moves to the background, you should only stop your audio engine if you are not currently connected via either Audiobus or Inter-App Audio, which can be determined via the connected property of ABAudiobusController. For example:
    -(void)applicationDidEnterBackground:(NSNotification *)notification {
        if ( !_audiobusController.connected ) {
            // Fade out and stop the audio engine, suspending the app, if we're not connected
            [ABAudioUnitFader fadeOutAudioUnit:_audioEngine.audioUnit completionBlock:^{ [_audioEngine stop]; }];
        }
    }
  2. Your app should continue to remain active in the background while connected. When you are disconnected, your app should suspend too. You can do this by observing the above property. Once the value is NO, stop your audio engine as appropriate:
    static void * kAudiobusConnectedChanged = &kAudiobusConnectedChanged;
    ...
    // Watch the connected property
    [self.audiobusController addObserver:self
                              forKeyPath:@"connected"
                                 options:0
                                 context:kAudiobusConnectedChanged];
    ...
    -(void)observeValueForKeyPath:(NSString *)keyPath
                         ofObject:(id)object
                           change:(NSDictionary *)change
                          context:(void *)context {
        if ( context == kAudiobusConnectedChanged ) {
            if ( [UIApplication sharedApplication].applicationState == UIApplicationStateBackground
                 && !_audiobusController.connected ) {
                // Disconnected. Time to sleep.
                [_audioEngine stop];
            }
        } else {
            [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        }
    }
  3. When your app moves to the foreground, start your audio engine:
    -(void)applicationWillEnterForeground:(NSNotification *)notification {
        if ( !_audioEngine.running ) {
            // Start the audio system if it wasn't running
            [_audioEngine start];
        }
    }
  4. If you receive an audio session interruption, and your app is still connected to Audiobus, restart your audio system immediately.
    __weak typeof(self) weakSelf = self;
    [[NSNotificationCenter defaultCenter] addObserverForName:AVAudioSessionInterruptionNotification object:nil queue:nil usingBlock:^(NSNotification *notification) {
        NSInteger type = [notification.userInfo[AVAudioSessionInterruptionTypeKey] integerValue];
        if ( type == AVAudioSessionInterruptionTypeBegan && !weakSelf.audiobusController.connected ) {
            [weakSelf stop];
        } else {
            [weakSelf start];
        }
    }];

Note that during development, if you have not yet registered your app with Audiobus (Step 4), the Audiobus app will only be able to see your app while it is running. Consequently we strongly recommend registering your app before you begin testing.

Set up a Launch URL

Audiobus needs a URL (like YourApp-1.0.audiobus://) that can be used to launch and switch to your app, and to determine if your app is installed.

The URL scheme needs to end in ".audiobus", to ensure that Audiobus app URLs are unique. This URL also needs to be unique to each version of your app, so Audiobus can tell each version apart, which is important when you add new Audiobus features.

Here's how to add the new URL scheme to your app.

  1. Open your app target screen within Xcode by selecting your project entry at the top of Xcode's Project Navigator, and selecting your app from under the "TARGETS" heading.
  2. Select the "Info" tab.
  3. Open the "URL types" group at the bottom.
  4. If you don't already have a URL type created, click the "Add" button at the bottom left. Then enter an identifier for the URL (a reverse DNS string that identifies your app, like "com.yourcompany.yourapp", will suffice).
  5. If you already have existing URL schemes defined for your app, add a comma and space (", ") after the last one in URL Schemes field (Note: the space after the comma is important).
  6. Now enter the new Audiobus URL scheme for your app, such as "YourApp-1.0.audiobus" (note that this is just the URL scheme component, not including the "://" characters).

Other apps will now be able to switch to your app by opening the YourApp-1.0.audiobus:// URL.
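
For reference, the resulting Info.plist entry looks something like this (a sketch; the identifier and scheme are placeholders for your own values):

<key>CFBundleURLTypes</key>
<array>
    <dict>
        <key>CFBundleURLName</key>
        <string>com.yourcompany.yourapp</string>
        <key>CFBundleURLSchemes</key>
        <array>
            <string>YourApp-1.0.audiobus</string>
        </array>
    </dict>
</array>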

Add Network Privacy Info.plist Entries

As of iOS 14, any app that uses networking - which includes any app that uses the Audiobus SDK - must include a NSLocalNetworkUsageDescription entry within the Info.plist describing how the app uses networking, and a NSBonjourServices entry listing the Bonjour service types.

To add these values:

  1. Open your app target screen within Xcode by selecting your project entry at the top of Xcode's Project Navigator, and selecting your app from under the "TARGETS" heading.
  2. Select the "Info" tab.
  3. Select any entry in the list entitled "Custom iOS Target Properties", and press Enter to add a new row.
  4. For the key, enter "NSLocalNetworkUsageDescription", make sure "String" is selected for the type, then for the value, enter something like "YourAppName uses networking to communicate with Audiobus and other music apps."
  5. Press Enter again to add a second row, and enter "NSBonjourServices" for the key. Select "Array" for the type, and then click the triangle to expand the list, and for Item 0 enter "_audiobus._udp" for the value.
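
The equivalent Info.plist source entries look like this (a sketch; the usage description string is yours to write):

<key>NSLocalNetworkUsageDescription</key>
<string>YourAppName uses networking to communicate with Audiobus and other music apps.</string>
<key>NSBonjourServices</key>
<array>
    <string>_audiobus._udp</string>
</array>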

4. Register Your App and Generate Your API Key

Audiobus contains an app registry which is used to enumerate Audiobus-compatible apps that are installed. This allows apps to be seen by Audiobus even if they are not actively running in the background. The registry also allows users to discover and purchase apps that support Audiobus.

Register your app, and receive an Audiobus API key, at the Audiobus app registration page.

You'll need to provide various details about your app, and you'll need to provide a copy of your compiled Info.plist from your compiled app bundle, which Audiobus will use to populate the required fields. You'll be able to edit all of these details up until the time you go live with your app.

You must provide the compiled version of your Info.plist, not the one from your project folder. You can find this by building your app, right-clicking on the app in the "Products" group of the Xcode project navigator, and clicking "Show in Finder", then right-clicking on the app bundle and selecting "Show Package Contents".

After you register, we will briefly review your application. Upon approval, you will be notified via email, which will include your Audiobus API key, and the app will be added to the Audiobus registry.

You can always look up your API key by visiting https://developer.audiob.us/apps and clicking on your app. The API key is at the top of the app details page.

The API key is a string that you provide when you use the Audiobus SDK. It is unique to each version of your app, and tied to your bundle name and launch URL. It will be checked by the SDK upon initialisation, to provide automatic error checking. No network connection is required to verify the key.

Note that while registering your app will not cause it to appear on our website or in the "Apps" tab in the app, it will cause it to appear within the XML feed that Audiobus downloads to keep track of which of the installed apps support Audiobus.

This will not cause your app to appear within Audiobus' app listings, because you chose a new, unique URL in Step 3, but a dedicated user with a packet sniffer may see your app in the XML stream. Additionally, while we do not make the URL to this feed public, the feed itself is publicly-accessible.

The Audiobus app downloads registry updates from our servers once every 30 minutes, so once we approve your submission, we recommend that you reinstall the Audiobus app to force it to update immediately, so you can begin working.

To make your app appear on the Audiobus website or in the in-app Compatible Apps directory, and therefore give Audiobus users the ability to purchase your app, you need to make your app live (Step 11). Do this only when the Audiobus-compatible version of your app goes live on the App Store, so as not to confuse users.

As you develop your app further, beyond this initial integration of Audiobus, we recommend you register new versions of your app with us when you add new Audiobus functionality, like adding new ports or implementing features like state saving. This will both allow Audiobus to correctly advertise the new features in your new version, and will boost your sales when your app appears at the top of our compatible apps directory again. You can register new versions of your app by clicking "Add Version" on your app page.

5. Enable mixing audio with other apps

When you use audio on iOS, you typically select one of several audio session categories, usually either AVAudioSessionCategoryPlayAndRecord or AVAudioSessionCategoryPlayback.

By default, both of these categories will cause iOS to interrupt the audio session of any other app running at the time your app is started, forcing the other app to suspend.

If you are using either of these categories, then in order to use Audiobus you need to override this default, and tell iOS to allow other apps to run at the same time and mix the output of all running apps.

To do this, you need to set the AVAudioSessionCategoryOptionMixWithOthers flag, like so:

NSString *category = AVAudioSessionCategoryPlayAndRecord;
AVAudioSessionCategoryOptions options = AVAudioSessionCategoryOptionMixWithOthers;
NSError *error = nil;
if ( ![[AVAudioSession sharedInstance] setCategory:category withOptions:options error:&error] ) {
    NSLog(@"Couldn't set audio session category: %@", error);
}

Note that with the old Audio Session C API, adjusting other session properties can interfere with this property setting, causing other apps to be interrupted despite the mix property being set. To avoid problems, we recommend only using the modern AVAudioSession API. If you do need to use the older C API though, be sure to reset the kAudioSessionProperty_OverrideCategoryMixWithOthers property value whenever you assign any audio session properties.

If you're interacting with the audio session (via AVAudioSession or the old C API), you must set the audio session category and "mix with others" flag before setting the audio session active. If you do this the other way around, you'll get some weird behaviour, like silent output when used with IAA.

6. Instantiate the Audiobus Controller

Next, you need to create a strong property for an instance of the Audiobus Controller. A convenient place to do this is in your app's delegate, or within your audio engine class.

First, import the Audiobus header from your class's implementation file:

#import "Audiobus.h"

Next declare a strong (retaining) property for the instance from within a class extension:

@interface MyAppDelegate ()
@property (strong, nonatomic) ABAudiobusController *audiobusController;
@end

Now you'll need to create an instance of the Audiobus controller. A convenient place to do this is in your app delegate's application:didFinishLaunchingWithOptions: method, or perhaps within your audio engine's initialiser, but there are three very important caveats:

First: you must either start your audio system at the same time as you initialise Audiobus, or you must watch for ABConnectionsChangedNotification and start your audio system when the ABConnectionsChangedNotification is received. This is because as soon as your app is connected via Audiobus, your app must have a running and active audio system, or a race condition may occur wherein your app may suspend in the background before an Audiobus connection has been completed.

Second: you must instantiate the Audiobus controller on the main thread only. If you do not, Audiobus will trigger an assertion.

Third: you must not hold up the main thread after initialising the Audiobus controller. Due to an issue in Apple's service browser code, if the main thread is blocked for more than a couple of seconds, Audiobus peer discovery will fail, causing your app to refuse to respond to the Audiobus app. If you need to take more than a second or two to initialise your app, initialise the Audiobus controller afterwards, or do that processing in a background thread.

You must initialise ABAudiobusController as close to app launch as is possible, and you must keep the instance around for the entire life of your app. If you release and create a new instance of ABAudiobusController, you will see some odd behaviour, such as your app failing to connect to Audiobus.

Create the ABAudiobusController instance, passing it the API key that you generated when you registered your app in Step 4:

self.audiobusController = [[ABAudiobusController alloc] initWithApiKey:@"YOUR-API-KEY"];

At certain times, Audiobus will display the Connection Panel within your app. This is a slim panel that appears at the side of the screen, that users can drag off the screen, and swipe from the edge of the screen to re-display, a bit like the iOS notification screen.

By default, the Connection Panel appears at the right of the screen. If this does not work well with your app's UI, you can select another location for the panel:

self.audiobusController.connectionPanelPosition = ABConnectionPanelPositionLeft;

You can change this value at any time (such as after significant user interface orientation changes), and Audiobus will automatically animate the panel to the new location.

If the connection panel is on the bottom of the screen, it cannot be hidden by the user. This is to avoid interference by the iOS Control Center panel.

7. Create Ports

Now you're ready to create your Audiobus ports.

You can make as many ports as you like. For example, a multi-track recorder could provide per-track outputs, or an effect app with side-chain processing could create a main effect port, and a sidechain port. We recommend being generous with your port offering, to enable maximum flexibility, such as per-track routing. Take a look at Loopy or Loopy HD for an example of the use of multiple ports.

If you plan to work with MIDI in your app, you may want to create some MIDI ports, as well; see Working With MIDI for more information.

Note that you should create all your ports when your app starts, regardless of whether you intend to use them straight away, or you'll get some weird behaviour. If you're not using them, just keep them silent (or inactive, by not calling the receive/send functions).

Due to some changes since iOS 9, we now strongly discourage you from creating apps that have only a receiver port (ABAudioReceiverPort). Such apps will not be able to be identified as installed by Audiobus on iOS 9 or later.

To repeat: you should create a sender port or a filter port, in addition to any receiver ports you require. If this is simply not an option, please contact us before proceeding.

This is a limitation enforced by security changes since iOS 9 that prohibit Audiobus from detecting installed apps unless they provide sender or filter ports.

If you are using the Audio Queue API in your app, unfortunately there is no convenient way to support sending audio via Audiobus. You can, however, support receiving from Audiobus.

Audio Sender Port

In almost all cases, you'll need to create an ABAudioSenderPort. Even if you don't intend your app to send audio, a sender port is required in order to detect your app, and to launch it.

The first sender port you define will be the one that Audiobus will connect to when the user taps your app in the port picker within Audiobus, so it's best to define the port with the most general, default behaviour first.

Sender ports can be hidden from within Audiobus, by ticking the "Hidden" checkbox by the port entry at the Audiobus Developer Center.

Firstly, you'll need to create an AudioComponents entry within your app's Info.plist. This identifies your port to other apps. If you have integrated Inter-App Audio separately, and you already have an AudioComponents entry, you can use these values with your ABAudioSenderPort without issue. Otherwise:

  1. Open your app target screen within Xcode by selecting your project entry at the top of Xcode's Project Navigator, and selecting your app from under the "TARGETS" heading.
  2. Select the "Info" tab.
  3. If you don't already have an "AudioComponents" group, then under the "Custom iOS Target Properties" group, right-click and select "Add Row", then name it "AudioComponents". Set the type to "Array" in the second column.
  4. Open up the "AudioComponents" group by clicking on the disclosure triangle, then right-click on "AudioComponents" and select "Add Row". Set the type of the row in the second column to "Dictionary". Now make sure the new row is selected, and open up the new group using its disclosure triangle.
  5. Create five different new rows, by pressing Enter to create a new row and editing its properties:
    • "manufacturer" (of type String): set this to any four-letter code that identifies you (like "abus")
    • "type" (of type String): set this to "aurg", which means that we are identifying a "Remote Generator" audio unit, or "auri" for a "Remote Instrument" unit which can receive MIDI.
    • "subtype" (of type String): set this to any four-letter code that identifies the port.
    • "name" (of type String): Apple recommend that you set this to "Manufacturer: App name" (see WWDC 2013 session 602, page 37). If you publish multiple ports, you will need to identify the particular port, too. We propose "Manufacturer: App name (Port name)". Note that this field does not need to match the name or title you pass to ABAudioSenderPort.
    • "version" (of type Number): set this to any integer (whole number) you like. "1" is a good place to start.

Once you're done, it should look something like this:
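
In Info.plist source form, the entry reads something like this (a sketch; the four-letter codes are placeholders matching the example initialiser shown later in this step):

<key>AudioComponents</key>
<array>
    <dict>
        <key>manufacturer</key>
        <string>manu</string>
        <key>type</key>
        <string>aurg</string>
        <key>subtype</key>
        <string>subt</string>
        <key>name</key>
        <string>Manufacturer: App name (Port name)</string>
        <key>version</key>
        <integer>1</integer>
    </dict>
</array>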

It's very important that you use a different AudioComponentDescription for each port (represented by the type, subtype and manufacturer fields). If you don't have a unique AudioComponentDescription per port, you'll get all sorts of Inter-App Audio errors (like error -66750 or -10879).

The Audiobus Developer Center will check that your AudioComponentDescriptions are unique among the Audiobus community, but this is not a guarantee of uniqueness among non-Audiobus apps.

If you intend to work with MIDI, you may wish to specify the Remote Instrument ('auri') type for your audio component. See Working With MIDI for more information.

If you create a port of Remote Instrument ('auri') you need to make sure that your port is not receiving twice, one time via Inter-App Audio and a second time via Core MIDI. To ensure this, you must assign a block to the enableReceivingCoreMIDIBlock property. Audiobus calls this block to tell your app exactly when to enable or disable receiving via Core MIDI. See here for more details.
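
A minimal sketch, assuming the block lives on your ABAudiobusController instance and that your app has a hypothetical setCoreMIDIInputEnabled: method for toggling its own Core MIDI input:

__weak typeof(self) weakSelf = self;
_audiobusController.enableReceivingCoreMIDIBlock = ^(BOOL receivingEnabled) {
    // Audiobus tells us when to listen to Core MIDI directly, and when to ignore
    // it so the same events aren't received twice (via both IAA and Core MIDI)
    [weakSelf setCoreMIDIInputEnabled:receivingEnabled];
};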

If you wish to use more than one AudioComponentDescription to publish the port, to provide both Remote Generator and Remote Instrument types for example, you may provide the additional AudioComponentDescription to the sender port via ABAudioSenderPort's registerAdditionalAudioComponentDescription: method (you will need to call AudioOutputUnitPublish for the additional types yourself).

Now it's time to create an ABAudioSenderPort instance. You provide a port name, for internal use, and a port title which is displayed to the user. You can localise the port title.

You may choose to provide your IO audio unit (of type kAudioUnitSubType_RemoteIO), which will cause the sender port to automatically capture and send the audio output. This is the recommended, easiest, and most efficient approach. If you are using the C Core Audio API, this will be your main output unit. If you are using AVAudioEngine, you can access this via AVAudioOutputNode/AVAudioInputNode's "audioUnit" property.

Alternatively, if you're creating secondary ports, or have another good reason for not using your IO audio unit with the sender port at all, then you send audio by calling ABAudioSenderPortSend, then mute your audio output depending on the value of ABAudioSenderPortIsMuted.
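
For instance, here's a rough sketch of that pattern inside a Remote IO render callback, assuming a senderPort instance variable and the documented ABAudioSenderPortSend arguments of port, buffer list, frame count and timestamp:

static OSStatus renderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber,
                               UInt32 inNumberFrames, AudioBufferList *ioData) {
    __unsafe_unretained MyAudioEngine *THIS = (__bridge MyAudioEngine *)inRefCon; // hypothetical engine class

    // ... render your audio into ioData here ...

    // Pass the rendered audio to Audiobus; this is cheap when not connected
    ABAudioSenderPortSend(THIS->_senderPort, ioData, inNumberFrames, inTimeStamp);

    // Mute our own output when Audiobus asks, to avoid doubling up the signal
    if ( ABAudioSenderPortIsMuted(THIS->_senderPort) ) {
        for ( UInt32 i = 0; i < ioData->mNumberBuffers; i++ ) {
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        }
    }
    return noErr;
}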

ABAudioSenderPort when initialized without an audio unit will create and publish its own audio unit with the AudioComponentDescription you pass into the initializer. If you are planning on using ABAudioSenderPort without an audio unit (you're not passing an audio unit into the initializer), then you must not publish any other audio unit with the same AudioComponentDescription. Otherwise, two audio units will be published with the same AudioComponentDescription, which would be bad, and would result in unexpected behaviour like silent output.
If you're using ABAudioSenderPort without an audio unit for the purposes of offering a new, separate audio stream with a different AudioComponentDescription, though, you're fine.

If you are using a sender port and not initialising it with your audio unit, you must mute your app's corresponding audio output when needed, depending on the value of the ABAudioSenderPortIsMuted function. This is very important and both avoids doubling up the audio signal, and lets your app go silent when removed from Audiobus. See the Sender Port recipe and the AB Receiver sample app for details.

If you work with floating-point audio in your app we strongly recommend you restrict values to the range -1.0 to 1.0, as a courtesy to developers of downstream apps.
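
If you want to enforce this cheaply, you can clip buffers in place before sending them; a sketch using the Accelerate framework (already linked in Step 2):

// Hypothetical helper: clamp a buffer of floats to [-1.0, 1.0] in place
static inline void clampBufferToUnityRange(float *buffer, UInt32 frames) {
    static const float kMin = -1.0f, kMax = 1.0f;
    vDSP_vclip(buffer, 1, &kMin, &kMax, buffer, 1, frames);
}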

As of iOS 13, an app launched into the background is only permitted to begin recording if the Remote IO node is currently being hosted via Inter-App Audio. If your app requires recording abilities at launch, and you are using ABAudioSenderPort without an audio unit and just calling ABAudioSenderPortSend manually, then you should tick the box on the Audiobus app registration page labelled "Launch Manually". This will make Audiobus require a tap to launch your app into the foreground, where it will be allowed to begin recording, rather than automatically launching it into the background.

Finally, you need to pass in an AudioComponentDescription structure that contains the same details as the AudioComponents entry you added earlier.

self.audioSenderPort = [[ABAudioSenderPort alloc] initWithName:@"Audio Output"
                                                         title:NSLocalizedString(@"Main App Output", @"")
                                     audioComponentDescription:(AudioComponentDescription) {
                                         .componentType = kAudioUnitType_RemoteGenerator,
                                         .componentSubType = 'subt', // Note single quotes
                                         .componentManufacturer = 'manu' }
                                                     audioUnit:_audioUnit];
[self.audiobusController addAudioSenderPort:self.audioSenderPort];

If your sender port's audio comes from the system audio input (such as a microphone), then you should set the port's derivedFromLiveAudioSource property to YES to allow Audiobus to warn users if they are in danger of creating audio feedback.
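
Setting it is a one-liner, using the sender port created above:

self.audioSenderPort.derivedFromLiveAudioSource = YES;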

Please note that you should not split up stereo Audiobus streams into two separate channels, treated differently. You should always treat audio from Audiobus as one, 2-channel stream.

You may also optionally provide an icon (a 32x32 mask, with transparency) via the icon property, which is also displayed to the user and can change dynamically. We strongly recommend providing icons if you publish more than one port, so these can be recognized from one another. If you provide an icon here, you should also add that icon to the port on your app's registry on our developer site, so it can be displayed to users prior to your app being launched.

Audio Filter Port

If you intend to filter audio, to act as an audio effect, then create an ABAudioFilterPort.

This process is very similar to creating a sender port. You need to create an Info.plist AudioComponents entry for your port, this time using 'aurx' as the type, which identifies the port as a Remote Effect, or 'aurm' which identifies it as a Remote Music Effect capable of receiving MIDI.

If you create a port of Remote Music Effect ('aurm') you need to make sure that your port is not receiving twice, one time via Inter-App Audio and a second time via Core MIDI. To ensure this, you must assign a block to the enableReceivingCoreMIDIBlock property. Audiobus calls this block to tell your app exactly when to enable or disable receiving via Core MIDI. See here for more details.

It's very important that you use a different AudioComponentDescription for each port (represented by the type, subtype and manufacturer fields). If you don't have a unique AudioComponentDescription per port, you'll get all sorts of Inter-App Audio errors (like error -66750 or -10879).

The Audiobus Developer Center will check that your AudioComponentDescriptions are unique among the Audiobus community, but this is not a guarantee of uniqueness among non-Audiobus apps.

Then you create an ABAudioFilterPort instance, passing in the port name, for internal use, and a title for display to the user.

Again, you may provide your IO audio unit (of type kAudioUnitSubType_RemoteIO, with input enabled), which will cause the filter to use your audio unit for processing. This is the easiest, most efficient and recommended approach. As mentioned above, if you are using the C Core Audio API, this will be your main output unit. If you are using AVAudioEngine, you can access this via AVAudioOutputNode/AVAudioInputNode's "audioUnit" property.

self.filter = [[ABAudioFilterPort alloc] initWithName:@"Main Effect"
                                                title:@"Main Effect"
                            audioComponentDescription:(AudioComponentDescription) {
                                .componentType = kAudioUnitType_RemoteEffect,
                                .componentSubType = 'myfx',
                                .componentManufacturer = 'you!' }
                                            audioUnit:_ioUnit];
[self.audiobusController addAudioFilterPort:_filter];

Alternatively, if you have a good reason for not using your IO audio unit with the filter port, you can use ABAudioFilterPort's process block initializer. This allows you to pass in a block to use for audio processing.

self.filter = [[ABAudioFilterPort alloc] initWithName:@"Main Effect"
                                                title:@"Main Effect"
                            audioComponentDescription:(AudioComponentDescription) {
                                .componentType = kAudioUnitType_RemoteEffect,
                                .componentSubType = 'myfx',
                                .componentManufacturer = 'you!' }
                                         processBlock:^(AudioBufferList *audio, UInt32 frames, AudioTimeStamp *timestamp) {
                                             // Process audio here
                                         } processBlockSize:0];

Note that if you intend to use a process block instead of an audio unit, you are responsible for muting your app's normal audio output when the filter port is connected. See the Filter Port recipe for details.

ABAudioFilterPort, when initialized with a filter block (instead of an audio unit) will create and publish its own audio unit with the AudioComponentDescription you pass into the initializer. If you are planning on using ABAudioFilterPort with a process block, instead of an audio unit, then you must not publish any other audio unit with the same AudioComponentDescription. Otherwise, two audio units will be published with the same AudioComponentDescription, which would be bad, and would result in unexpected behaviour like silent output.
If you're using ABAudioFilterPort with a filter block for the purposes of offering a new, separate audio processing facility, separate from your published audio unit, and with a different AudioComponentDescription, though, you're fine.

You may also optionally provide an icon (a 32x32 mask, with transparency) via the icon property, which is also displayed to the user and can change dynamically. We strongly recommend providing icons if you publish more than one port, so these can be recognized from one another. If you provide an icon here, you should also add that icon to the port on your app's registry on our developer site, so it can be displayed to users prior to your app being launched.

Audio Receiver Port

If you intend to receive audio, then you create an ABAudioReceiverPort.

ABAudioReceiverPort works slightly differently to ABAudioSenderPort and ABAudioFilterPort: it does not use an audio unit, nor does it require an AudioComponentDescription. Instead, you call ABAudioReceiverPortReceive to receive audio.

First, create the receiver, and store it so you can use it to receive audio:

@property (nonatomic, strong) ABAudioReceiverPort *receiverPort;
...
self.receiverPort = [[ABAudioReceiverPort alloc] initWithName:@"Audio Input"
                                                        title:NSLocalizedString(@"Main App Input", @"")];
[self.audiobusController addAudioReceiverPort:_receiverPort];

Now set up the port's clientFormat property to whatever PCM AudioStreamBasicDescription you are using (such as non-interleaved stereo floating-point PCM):

AudioStreamBasicDescription audioDescription = {
    .mFormatID = kAudioFormatLinearPCM,
    .mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved,
    .mChannelsPerFrame = 2,
    .mBytesPerPacket = sizeof(float),
    .mFramesPerPacket = 1,
    .mBytesPerFrame = sizeof(float),
    .mBitsPerChannel = 8 * sizeof(float),
    .mSampleRate = 44100.0
};
_receiverPort.clientFormat = audioDescription;

Now you may receive audio using ABAudioReceiverPortReceive, in a similar fashion to calling AudioUnitRender on an audio unit. For example, within a Remote IO input callback, you might write:

AudioTimeStamp timestamp = *inTimeStamp;
if ( ABAudioReceiverPortIsConnected(self->_receiverPort) ) {
    // Receive audio from Audiobus, if connected. Note that we also fetch the timestamp here, which is
    // useful for latency compensation, where appropriate.
    ABAudioReceiverPortReceive(self->_receiverPort, nil, ioData, inNumberFrames, &timestamp);
} else {
    // Receive audio from system input otherwise
    AudioUnitRender(self->_audioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData);
}

Just as with AudioUnitRender, it's important to continually call ABAudioReceiverPortReceive once ABAudioReceiverPortIsConnected returns YES, even if you're not currently using the returned audio. If you don't do this, your app will not work correctly.

The receiver port assumes you provide monitoring - where you pass the incoming audio to the system output so the user can hear it. If you do not do so, the user won't be able to hear any apps that send audio to your app. If that's the case, ABAudioReceiverPort provides an automatic monitoring facility for you: just set automaticMonitoring to YES to use it.
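
For example, to enable the automatic monitoring facility on the receiver port created above:

_receiverPort.automaticMonitoring = YES;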

See The Receiver Port or the Receiver Port recipe for more info on receiving.

You may also optionally provide an icon (a 32x32 mask, with transparency) via the icon property, which is also displayed to the user and can change dynamically. We strongly recommend providing icons if you publish more than one port, so these can be recognized from one another. If you provide an icon here, you should also add that icon to the port on your app's registry on our developer site, so it can be displayed to users prior to your app being launched.

If you wish to receive multi-channel audio, with one audio stream for each connected app, see the section on receiving separate streams.

As of iOS 14, it's recommended that you perform one extra configuration step, which will allow the Audiobus SDK to defer starting networking - and defer the local network permissions prompt - until your app is connected. See the documentation for the startNetworkCommunications method for details.

MIDI Ports

If you intend to work with MIDI in your app, you may wish to create some MIDI sender, filter or receiver ports as well. See Working With MIDI for more information.

Note that if you only wish to respond to MIDI messages to generate audio, then you do not need to create a MIDI receiver port: you just need to specify a type of Remote Instrument ('auri') when creating an ABAudioSenderPort, and respond to MIDI messages via Inter-App Audio's kAudioOutputUnitProperty_MIDICallbacks mechanism. Audiobus will do the rest.
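
Here's a rough sketch of registering for those callbacks on your Remote IO unit; MyMIDIEventProc and MyMIDISysExProc are hypothetical handlers of your own:

static void MyMIDIEventProc(void *userData, UInt32 inStatus, UInt32 inData1,
                            UInt32 inData2, UInt32 inOffsetSampleFrame) {
    // Respond to a channel MIDI event (e.g. note on/off) here
}
static void MyMIDISysExProc(void *userData, const UInt8 *inData, UInt32 inLength) {
    // Respond to system exclusive data here, if your app needs it
}
...
AudioOutputUnitMIDICallbacks callbacks = {
    .userData = NULL,
    .MIDIEventProc = MyMIDIEventProc,
    .MIDISysExProc = MyMIDISysExProc
};
AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_MIDICallbacks,
                     kAudioUnitScope_Global, 0, &callbacks, sizeof(callbacks));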

Update the Audiobus Registry

Once you've set up your ports, open your app page on the Audiobus Developer Center and fill in any missing port details.

We strongly recommend that you drop your compiled Info.plist into the indicated area in order to automatically populate the fields:

  1. This is much faster than putting them in yourself.
  2. This will ensure the details are free of errors, which could otherwise cause some "Port Unavailable" errors to be seen.
  3. This checks that you're not using AudioComponent fields that are already in use in another app, which would cause problems.

Filling in the port details here allows all of your app's ports to be seen within Audiobus prior to your app being launched.

It's important that you fill in the "Ports" section correctly, matching the values you are using with your instances of the ABAudioSenderPort, ABAudioFilterPort and ABAudioReceiverPort classes. If you don't do this correctly, you will see "Port Unavailable" messages within Audiobus when trying to use your app.

Once you have updated and saved the port information, you will get a new API key. Copy this API key to your application. On the next launch, the port configuration in your app is compared to the port information encoded in the API key. If mismatches are detected, detailed error messages are printed to the console. Check this to find out if anything is wrong with your port registration.

8. Show and hide Inter-App Audio Transport Panel

If your app shows an Inter-App Audio transport panel, you will need to hide it while participating in an Audiobus session. To do so, assign a block to the property showInterAppAudioTransportPanelBlock of your ABAudiobusController instance. Within the block you need to show or hide your Inter-App Audio Transport panel accordingly:

_audiobusController.showInterAppAudioTransportPanelBlock = ^(BOOL showIAAPanel) {
    if ( showIAAPanel ) {
        // TODO: Show Inter-App Audio Transport Panel
    } else {
        // TODO: Hide Inter-App Audio Transport Panel
    }
};

9. If your app is an IAA host, do not show Audiobus' hidden sender ports

Audiobus provides a number of intermediate sender ports. These ports are only used internally by the Audiobus SDK. If your app is an IAA host (like a multitrack recorder or any sort of recording app) you should hide these ports in the list of available Inter-App Audio nodes. To check if an audio component description belongs to a hidden Audiobus sender port, you can use the following function declared in ABCommon.h:

BOOL ABIsHiddenAudiobusPort(AudioComponentDescription audioComponentDescription);

Here is a code fragment showing how to generate an Inter-App Audio port list that does not contain the hidden Audiobus intermediate ports:

- (void)refreshAUList {
    [_publishedInstruments removeAllObjects];
    AudioComponentDescription searchDesc = { 0, 0, 0, 0, 0 };
    AudioComponent comp = NULL;
    while ( true ) {
        comp = AudioComponentFindNext(comp, &searchDesc);
        if ( comp == NULL ) break;
        AudioComponentDescription desc;
        if ( AudioComponentGetDescription(comp, &desc) ) continue;
        // Ignore hidden Audiobus Inter-App Audio nodes
        if ( ABIsHiddenAudiobusPort(desc) ) continue;
        // Fill list of other Inter-App Audio nodes
        if ( desc.componentType == kAudioUnitType_RemoteInstrument ||
             desc.componentType == kAudioUnitType_RemoteGenerator ) {
            RemoteAU *rau = [[RemoteAU alloc] init];
            rau->_desc = desc;
            rau->_comp = comp;
            rau->_image = [AudioComponentGetIcon(comp, 32) retain];
            AudioComponentCopyName(comp, (CFStringRef *)&rau->_name);
            [_publishedInstruments addObject:rau];
        }
    }
}

10. Test

To test your app with Audiobus, you'll need the Audiobus app (https://audiob.us/download).

You'll find a number of fully-functional sample apps in the "Samples" folder of the Audiobus SDK distribution. Use these to test your app with, along with other Audiobus-compatible apps you may own.

We reserve the right to ban your app from the Compatible Apps listing or even from Audiobus entirely, if it does not work correctly with Audiobus. It's critical that you test your app properly.

11. Go Live

Before you submit your app to the App Store, please ensure the details of your registration at the apps page are correct. If not, users may experience unexpected behaviour. The Audiobus app caches the local copy of the registration for 30 minutes, so if you make any fixes to your app registration after going live, some users may not see the fix for up to 30 minutes.

Once the Audiobus-compatible version of your app has been approved by Apple and hits the App Store, you should visit the apps page and click "Go Live".

This will result in your app being added to the Compatible Applications listing within Audiobus, and shown on Audiobus's website in various locations. We will also include your app in our daily app mailing list, and if anyone has subscribed at our compatible apps listing to be notified specifically when your app gains Audiobus support, they will be notified by email.

If you forget this step, potential new users will never find your app through our app directories, losing you sales!

You're Done!

Unless you want to do more advanced stuff, that's it, you're done. Run your app, open the Audiobus app, and you should see your app appear in the appropriate port picker in the Audiobus app, depending on the ports you created.

Congratulations! You are now Audiobus compatible.

When it's time to update your app with new Audiobus functionality (such as a new Audiobus SDK version, or a new Audiobus-specific feature, like State Saving or the addition of more ports), be sure to register your new version from your app registration on developer.audiob.us. Once you go live with the new version, this will move your app to the top of the Audiobus Compatible Apps directory, resulting in increased exposure. Note that you should never register your app twice: if you update your app, register a new version for your existing app registration.

The next thing to do is read the important notes on Being a Good Citizen to make sure your app behaves nicely with others. In particular, if your app records audio, it's important to make correct use of audio timestamps so Audiobus's latency compensation works properly in your app and those your app connects to.

If you are interested in handling MIDI in your app, read Working With MIDI for more information.

If your app provides both an ABAudioSenderPort and an ABAudioReceiverPort, you may wish to allow users to connect your app's output back to its input. If your app supports this kind of functionality, you can set the allowsConnectionsToSelf property to YES, and select the "Allows Connections To Self" checkbox on the app details page at developer.audiob.us, once you've ensured that your app doesn't exhibit feedback issues in this configuration. See the documentation for ABAudioSenderPortIsConnectedToSelf/ABAudioReceiverPortIsConnectedToSelf for discussion, and the AB Receiver sample app for a demonstration.
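
For example, assuming the property lives on your Audiobus controller instance as described above:

// Permit this app's output to be routed back into its own input
self.audiobusController.allowsConnectionsToSelf = YES;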

If you'd like to make your app more interactive, you can implement triggers that allow users to trigger actions in your app (like toggling recording, playback, etc) from other apps and devices.

If your app has a clock, we'd recommend looking into Ableton Link for synchronization with other apps. The Audiobus SDK automatically supports Link, and will enable it within your app when it's connected to Audiobus. There's nothing you need to do but include the Link SDK within your app.

The Audiobus app has a "Diagnostic Mode" setting in its System Preferences section. This mode makes Audiobus print additional information to the console, which can be helpful if you're encountering problems with Audiobus.

Finally, tell your users that you support Audiobus! We provide a set of graphical resources you can use on your site and in other promotional material. Take a look at the resources page for the details.

Read on if you want to know about more advanced uses of Audiobus, such as MIDI, multi-track receiving, triggers, or state saving.