
Open Music App Collaboration Manifesto

Author: Rolf Wöhrmann (info@temporubato.com) Date: 21st Aug 2011 Version: 1.0

This document is about how iOS apps should use MIDI while running on the same device. It provides a set of best practices which should make the user experience as great as possible for people who want to run apps in parallel, for example in these scenarios:

- A controller app (like SoundPrism, Polychord etc.) in front plays a sound generating app, e.g. a synth (like NLogSynth etc.), running in the background
- Two beat-oriented apps, e.g. a drum machine (like MoDrum, Molten, FunkBox etc.), running in sync with another app (like an NLogSynth arpeggio or BassLine etc.)
- A sequencer app controls other sound generating apps like synths, drum machines etc.
- An external MIDI controller plays a synth app running in the background while the iOS interface is used for an app in the front triggering loops
- Any combination of these scenarios: a sequencer app controls a drum box and an arpeggiated synth in the background while the user plays a controller app controlling another synth in the background

Here is an example: http://youtu.be/uksnTwaIrxk - CPU & RAM are the only limits!

Most of the best practices described here are neither rocket science nor my inventions. Other app makers like Synthetic Bits, Finger, One Red Dog Media etc. already did a great job with MIDI Sync. Audanika and myself from Tempo Rubato just did the controller plus synth model. So I have just tried to summarize what needs to be done to have a great user experience. It is the critical mass needed to let it rock for users. Surely more can be done.

Must Haves (Points 1-7)

1. Implement Background Audio (Sound Generating Apps)

If you make a sound generating app, implement background audio so that the app is capable of running its audio engine in the background. iOS makes this very, very easy and it is well described in the iOS docs.

- Take care not to depend on UI input
- Do not make calls to the UI thread
- Have a preferences setting to enable background audio
- Warn the user when enabling background audio for the first time that this is costly on battery resources, so it should be activated only when needed
- Optionally terminate background audio after x minutes of not receiving any MIDI event

All you have to do in iOS is to set UIBackgroundModes to 'audio' in Info.plist, use an appropriate AudioSession category and adhere to the practices Apple recommends for background audio.
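As a minimal sketch of the session part, assuming a synth that only plays back audio, the setup could look like the following (AVAudioSessionCategoryPlayback is just one reasonable choice; if several apps are supposed to sound at the same time, the session also has to allow mixing with other audio):

#import <AVFoundation/AVFoundation.h>

// Minimal background-audio session setup; error handling kept short for brevity.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
BOOL ok = [session setCategory:AVAudioSessionCategoryPlayback error:&error]
          && [session setActive:YES error:&error];
if (!ok) NSLog(@"AudioSession setup failed: %@", error);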

2. Switch off your Audio Engine (Controller Apps)

Controller apps sometimes also have their own audio engine. They should have an option to switch it off when they are used to control another app. Do not just set the internal volume to zero, since that still uses important CPU resources for nothing. Disable your audio engine completely.

3. Implement Core MIDI

Depending on the type of your app, integrate Core MIDI to send or receive notes, MIDI sync etc. Apple's docs about Core MIDI are a bit rudimentary, but some people have provided beginners' guides, like Synthetic Bits: http://syntheticbits.com/blog/?p=508

4. Check for new MIDI Devices while Running

Do not just check for devices at app startup. Implement a callback which is called from Core MIDI whenever the device setup changes. Otherwise the user has to manually kill and restart your app when a device is changed. Here is what I do in NLog:
ret = MIDIClientCreate(CFSTR("NLogSynth MIDI Client"),
                       NLogMIDIStateChangedHander, self, &client);
if (ret)
    NLogLog([NSError errorWithDomain:NSMachErrorDomain code:ret userInfo:nil]);

This hook is called when something important happens:


static void NLogMIDIStateChangedHander(const MIDINotification *message, void *refCon)
{
    if (message->messageID == kMIDIMsgSetupChanged) {
        CoreMidiHandler *coreMidi = (CoreMidiHandler *)refCon;
        [coreMidi connectSources];
        [coreMidi connectDestinations];
    }
}

You have to be careful about the fact that this hook may be called from whatever system thread happens to be running.
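What a method like connectSources might do, as a minimal sketch (assuming inputPort is an instance variable created earlier with MIDIInputPortCreate):

// Sketch: connect every currently available MIDI source to our input port.
// inputPort is assumed to be a MIDIPortRef created with MIDIInputPortCreate().
- (void)connectSources
{
    ItemCount count = MIDIGetNumberOfSources();
    for (ItemCount i = 0; i < count; i++) {
        MIDIEndpointRef source = MIDIGetSource(i);
        MIDIPortConnectSource(inputPort, source, NULL);   // NULL: no per-source refCon
    }
}

connectDestinations would similarly walk MIDIGetNumberOfDestinations / MIDIGetDestination, typically just remembering the endpoints to send to later. If the reconnect has to touch UI or other main-thread state, dispatch it to the main queue first. And as point 8 below argues, connecting to everything is only the basic behaviour; letting the user pick is better.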
5. Implement Virtual MIDI Ports

For sound generating apps it is vital to declare their own virtual MIDI ports, so that other apps can send them MIDI data and set up specific routings. Here is what I do in NLog:

// Make virtual input & output
ret = MIDIDestinationCreate(client, (CFStringRef)name, NLogMIDIReadProc, self, &virtInput);
ret = MIDIObjectSetIntegerProperty(virtInput, kMIDIPropertyUniqueID, NLOG_VIRT_INPUT_ID);
ret = MIDISourceCreate(client, (CFStringRef)name, &virtOutput);
ret = MIDIObjectSetIntegerProperty(virtOutput, kMIDIPropertyUniqueID, NLOG_VIRT_OUTPUT_ID);

Keep in mind that Core MIDI's naming for virtual ports is reversed: MIDIDestinationCreate creates an input port from your app's perspective, and vice versa. For virtual inputs you can use the same read proc as for normal MIDI sources; for virtual outputs you have to use MIDIReceived instead of MIDISend. Again, Apple reversed the naming perspective here.
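As a minimal sketch, assuming the virtOutput created above, sending a note-on out through the virtual output would then look roughly like this:

// Sketch: push a note-on to whoever listens to our virtual output.
// Timestamp 0 means "now"; schedule with real host timestamps where timing matters.
Byte buffer[64];
MIDIPacketList *packetList = (MIDIPacketList *)buffer;
MIDIPacket *packet = MIDIPacketListInit(packetList);
Byte noteOn[3] = { 0x90, 60, 100 };                    // note on, middle C, velocity 100
packet = MIDIPacketListAdd(packetList, sizeof(buffer), packet, 0, 3, noteOn);
MIDIReceived(virtOutput, packetList);                  // MIDIReceived, not MIDISend, for a virtual source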

6. Implement MIDI Sync (Beat Oriented Apps)

Beat oriented apps like drum boxes, sequencers, synths with arpeggios and generative music apps need to synchronize beat and tempo. MIDI sync has been used for this successfully for many years. It is not trivial and there are different approaches to it, but here is a nice tutorial, again from SyntheticBits, to start with: http://syntheticbits.com/blog/?p=1213
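As a minimal sketch of the receiving side: MIDI clock sends 24 ticks (status byte 0xF8) per quarter note, framed by Start (0xFA), Continue (0xFB) and Stop (0xFC). Since Core MIDI timestamps are in mach host time, a raw tempo estimate from two consecutive ticks could be derived like this (a real implementation needs smoothing and proper Start/Stop/Song Position handling):

#include <CoreMIDI/CoreMIDI.h>
#include <mach/mach_time.h>

// Sketch: estimate BPM from the interval between two MIDI clock ticks.
// 24 ticks per quarter note => BPM = 60 / (24 * seconds per tick).
static MIDITimeStamp lastClock = 0;

static double HandleMIDIClock(MIDITimeStamp now)       // returns 0.0 until two ticks were seen
{
    double bpm = 0.0;
    if (lastClock != 0) {
        mach_timebase_info_data_t tb;
        mach_timebase_info(&tb);
        double seconds = (double)(now - lastClock) * tb.numer / tb.denom / 1e9;
        if (seconds > 0.0)
            bpm = 60.0 / (24.0 * seconds);             // raw value - smooth it in practice
    }
    lastClock = now;
    return bpm;
}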

7. Do not Waste System Resources

As already mentioned in point 2, you have to be careful with system resources, especially CPU, RAM and connections to audio & MIDI devices. If you have a demanding audio engine, think about providing some degradation techniques: it is always better to reduce polyphony than to let the user experience clicks. For sure, there may be a point where your app does not get enough resources to do a meaningful job. If that is the case, let the user know instead of producing clicks. I know very well that this is tricky and we need some more experience with iOS here, but we should keep it in mind for the time to come.

Don't stay basic, make it better

8. Instead of connecting to all available devices, let the user make the selection

In a two app scenario it may not be vital, but it soon becomes a mess when people are using three or more apps. Do not think that this will not happen. In fact, it will happen immediately. I did it myself after 5 minutes: while playing NLog from SoundPrism I put MoDrum in the background to have a nice drum loop.

9. Implement Channel Mode instead of just Omni Mode

In addition to device selection, give the user the possibility to specify the MIDI channel your app receives or sends on. The same reasons apply as for device selection (point 8), but especially when using external hardware the external MIDI interface is typically represented as one device. With channel selection, different apps can route via the same hardware interface to different gear connected to that interface.

10. Implement MIDI Volume & Pan Control

If users have multiple sound generating apps running, they need to balance the audio mix. Switching between all the apps is cumbersome. If every sound generating app implements MIDI CC Volume & Pan, a mixing app (or even a connected hardware mixing controller) can do the mix.
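As a minimal sketch of points 9 and 10 combined (gReceiveChannel, SetSynthVolume and SetSynthPan are placeholders for app-specific code, and for brevity each packet is assumed to carry one complete message):

#include <CoreMIDI/CoreMIDI.h>

static int  gReceiveChannel = 0;                        // user-selected receive channel, 0-15 (placeholder)
static void SetSynthVolume(float v) { /* app specific */ }
static void SetSynthPan(float p)    { /* app specific */ }

// Sketch of a read proc that filters by channel and honours CC 7 (volume) and CC 10 (pan).
static void MyMIDIReadProc(const MIDIPacketList *list, void *readRefCon, void *srcRefCon)
{
    const MIDIPacket *packet = &list->packet[0];
    for (UInt32 i = 0; i < list->numPackets; i++) {
        const Byte *d = packet->data;
        if ((d[0] & 0x80) && d[0] < 0xF0) {             // channel voice message
            Byte status  = d[0] & 0xF0;
            Byte channel = d[0] & 0x0F;
            if (channel == gReceiveChannel) {
                if (status == 0xB0 && d[1] == 7)  SetSynthVolume(d[2] / 127.0f);
                if (status == 0xB0 && d[1] == 10) SetSynthPan(d[2] / 127.0f);
                // notes, pitch bend etc. go to the audio engine here
            }
        }
        packet = MIDIPacketNext(packet);
    }
}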

Future Things for Discussion
============================

There might be more between apps. So here are some topics for discussion. I have started a Google Group which can be used for this: http://groups.google.com/group/open-music-app-collaboration

11. Saving & Loading Projects

If you use multiple apps, you may need to coordinate that all apps save their data. iOS only allows each app to save in its own sandbox. There might be different solutions for this:

- Use a common project name and send all involved apps a specific MIDI SysEx to save/load the named project from their own sandbox
- OR let a central app collect project data from all involved apps via MIDI SysEx and save it locally in that central app. For loading, send the data back via SysEx again or use openURL for each app

12. Initiate MIDI Start & Stop from a non-Master App

If you use a controller and a synth for live performance and have another two beat-oriented apps synced, or just a drum box in the background, you may want to start/stop the beat-oriented apps from the live performance app without switching.

- MIDI messages could be used as in MIDI remote transport
- OR a specific MIDI SysEx

13. Shuffle Settings

Shuffle settings may need to be coordinated between multiple beat-oriented apps.

- Anything defined in MIDI yet?
- OR a specific MIDI SysEx

14. Pattern Changes

Many beat-oriented apps use patterns to differentiate parts and arrangements. As in point 12, there is a need to synchronously switch patterns without leaving a live performance app.

- Anything defined in MIDI yet?
- OR a specific MIDI SysEx

15. Virtual Audio

Apps may want to send audio to each other. It needs to be further explored what iOS provides or allows here.

- Anything in Core Audio already?
- OR agree on a common protocol and send audio via inter-app socket connections

16. Maybe You have Ten-thousand more Ideas

If not, users will have them soon ;-) Any feedback is highly welcome! Just use the Google Group (http://groups.google.com/group/open-music-app-collaboration) or email info@temporubato.com
