Wednesday, April 18, 2012
News from the trenches of augmented instrument design
Hey folks, out here in the perimeter, things are coming together. I purchased an Arduino and an ultrasonic rangefinder and I hope to have them talking to each other soon. Meanwhile, I’ve been exercising my iOS development chops in a few different ways.
First, working with libpd, I’ve built a rudimentary iOS music application that sends OSC messages to my computer. This is a simple app that I made to see how hard it would be to develop an iOS app to take care of the OSC messaging coming from the meta-trombone. The good news: libpd is awesome and quite easy to integrate into an iOS development project.
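For the curious, an OSC message on the wire is just an address string, a type-tag string, and big-endian arguments, each chunk padded to a four-byte boundary. Here’s a minimal Python sketch of the encoding; the `/trombone/patch` address is purely illustrative, not the actual address scheme my app uses:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per OSC 1.0."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message with int32 ('i') and float32 ('f') arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        else:
            raise TypeError("only int and float are handled in this sketch")
    return osc_pad(address.encode()) + osc_pad(tags.encode()) + payload

# A hypothetical patch-change message from the app:
packet = osc_message("/trombone/patch", 3)
```

When libpd is in the loop, you rarely build packets by hand like this, but it helps to know what’s actually travelling over the network.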
If you’ve been living under a rock or you’ve somehow missed the Pure Data renaissance, libpd allows developers to embed a pd patch within an application. I can create all the MIDI/OSC and audio elements graphically within pd and embed that within an iOS app. All I need to do is make the user interface send messages to the pd patch (and back the other way as required). This significantly reduces the learning curve for creating music apps for iOS.
This brings us to my second iOS project. I’ve been designing a game that sends OSC messages based on in-game events. The user can set up the game levels and specify what OSC message is sent for each in-game event. For instance, if two objects collide, a note is played. Likewise, the position of an object on screen can be mapped to parameters of an effect. I don’t want to give away too many details on this one yet, so stay tuned (and let me know if you want to beta test).
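The idea of binding in-game events to user-specified OSC messages can be sketched as a simple lookup table; the event names and addresses below are hypothetical stand-ins, since the game’s details are under wraps:

```python
# Hypothetical event-to-OSC bindings of the kind described above: the level
# designer attaches an OSC address template to each in-game event.
bindings = {
    "collision": "/synth/note {note}",       # two objects collide -> play a note
    "position":  "/fx/delay/feedback {x}",   # object's x maps to an effect parameter
}

def on_event(event: str, **params) -> str:
    """Render the OSC message text for an in-game event."""
    return bindings[event].format(**params)

on_event("collision", note=60)
```

A real implementation would encode and send the message over UDP rather than return a string, but the mapping layer is the interesting part.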
My third iOS project sends and receives signals to and from the Arduino through the RedPark serial cable. I’m quite excited about what this makes possible: any sensor, and the whole world of physical computing over which the Arduino reigns, can be incorporated into an iOS app. If you’ve been paying attention, combining that with libpd allows for a very compelling array of possibilities. Here’s the picture: the Arduino handles the sensors and sends signals to your iDevice, which is used for display, input, networking, audio generation, playback, and DSP.
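To make the sensor side concrete: an ultrasonic rangefinder like the one I bought reports an echo time, which converts to distance because sound travels roughly 0.0343 cm per microsecond and the pulse covers the gap twice. The `D:<microseconds>` line format below is a made-up wire protocol for illustration, not what my Arduino sketch actually sends:

```python
def pulse_to_cm(pulse_us: float) -> float:
    """Echo time to distance: ~0.0343 cm/us, halved for the round trip."""
    return pulse_us * 0.0343 / 2

def parse_reading(line: bytes) -> float:
    """Parse a hypothetical 'D:<microseconds>' line sent over the serial cable."""
    key, value = line.decode().strip().split(":")
    if key != "D":
        raise ValueError("unexpected reading type: " + key)
    return pulse_to_cm(float(value))

parse_reading(b"D:1000\n")  # roughly 17 cm
```

On the iOS side, the same parsing logic would run on bytes delivered by the RedPark cable’s serial API.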
From the start, I intended to use an iTouch in my meta-trombone project. At first I thought I would use it as a heads-up display and as an input device to select patches using TouchOSC. It’s becoming clear that it will do a bit more work. My current approach is to use the Arduino to determine slide position and trombone notes. The iTouch will receive this information through the RedPark cable and send OSC signals to my MacBook based on the patch selected through the user interface of the software it is running. The iTouch will be mounted on the trombone close to my eyes. I will navigate the user interface with a footswitch, with a key/joystick in my left hand, and/or with its touch interface.
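Determining slide position from the rangefinder amounts to quantizing a distance into the trombone’s seven slide positions. A hedged sketch, with illustrative spacing and offset constants (real values would come from calibrating against the actual horn):

```python
# Assumed calibration constants, not measured values.
FIRST_POSITION_CM = 5.0     # distance reading at first position
POSITION_SPACING_CM = 8.0   # nominal gap between adjacent positions

def slide_position(distance_cm: float) -> int:
    """Quantize a rangefinder distance to slide positions 1-7."""
    raw = round((distance_cm - FIRST_POSITION_CM) / POSITION_SPACING_CM) + 1
    return max(1, min(7, raw))  # clamp noisy readings into range
```

In practice the spacing between positions grows as the slide extends, so a lookup table per position would likely beat a single linear constant.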
The iTouch will also display algorithmically generated notation based on the last couple of notes I played (and/or previously composed fragments). I find the idea of using notation interesting, since, for the meta-trombone, traditional notation will describe not only musical motives but parameter changes as well.
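As one trivially simple example of generating a continuation from the last couple of notes, here is a sketch that repeats the most recent interval; the actual algorithm behind the notation display is unspecified in this post, so treat this purely as a placeholder:

```python
def continuation(last_two, length=3):
    """Extend the last two MIDI notes by repeating their interval
    (an illustrative stand-in for the real generation algorithm)."""
    a, b = last_two
    interval = b - a
    return [b + interval * (i + 1) for i in range(length)]

continuation([60, 62])  # whole steps upward from D
```

The generated pitches would then be rendered as notation, alongside whatever symbols end up standing for parameter changes.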