A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.
-Robert A. Heinlein
Monday, October 29, 2012
Live Recording from Y2KX+2
Friday, October 26, 2012
Y2KX+2 Livelooping Festival
Last week I had the privilege to perform at the 12th annual Y2K Livelooping Festival in California. This festival is, by nature and design, as eclectic and wonderful as organizer Rick Walker. At times, it seemed performers shared nothing but an attentive audience and an interest in using the techno-musical wizardry of livelooping.
Among the 50+ excellent artists I had a chance to hear, there are a few that stood out for me:
- Luca Formentini (Italy) played a wonderful set of ambient guitar in San Jose and I really connected with his approach to improvised music.
- Emmanuel Reveneau (France) had an amazing set in Santa Cruz. For the second of his two sets at the festival, I felt that Emmanuel had soaked up a lot of whatever was in the Santa Cruz air that week and let it influence his music. His loop slicing was especially inspired... I can't wait for the release of the software he made with his computer-savvy partner.
- Hideki Nakanishi a.k.a Mandoman (Japan) gets an unbelievable sound out of a mandolin he built himself.
- John Connell only used an iPhone and a DL4 for his set. This minimalist approach really worked for him and it reminded me that the simple option is often the best option. I hope he'll check out the soon-to-be-released Audiobus app, as it will open up some possibilities for his music.
- Amy X Neuburg is one of my favourite loopers. I have an insatiable appetite for her unique combination of musicality and humour. Unfortunately I was setting up during her set and I couldn't give her music my full attention.
- Moe! Staiano played a great set for percussion instruments such as the electric guitar.
- Bill Walker played a laid back and masterful set of lap steel looping.
- Laurie Amat's birthday set (with Rick Walker) was simply the most appropriate way to end the festival.
- Shannon Hayden: Remember that name (you'll be hearing her music in your favourite TV shows soon enough).
The collegiality among the performers was a high point of my participation in this festival. I had the occasion to enjoy discussing the philosophical aspects of improvised experimental music with Luca, sharing notes on the business side of music with Shannon, listening to Laurie tell us about her collaboration with Max Mathews, witnessing technical demonstrations from Emmanuel, Bill and Rick, and listening to my housemate Paul Haslem practice on hammered dulcimer.
The Test of Performance
For me, the festival was an opportunity to put my meta-trombone project to the test of performance. As with any new performance system, there were both positive and negative points to these first two outings. Encouragingly, I was quite satisfied with the varied timbres I could produce with the meta-trombone. I also enjoyed the drone-like feel of some of the loops and I liked the hypnotic phasing I employed.
However, not everything went well. My software crashed midway through my performance in Santa Cruz and I was forced to restart it. Thankfully, this is something I had practiced and I was able to keep playing acoustically on the trombone while the software came back online. It did not take very long and many people told me they did not even notice the crash…
More problematically, as I listen to the recorded performances, I feel something is missing. I find the conceptual aspects of the meta-trombone quite stimulating; however, conceptually interesting music does not necessarily translate into good music (music people want to hear). I tend to get overly interested in the conceptual part, but now that the concepts are firmly in place, I need to focus on the music.
I talked it over with other performers: Emmanuel suggested I form a trio with a bassist and a drummer so that I could rely on them to anchor the narrative aspects; Luca thought I needed to think more about my transitions. Both suggestions will need to be explored as I continue work on the meta-trombone.
Next Steps
I'm currently editing the recordings of my two performances into accessible short 'songs' for easy consumption. While the meta-trombone still requires work, I feel that this point in its development is still worthy of documentation and I stand by the recordings I made in California.
One of the first things I want to develop further is the role of the notation metaphor in the meta-trombone. Currently, trombone performance is interpreted by the computer software and the notes that I play execute code (specifically, Mobius scripts). I would like to expand this by creating algorithms that send notation to be displayed on my iPod touch based on what notes were previously played. Since meta-trombone notes serve both as musical material and as control signals, the software will be able to suggest changes to either the music or the system states by displaying music notation. I already have a working app that displays music notation on iOS in real time through OSC and it is generating quite a bit of buzz. I'll have to integrate it into performance software for iOS that will ultimately replace TouchOSC, which I currently use as my heads-up display (see photo above).
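Since the notation app listens over OSC, it may help to see what actually goes over the wire. Here is a minimal sketch of an OSC message encoder in Python; the address `/notation/show` and its arguments are hypothetical examples, not my app's actual namespace.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message supporting int32 and string arguments."""
    type_tags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, int):
            type_tags += "i"
            payload += struct.pack(">i", arg)  # int32, big-endian
        elif isinstance(arg, str):
            type_tags += "s"
            payload += osc_pad(arg.encode())
        else:
            raise TypeError(f"unsupported OSC argument type: {type(arg)}")
    return osc_pad(address.encode()) + osc_pad(type_tags.encode()) + payload

# Hypothetical message: ask the display to show MIDI notes 60 and 64.
msg = osc_message("/notation/show", 60, 64)
```

A datagram like this can then be pushed to the iPod with a plain UDP socket (`socket.sendto`), which is all OSC-over-UDP amounts to.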
Another avenue for further exploration would be to diversify the computer code that can be executed by playing notes. I have a couple ideas for this and I think I will turn to Common Music to help implement them. Out of the box, Common Music can respond to a MIDI note-on message by executing a scheme procedure, so it will be easy to integrate into my existing system.
I'm also looking to perform more with the meta-trombone and I'm actively looking for playing opportunities. There's a possible gig in New York City in mid April (2013), so if anyone can help me find anything else around that time in NYC, it would make it a worthwhile trip.
Friday, July 27, 2012
Meta-Trombone Revisited
The recent release of version 2.0 of Mobius has spurred me to redesign my meta-trombone Bidule patch. Since I can run both the new and the old version in the same patch, using the two loopers together lets me eliminate my matrix mixer (and some of my most complex patching).
The first looper will be the one that is “played” by trombone notes. This is what I mean by playing the looper:
- trombone notes will trigger the loop playback from a position determined by the note value
- and/or trombone notes will change the playback rate relative to the note played
- and the amplitude of the loop will follow the trombone performance by using an envelope follower.
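The first two mappings above can be sketched numerically; these function names and the one-octave layout are my assumptions for illustration, not Mobius's API:

```python
# Assumes MIDI-style note numbers and a loop length in frames.
def note_to_position(note: int, loop_frames: int, base_note: int = 48) -> int:
    """Spread one octave of notes (base_note..base_note+11) across the loop."""
    step = (note - base_note) % 12
    return (loop_frames * step) // 12

def note_to_rate(note: int, base_note: int = 48) -> float:
    """Playback rate relative to a reference note, in equal-temperament steps."""
    return 2.0 ** ((note - base_note) / 12.0)
```

Playing a tritone above the base note, for instance, jumps playback to the loop's midpoint, while playing an octave up doubles the playback rate.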
I’ll have a second instance of Mobius down the line that will resample the output of the first looper in addition to (or in the absence of) any incoming audio. Effects will be applied after audio in, after the envelope follower and after the resampling looper. I’ve yet to determine exactly what those effects will be, but the success of my vocal set patch leads me to consider a rather minimalist approach.
Speaking of minimalism, I’ve been listening to a lot of Steve Reich these days and I’d like to incorporate some phasing pattern play into my set for my upcoming performance at this year’s Y2K festival. One way to quickly create some interesting phasing composition is to capture a loop to several tracks at once and then trim some of the tracks by a predetermined amount. This can be easily accomplished with a script and I’ve been toying with some ideas along those lines.
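The trim arithmetic behind such a script is simple enough to sketch (in Python rather than Mobius script; the numbers are illustrative):

```python
# Copy one loop to several tracks, then shorten each track by a different
# multiple of a base amount so they drift out of phase, Reich-style.
def trimmed_lengths(loop_frames: int, tracks: int, trim_frames: int) -> list[int]:
    """Track i is trimmed by i * trim_frames (track 0 keeps the full loop)."""
    return [loop_frames - i * trim_frames for i in range(tracks)]

lengths = trimmed_lengths(44100, 4, 441)  # four tracks, 10 ms steps at 44.1 kHz
```

With small trims like these, the tracks realign periodically, which is what gives the phasing its slowly shifting texture.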
Something else to which I’ve given some consideration is the development of midi effects to insert on the midi notes interpreted from the trombone performance. Some midi effects that would be easy to implement:
- midi note delay;
- arpeggiator;
- remapper (to specific key signature);
- transposer.
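Two of these really are easy; as a sketch (the C major table and the range clamping are my assumptions, not a spec):

```python
MAJOR = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of C major

def transpose(note: int, semitones: int) -> int:
    """Shift a note, clamped to the valid MIDI range 0-127."""
    return max(0, min(127, note + semitones))

def remap_to_scale(note: int, scale=MAJOR) -> int:
    """Push out-of-scale notes down to the nearest in-scale pitch below."""
    while note % 12 not in scale:
        note -= 1
    return note
```

The note delay and arpeggiator need a clock but are conceptually the same: pure transformations sitting between the pitch tracker and the looper.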
It will be interesting to see what impact these effects will have on the loop playback of the first looper. Another idea is to remap notes to parameter selection or note velocity to parameter value.
Another significant change is that I’ve acquired hardware to interpret midi notes from trombone performance. I’ve decided to go with the Sonuus i2M instead of my previously discussed approach, mainly because I was wasting too much time trying to make the ultrasonic sensor work properly. Bottom line: it wasn’t that interesting and I’d rather be playing music. My current plan is to use a contact microphone to feed audio to the i2M and to have a gate on the midi notes it generates in Bidule that I’ll activate with a footswitch.
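That gate reduces to a tiny stateful filter; this sketch is mine, not how Bidule implements it:

```python
# Note-ons from the pitch tracker pass only while the footswitch is held.
class NoteGate:
    def __init__(self):
        self.open = False

    def footswitch(self, pressed: bool):
        self.open = pressed

    def process(self, note_on):
        """Return the event if the gate is open, otherwise drop it."""
        return note_on if self.open else None

gate = NoteGate()
gate.footswitch(True)  # pedal down: notes flow through
```

A production version would also track held notes and emit matching note-offs when the gate closes, so nothing is left hanging.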
I’ll also be designing performance software for iOS, as I intend to attach an iPod touch to the trombone to serve as my heads-up display for various system states (updated wirelessly with OSC). I’ll be controlling the iPod with a Bluetooth page-turning footswitch. One pedal on the footswitch will change between different screens and the other pedal will activate an action available on that screen. For instance, on the notation screen, pressing the action pedal will display a new melodic line (either algorithmically generated or randomly chosen from previously composed fragments).
Now all I have to do is build it (and they will come… or more accurately, I will go to them).
Thursday, July 12, 2012
Bring a map
Before beginning this particular mapping, I had a vision I wanted to instantiate. I wanted a system that would allow me to quickly create complex and ever-evolving loops using only words and other vocal sounds. I also wanted to limit myself to musique concrète manipulations: loops, cutting and splicing sounds, delay, pitch shifting and reverb.
This is the audio flow I came up with:
Incoming audio is sent to the outputs and also split to four tracks on a multitrack looper. Before reaching the looper, each signal path goes through a pitch shifting effect. Each track then goes to its own post-looper effect: tracks 1 and 2 go to a delay while tracks 3 and 4 go to a reverb. Those two groups of tracks are mixed together and the result is sent to a crossfader that selects between these two sources. The output of the crossfader is mixed with the audio input and sent out.
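The crossfader stage can be sketched numerically. An equal-power curve is one common choice for blending two sources of program material (an assumption here, not necessarily what the patch uses):

```python
import math

def crossfade_gains(position: float) -> tuple[float, float]:
    """Return (gain_a, gain_b) for a 0..1 fader position so that perceived
    loudness stays roughly constant across the sweep."""
    position = max(0.0, min(1.0, position))
    return math.cos(position * math.pi / 2), math.sin(position * math.pi / 2)
```

At the midpoint both gains equal about 0.707, so the squared gains still sum to one, which is why the blend doesn't dip in level the way a linear crossfade does.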
My looper is Mobius. I could’ve used another looper for this project, but my familiarity with this software and ease of implementation won out over wanting to play with another looper (I’ve had my eye on Augustus for a while).
My pitch shifter is Pitchwheel. It’s a pretty interesting plugin that can be used on its own to create some interest in otherwise static loops. Here, I’m only using it to shift the incoming audio, so it’s a pretty straightforward scenario.
My reverb is Eos by Audio Damage. Do you know of a better sounding reverb that is also CPU friendly? I can’t think of any. My delay in this project is also by Audio Damage. I’m using their Discord3 effect that combines a nice delay with some pitch shifting and filtering with an LFO thrown in to modulate everything. This effect can really make things sound weird, but I’ll be using more subtle patches for this project.
To control all of this, I’ll be using my trusty Trigger Finger to control the looper and my Novation Nocturn to control the effects. Here’s what I decided to do for looper control:
Starting on the left, the faders control the volume of my tracks in Mobius. The pads and rotary dials on the right are grouped by column and correspond to tracks 1 to 4. Each button performs the same function, but on a different track. The bottom row of pads calls the sustain substitute function on a given track. The row immediately above it does the same thing, but also turns off the incoming audio, so it acts as my erase button (with secondary feedback determining how much the sounds will be attenuated). The next row up sends the track playing backwards for as long as the button is pressed, and the final row of buttons mutes a given track. The first rotary dial controls the playback rate of a given track and the top one controls its secondary feedback setting.
To control the effects, this is the mapping I came up with for the Nocturn:
The crossfader is, of course, used to crossfade between the two track groups. After that, each track has two knobs: one controls the amount of pitch shift applied to audio coming into the track and the other controls the wet/dry parameter of the track’s post-looper effect. The pads on the bottom select different plugin patches, except for the last one on the right, which is used to reset everything and prepare for performance. Among other things, it creates an empty loop of a specified length in Mobius, which is needed before I can begin using the sustain substitute function. Essentially, I’ll be replacing the silence of the original loop with the incoming audio.
One thing I won’t be doing is tweaking knobs and controlling every single parameter of my plugins. I’ll rely on a few well-chosen and specifically created patches instead. Also, keeping the effects parameters static can be an interesting performance strategy. When I heard Mia Zabelka perform on violin and body sounds last year at the FIMAV, one thing that struck me was that she played her entire set through a delay effect without once modifying any of its parameters. The same delay time and the same feedback throughout. For me, this created a sense of a world in which her sounds existed or a canvas on which her work lived. It’s like she changed a part of the physical reality of the world and it made it easier to be involved in her performance because I could predict what would happen. Just as I can instinctively predict the movement of a bouncing ball in my everyday universe, I became able to predict the movements of sound within the universe she created for us with her performance.
Here's a recording I made tonight by fooling around with this setup:
Tuesday, May 29, 2012
New Album Release: sans jamais ni demain
the longing for repetition
“Happiness is the longing for repetition.”
—Milan Kundera

This is a song I made for CT-One Minute. All sounds are derived from a 10-second bass clarinet phrase sample that can be downloaded freely from the Philharmonia Orchestra's website. The sample was played back at various playback rates, forward and backward, through various envelopes using the Samplewiz sampler on my iPod. This performance was recorded in one take with all looping and effects done in Samplewiz. No further editing or effects except for copying and pasting the beginning at the end to bring closure to the piece.
I approached Samplewiz as a livelooper since, in "note hold" mode, every note on the keyboard can be seen as a track on a multi-track looper (each with a different playback rate). For this piece, I used the forward and backward loop settings in the wave window, so things start to sound a bit different. I added some delay and messed with the envelope and it started to sound nice. Once I had a good bed of asynchronous loops, I left "note hold" by tapping rather than swiping the control box (this kept the held notes). I then changed the settings around and played over the loops without "overdubbing".
Samplewiz is quite powerful... You can also change the loop start and end points in between notes to add variety, without affecting the notes that are already being held.
tutus de chemin
This is the soundtrack for a short film I made in a weekend with my wife. I started with a vocal recording of my wife that I sent through Paul's Extreme Sound Stretch. The resulting audio file was played back as a loop in Bidule. I sent the audio to a pitch shifting plug-in (I believe I was using PitchWheel at the time), then to a midi gate group and finally to the Mobius looper. I performed the sounds two-handed on my Trigger Finger: one hand controlled a fader assigned to pitch shifting and the other triggered pads to control the midi gate (the note envelope) and various functions in Mobius.
Three of a kind
This piece started out as an assignment for a course in Electroacoustic composition I took at SFU a few years ago. The sound source was a homemade instrument, but everything was mangled and cut-up. This piece features heavy use of the short loop fabrication technique familiar to readers of this blog. I used Acid to assemble everything and add some effect automation throughout the piece.
le train
This is the soundtrack to a short animation film I made last year. I used Soundgrain to isolate parts of the original sound's spectrum and used that software to create loops that I mixed while recording. I think this musical cutup is well-matched with the visual cutup it was meant to accompany.
Game music
This song was made using my soon-to-be-released iOS app: BreakOSC! This app is a game that sends OSC messages based on in-game events. In this case, when the ball hit blue and green bricks, Bidule triggered four instances of iZotope's Iris. The paddle served as a crossfader and mixed all those sounds together. The results were sent to a generous amount of reverb, courtesy of Audio Damage's Eos.
sans jamais ni demain
Another composition I made for the aforementioned course in electroacoustic composition at SFU. The only sound source for this piece is a recording of myself reading an old poem I wrote in high school. The slow-moving textures were made by isolating parts of those words, slowing them down and layering them over each other to create very long notes of varying pitch that fade in and out over time. The more rhythmic stuff I made using a now familiar technique.
July 8 2011
This piece is a recording of a live (from my basement) performance of what will one day become my meta-trombone. A short loop is created at the top (what is heard twice in the beginning) and then altered in different ways determined by trombone performance.
Twice through the looking glass
This song was also made for CT-One Minute, using the exact same sound source as the longing for repetition. This time, however, I used Iris to change the character of the sound and created two different sound patches. I made two recordings with each of these patches by triggering the sounds with my new Pulse controller. My three-month-old daughter also took part by adding her own surface-hitting contributions, making this our first father-daughter collaboration. Once I had made these two recordings, I brought them into Bidule and placed them into audio file players. The amplitude of each player's output was controlled via faders on my Trigger Finger and the result was recorded to file.
Wednesday, May 2, 2012
Displaying musical notation in iOS
In case you're wondering, the easiest way I've found to display programmatically generated musical notation on the iPhone is with VexFlow. It's a JavaScript library, so I have to put it in a UIWebView object through an HTML document that loads all the relevant files. To call the JavaScript functions, I send a stringByEvaluatingJavaScriptFromString: message to the UIWebView object. It all works very well, so that takes care of the uninteresting part of that project… now I get to learn all I can about algorithmic composition!
Wednesday, April 18, 2012
News from the trenches of augmented instrument design
Friday, October 14, 2011
Fall odds and ends
Saturday, July 9, 2011
Meta-trombone - part II
Wednesday, June 8, 2011
Playing the loop
!name trigger4
Variable newFrame loopFrames / 16 * 3
move newFrame
end
Monday, March 28, 2011
My greatest hit
One can argue that existentialism has become the common sense of this century. After all, it does seem obvious that you are defined by your actions, not by your intentions, and that the disconnect between what you want to be and what you are, or between what you want the world to be and what it is, can bring about all the agony familiar to those in the throes of an existential moment.
What I find particularly interesting are the demographics of the viewership. While males of all ages make up most of my viewers (68%), it is interesting to note that most of the female viewership consists of teenage girls. I really don't know what to make of this, but I find it interesting nonetheless.
A couple key findings, when I compare the popularity of this video with my other youtube offerings:
- build it and they will come (even if you don't tell them about it);
- English is the language of the web (French is mostly ignored);
- if the content is interesting, people will overlook technical flaws in presentation;
- folks like their animation;
- most referrals come from YouTube related videos.
Saturday, February 5, 2011
Sampling and sound design with Mobius
Start with a short loop
Interesting, but since the repetitions aren’t fast enough, it doesn’t sound like a musical note. There are two ways to make the repetitions faster. We could increase the Rate of playback, which increases the speed at which the loop is played, or we could make the loop even shorter, thus increasing the frequency of repetitions. There are many ways of doing this, but my favourite so far makes use of the loop windowing script Jeff Larson put together. Activating this script selects a single subcycle to play back. Using the script again selects another subcycle. In this way, a single loop can yield many different sounds. Here’s how things sound at this point using both of these options on the above loop.
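The arithmetic is the same for both options: a loop repeating N times per second reads as a roughly N Hz tone once repetitions cross into the audio range (around 20 Hz). A small sketch, with illustrative sample rate and loop lengths:

```python
def loop_pitch_hz(loop_frames: int, sample_rate: int = 44100,
                  rate: float = 1.0) -> float:
    """Repetition frequency of a loop played at the given rate multiplier."""
    return rate * sample_rate / loop_frames

# Two ways to raise a 1764-frame loop (25 Hz) by an octave:
pitch = loop_pitch_hz(1764)                      # just audible as a tone
doubled_by_rate = loop_pitch_hz(1764, rate=2.0)  # play twice as fast
doubled_by_trim = loop_pitch_hz(882)             # or halve the loop length
```

The audible difference is that rate changes also transpose the loop's content, while windowing down to a shorter slice keeps the playback rate and changes the waveform instead.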
Sound design
Sound examples
Example 1: Loopwindowed -> Discord3 -> Chopitch -> UpStereo
Example 3: Loopwindowed -> Rate+24 -> Discord3 -> Chopitch -> UpStereo
Example 4: Loopwindowed -> Rate+24 -> autoReverse -> Discord3 -> Chopitch -> UpStereo
Example 5: RateShifted -> Discord3 -> Chopitch -> UpStereo
Friday, January 28, 2011
Computational Epiphanies
This may appear trivial at first glance. After all, a great many people use computers daily to do a lot of different things. What I find interesting about this insight is that it can help me overcome one lingering problem with my refusal to specialize: the impossibility of dedicating enough time to completely master any subject. It's not that I'm lazy; there's simply not enough time in a day to do it all at once.
However, I've previously learned that working on general skills (reading, writing, mathematics, logic) can yield benefits in all areas of interest. It's becoming clear to me that becoming a better computer user would make me a better composer or writer (or comic book artist or video filmmaker). This is why I've decided to dedicate a large part of this year's free time to learning the fundamentals of computer programming and software engineering.
I know some may be scratching their heads at this point, wondering how computer programming will make me a better musician/composer, comic book artist, writer, or filmmaker. Well, I'll try to explain...
Computers and music
This should be an obvious one given my current musical obsessions. More and more, I'm thinking about how to use my computer in novel ways to create interesting music, both in performance and in the studio. Lately, working with Max/MSP or Bidule, I've often wished I were better at coding so I could create plugins or software to help me realize my musical vision. And don't get me started on live coding (man, that looks like fun; look at the video below). To help me get there, I've started working through a wonderful book I recently purchased: The Audio Programming Book.

Algorithms are Thoughts, Chainsaws are Tools from Stephen Ramsay on Vimeo.
Computers and visual art
Computers and writing
I would prefer to adopt an approach similar to how we think of musical filters and effects in music production. An input text would be effected by a program following a set of instructions before displaying the result. I've recently become aware of the work of Daniel Howe whose RiTa toolkit for Java (as presented in his doctoral dissertation) aims to give writers the resources to harness the power of computation to produce prose and verse. There's also the folks at the Expressive Intelligence Studio at UCSC and the group running Grand Text Auto that bear watching.
Computers and computers
For all these reasons, I'm very excited about learning to code. In fact, I've already started by following introductory lectures in Java programming from Stanford University. This class has been great so far and I quite enjoyed coding my own version of Breakout and some other fun games. I'm grateful to Stanford for making these courses freely available and I intend to go through all the lectures for the introductory classes and, in time, some of the more advanced stuff as well. I've also signed up for a certificate in information technology at l'Université du Québec en Outaouais and I'm working through the math pre-requisite this semester.

Thursday, December 16, 2010
Guitar controller part IV
Yet another instalment in my guitar controller saga… technically, this project is done. Everything works as it should; however, there’s still some cosmetic work left to be done. You’ll notice it is still unpainted and the top part of the controller’s body is held in place with an elastic band… not optimal.
But it does work and here’s proof:
This controller is nothing by itself but a bunch of buttons. It only sends out MIDI notes and to achieve anything beyond that requires some programming. For this, I once again turned to Plogue’s Bidule.
In the patch I made, the neck buttons were divided into three groups that randomly select audio files for playback from three different pools: subjects, verbs and complements, as spoken by my wife. That gives four buttons per grammatical group, and I decided to exploit that by assigning a different playback speed to each of these buttons.
The three central buttons on the guitar controller’s body trigger playback of the selected sound file. Pressing a neck button by itself only selects an audio file and does not produce a sound until one of the trigger buttons is pressed. The first two trigger playback from the beginning of the file and the third button triggers in reverse from the end. One thing I’d like to try is setting one of the first two buttons to pick a random start position… it might introduce some interesting possibilities.
This is how I implemented the guitar analogy for this patch. Pressing neck buttons is analogous to fretting a string (select audio file), while pressing trigger buttons is akin to picking notes. Together, these two actions produce a given sound.
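Put together, the patch logic might be sketched like this; the pools, phrases and speed values are illustrative stand-ins for the actual recordings, not what the Bidule patch contains:

```python
import random

POOLS = {
    "subjects": ["le chat", "la pluie", "mon voisin"],
    "verbs": ["mange", "tombe", "chante"],
    "complements": ["la nuit", "trop vite", "sans bruit"],
}
SPEEDS = [0.5, 1.0, 1.5, 2.0]  # one playback speed per button in a group

def press_neck_button(pool_name: str, button: int):
    """'Fretting': select a random file from the pool, tagged with that
    button's playback speed. No sound yet."""
    return random.choice(POOLS[pool_name]), SPEEDS[button]

def press_trigger(selection, button: int):
    """'Picking': buttons 0-1 play forward from the start of the file;
    button 2 plays in reverse from the end."""
    sound, speed = selection
    direction = "reverse" if button == 2 else "forward"
    return {"sound": sound, "speed": speed, "direction": direction}
```

Separating selection from triggering is what makes the analogy work: like fretting a string, a neck press is silent until a trigger press "picks" it.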
The sounds are fed to the Mobius looper, which I control with my feet through my Trigger Finger.
Friday, November 19, 2010
Guitar controller – Part III
A quick update on the guitar controller to say that this project is going quite well. I’ve decided to abandon the hacked USB keyboard approach since I couldn’t fix the latency issue. I bought a MIDI CPU from Highly Liquid and I’ve no regrets. This will give me a more stable controller down the road.
I also bought some low-profile arcade buttons, since I was unsatisfied with all the other SPST switches I’ve come across. These switches have amazing response, require little effort to activate and jump back to open in a flash.
Next: Wiring all buttons on the neck, installing buttons on the guitar body and finding a tilt switch.
Monday, November 1, 2010
Dr. Sketchy
Last week, Carole and I attended the first anniversary of the Ottawa branch of Dr. Sketchy’s Anti-Art School.
Dr. Sketchy was founded in Brooklyn by an art school dropout, and branches have been established all over the world. Dr. Sketchy invites artists and amateurs alike to draw the human form. Models are usually burlesque performers whose elaborate costumes certainly provide a break from drawing naked people. I can certainly use the life drawing practice…
Models took poses for progressively longer times, from 30 seconds to 20 minutes (which, for me, is either too short or too long). Here are some of my more deserving efforts… I’ll keep the rest to myself.
I tried to make a more finished drawing from that clown sketch above:
I’ll try to make it back to the anti-art school next month; after all, the tuition is pretty cheap and I can use the practice.
Monday, October 11, 2010
Des chemins et des ruines
Last week I finished my second contribution to Les Fumettos du Cyclope (300 pages of cutting edge graphic narrative to be published in mid-December by Trip). Both of my stories for this book explore philosophical themes and I’m considering writing a few more like these and publishing them together as my philosophical comics.
My first contribution, À lire en cas de crise existentielle, is a first-aid kit for those suffering from an existential moment. It is loosely based on an earlier short animation film that became my biggest hit on YouTube.
My second contribution, Des chemins et des ruines, is also based on an early short film. In it, I draw parallels between Socrates’ acceptance of his death sentence to remain faithful to his principles and the dilemma a young divorcée faces when she can’t reconcile her vow to live with her husband “until death” with her present situation. This comic asks the big question: must we die for our principles?
My personal answer will always be an echo of Georges Brassens: “Mourons pour des idées, d'accord, mais de mort lente” (let us die for ideas, agreed, but of a slow death).
Here are a few images from the finished story and their corresponding thumbnail sketches. The thumbnail sketches are quite small compared to the finished pages (about an inch high); I do them quickly while I’m writing the script to help me visualize how the page will look and to know if I’ve put too much on a given page.
I inked these pages on the computer using Manga Studio and Photoshop in a traditional ligne claire style. Instead of colouring them, I scanned different fabrics and created a digital collage with them. Originally, the line drawing was supposed to give the outline to make that fabric collage and I wanted to remove the line drawing after it had served its purpose. However, I decided to keep them in the interest of legibility.
The tarot cards are small paintings on canvas paper I did with acrylic, gouache and ink.
By all accounts, Les Fumettos du Cyclope will be an amazing art book and I’m thrilled to be a part of it. I look forward to the book launch party at Le Cheval Blanc on December 18. See you there!