At the start of September this year I applied for a new job at the University of Sydney. After a Zoom interview and a coding exercise, I found out at the end of the month that I'd got the job, but I didn't get a letter of offer for a couple of weeks after that. So I started October in a strange and anxious in-between state - unhappy with my old job, but not yet certain that I had a new one - and I badly needed something to distract myself with.

I noticed that a few of the people I follow on Mastodon were doing something called Looptober and posting a short track every day. I was a couple of days late, and had no idea if I'd be able to do something worthwhile for every day, but it was just the thing I needed to keep myself occupied and happy while I was waiting.

It also made me get my SuperCollider codebase into better shape. SuperCollider is an open-source music platform which I'd started teaching myself in the 2020 lockdown, as one of those COVID creativity projects. I'd found that if I set aside two or three hours to make music with it, most of the time would get taken up with coding and I wouldn't end up creating or recording any actual music. Both coding and playing music get me into the sort of flow state I really enjoy, but the two states seem to be very different, and it's hard for me to switch gears from one to the other.

The good thing about having a creative goal every day is that it forced me to streamline things so that I could sit down at the laptop, plug in the MIDI controller and headphones, and start recording with a minimum of fiddling around. It's made me feel much more at home in SuperCollider, and I've built a setup in which I can bounce between the laptop and the MIDI controller without getting too much whiplash.

Recording a track every day is also a good way to head off perfectionism. I'm pretty happy with the end results, although I now know that I know very little about mastering and the levels are all over the place.

Here is the SoundCloud playlist: Looptober 2021

I found out that I got the new job in the last few days of the month, so I've been busy coming up to speed with the new team and haven't had a chance to look at this code until now. Although it's quite usable, it's also a mess: it has the feeling of a room where someone's been very busy for a month and then suddenly packed to go on an overseas trip. I tried to make sure that I saved a file for each day's session so that I could revisit the best tracks and work them up into something more professional, but that's work for another day.

The rest of this post is a walk-through of a tidied-up, idealised version of one of my looptober scripts, with some explanations of basic SuperCollider concepts and how the setup fits together. The full codebase, warts and all, is on GitHub: Looptober 2021

sclang

sclang is SuperCollider's programming language. It's very much from the Smalltalk family of languages: everything, including numbers and strings, is an object, almost everything can be invoked as a message on an object, and it encourages interactive and explorative coding. I find myself falling into what I think of as an old-fashioned and somewhat unprofessional style of coding, where I set up a lot of global variables for synths and buffers and then link them to controllers. This is fine for patching together an environment to play music, but not really sustainable for making reusable code. I've made a bit of progress on better-designed libraries for interacting with touch-screen devices and MIDI controllers, but this post doesn't cover those.

The basic abstractions in SuperCollider are Synths and Busses. In sclang the classes that represent them are capitalised, so when I write "Synth" I'm talking about the sclang thing, and when I write "synth" I mean the musical instrument. A Synth emits a signal - this can be either an audio or a control signal, and can have one or more channels - and a Bus relays a signal between Synths, or from an input or to an output device. Synths can take audio or control signals as inputs, which is how filters and other effects work.

In terms of traditional music gear, a Synth doesn't really correspond to a (keyboard) synth: it's more like a note, so when you press a key on a MIDI controller or keyboard, SuperCollider creates a Synth, which generates the signal for that note until it ends (either by decaying on its own, for percussive sounds, or after you release the key). Some Synths are transient, but some are not: the effects chain is a bunch of Synths, each of which stays running.

In programming terms, and despite the fact that I called it a "class" up there, it's useful to think of a Synth as a function which takes one or more parameters and emits a signal.
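
For example, assuming a hypothetical \ping SynthDef with a gated envelope (just to show the shape of the calls):

x = Synth(\ping, [\freq, 440, \amp, 0.3]); // "pressing the key" creates the Synth
x.release; // "releasing the key" - works because \ping's envelope has a gate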

Setting up

SuperCollider has a client-server architecture. The server does the CPU-intensive work of generating the signals and music, and the client is an IDE in which the interpreted sclang code runs. When I talk about 'the client' I'll generally be referring to some code running in the IDE, but that code could be a one-liner I've just executed, a sequencer of some kind, or something triggered by me playing a MIDI controller. From the server's point of view, all these look the same.

After you start the IDE, you have to boot the server (which is represented by the global s). The line Server.killAll is there because if there's already a server running, which happens sometimes, s.boot will give an error.

After that comes the setup of some basic scheduling parameters: BPM, the number of beats in a bar, and the length of a bar in seconds. This illustrates two points of sclang syntax I find a bit perverse: global variable names with more than one letter need the ~ prefix (which makes them environment variables; single letters like s are reserved as interpreter globals), and a multiline block needs to be wrapped in round parentheses to be executed as a unit.

sclang's paradigm is that of a REPL which is somehow running in an editor window: you hit ⌘-return to execute a line of code, or a block of lines enclosed in parentheses.

Server.killAll;
s.boot;

(
~bpm = 116;
~beatsperbar = 4;
~buflength = ~beatsperbar * 60 / ~bpm;
)

Getting the setup working involves going through the script and executing it bit by bit, which sounds clunky, but it allowed me to tweak things as I went, usually basing one day's script on the last. The code which I didn't want to tinker with was stored in its own files.

Effects

The first of these is a basic effects chain, which is loaded sclang-style by calling the loadRelative method on a string representing the file's relative path. This is followed by setting a bunch of parameters on the different effects Synths.

("./effects.scd").loadRelative();

(
~delay.set(\amp, 0.0);
~delay.set(\decaytime, 1.5);
~reverb.set(\amp, 0.8);
~reverb.set(\room, 2.4);
~reverb.set(\damp, 0.5);
~reverb.set(\reverbmix, 0.5);
~mixer.set(\amp, 0.3);
~grains.set(\out, ~fxb);
~grains.set(\speed, ~bpm / 240);
)

Before a Synth can be used, it needs to be defined with a SynthDef, which is essentially a binding between an sclang symbol - the name of the SynthDef - and a function containing the code that takes parameters and input signals and produces an output. The SynthDef is compiled and passed to the server. After that, the client can ask the server to create and play Synths based on the definition, which it can do very efficiently, as it doesn't need to compile each one in turn.

This post won't go into the internals of Synths, as I want to talk more about what goes on outside them, but it's worth having a look at one of the effects SynthDefs. This defines an LFO synth and binds it to the symbol \lfo. It then immediately calls play on the SynthDef, and assigns the resulting Synth to a variable ~lfo1.

~lfo1 = SynthDef(
    \lfo, {
        arg out, freq=0.4, freqlo=0.01, freqhi=20, amp=0;
        var mfreq = freq.linexp(0, 1, freqlo, freqhi);
        Out.kr(out, SinOsc.kr(mfreq, 0, amp));
    }
).play(s, [ \out, ~lfob ], \addToTail);

This is a control Synth - it generates a low frequency sinewave which is used to modulate the filter Synth. The idiom of defining a SynthDef and then immediately creating an instance with play is common for Synths which are meant to stick around as part of a signal chain.

The Synth's arguments are out, freq, freqlo, freqhi and amp. out is the Bus to which the control signal is sent, ~lfob.

Note the \addToTail symbol which gets passed to the play method: this is a huge SC gotcha. Synths on the server have an order, from first to last, which determines how signals flow through them, so filters must be downstream of the Synths which they are processing. By default, new Synths are added to the front of this list, which, for filters, is almost always not what you want. Failing to remember this doesn't throw an error, it just leads to maddening silence from your effects chain, which is patiently waiting for an input from something which is being computed after it runs.
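
The same applies when creating Synths directly. A sketch (the names and arguments here are illustrative, not from my actual scripts):

~mySaw = Synth(\saw, [\out, ~filterb], s, \addToHead);                // sound source at the head
~myFilter = Synth(\filter, [\in, ~filterb, \out, 0], s, \addToTail); // filter downstream of it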

Synths

("./synths.scd").loadRelative();

The next file is the musical synths - an elementary drum kit, an FM synthesiser, some square and saw waves, and some interesting resonators. I didn't want to spend too much time on audio synthesis, as I find that it takes me ages, so for Looptober I mostly grabbed Synths I'd already developed.

This file shows the other idiom for SynthDefs, where they are sent to the server with the add method, and don't get instantiated until they are played. For example, a very basic hi-hat:

SynthDef(\hihat,
    {
        arg out=0, amp=1, pan=0, filter=1000, atk=0.01, rel=0.1;
        var sig, env;
        env = EnvGen.kr(Env.perc(atk, rel, amp), doneAction: Done.freeSelf);
        sig = HPF.ar(WhiteNoise.ar(), filter);
        Out.ar(out, Pan2.ar(sig * env, pan));
    }
).add;
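
Once the SynthDef has been added, a Synth can be created from it at any time with a one-liner:

Synth(\hihat, [\filter, 6000, \rel, 0.05]); // one tick; frees itself via the envelope's doneAction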

MIDI knobs

SuperCollider comes with a set of classes for responding to MIDI inputs. midiknobs.scd creates a bank of control Bus objects in an array called ~knobs, and then assigns a function to listen to the eight knobs on my controller and set the ~knobs Busses when events come in.
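
I won't reproduce the file here, but a minimal sketch of the idea looks something like this (the CC numbers are an assumption about the LaunchKey, not the real midiknobs.scd):

(
// Assumes MIDIClient.init and MIDIIn.connectAll have already been run
~knobs = 8.collect { Bus.control(s, 1) };
MIDIdef.cc(\knobs, { |val, num|
    var i = num - 21; // LaunchKey knobs commonly send CC 21-28 - an assumption
    if ((i >= 0) and: { i < 8 }) { ~knobs[i].set(val / 127) };
});
)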

The block of mappings below shows the map method which, instead of setting a parameter on a Synth, binds it to the signal from a control Bus. So, for example, mapping the signal from ~knobs[2] to the \freq parameter of the Synth ~filter allowed me to tweak the filter with the knob in real time.

("./midiknobs.scd").loadRelative();

(
~bufrecorder.map(\mix, ~knobs[0]);
~reverb.map(\reverbmix, ~knobs[1]);
~filter.map(\freq, ~knobs[2]);
~lfo1.map(\freq, ~knobs[3]);
~lfo1.map(\amp, ~knobs[4]);
~grains.map(\blur, ~knobs[7]);
)

~grains is part of a setup I use for doing weird things with my acoustic guitar: there's a ~bufrecorder Synth which takes the audio input from a Scarlett A to D box and records it to a buffer, and then ~grains is a granular synth which plays back the guitar audio at different speeds and trigger rates. I spent a lot of time hacking it so that the granulator buffer played in time with the sequencer, and then didn't use it for many of the tracks, but it's good that I got it working.
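
The recording half of that setup is roughly this shape (a sketch, not the real ~bufrecorder, which also syncs to the bar length):

(
// Record the audio input into a one-bar loop buffer
~guitarBuf = Buffer.alloc(s, (s.sampleRate * ~buflength).asInteger, 1);
SynthDef(\bufrec, {
    arg in = 0, buf, mix = 0;
    // mix controls how much of the previous loop survives each pass
    RecordBuf.ar(SoundIn.ar(in), buf, recLevel: 1, preLevel: mix, loop: 1);
}).add;
)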

Sequencing

SuperCollider has a fairly sophisticated patterns library for building complicated sequences, but I've found that I can't write music quickly with it. What I really wanted was to be able to tap out rhythms on my LaunchKey, and have some SuperCollider code record the MIDI events, save them, and play them back, without any rhythm quantisation or fancy stuff.

So here's how my basic sequencer works. A note is a fairly simple data structure (one possible shape is sketched in code after this list):

  • the pitch
  • the volume
  • the time the note starts
  • the time the note ends
  • the name of the Synth which is being played
  • the other parameters, which depend on which Synth is being played
  • a label, which is there so that I can perform actions on groups of notes
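
An sclang Event is a convenient bag for this sort of thing. The field names below are my guess at a plausible shape, not the exact ones in sequencer.scd:

(
~exampleNote = (
    synth: \hihat,    // name of the SynthDef to play
    pitch: 60,        // MIDI note number
    vol: 0.8,         // velocity scaled to 0-1
    start: 3.25,      // beat on which the note starts
    end: 3.5,         // beat on which it ends (for Synths that need releasing)
    label: \drums,    // lets me act on groups of notes at once
    params: [\filter, 4000, \pan, -0.1] // extra Synth parameters
);
)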

The sequencer does two things: it builds a list called ~notes based on incoming MIDI events, and then plays them back. SC provides a class called TempoClock, which gets instantiated using the BPM settings defined at the top of the script and is used to record the timings of the notes as they are played. A loop which runs once per bar schedules an event on the TempoClock for each note to be played, and another for its release (if the Synth needs one).
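
In miniature, and assuming the note shape sketched above, the playback half looks something like this (the real sequencer.scd does more bookkeeping):

(
~clock = TempoClock(~bpm / 60); // TempoClock takes a tempo in beats per second
~playBar = {
    ~notes.do { |n|
        ~clock.sched(n[\start], {
            var syn = Synth(n[\synth],
                [\freq, n[\pitch].midicps, \amp, n[\vol]] ++ n[\params]);
            // schedule the release for gated Synths (percussive ones free themselves)
            ~clock.sched(n[\end] - n[\start], { syn.release; nil });
            nil // returning nil stops the clock rescheduling this function
        });
    };
    ~beatsperbar // a numeric return reschedules ~playBar one bar later
};
~clock.sched(0, ~playBar);
)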

There's a separate file, midikeys.scd, which has the MIDI event bindings, and which also handles playing (and releasing) the notes as they are pressed.

(
("./sequencer.scd").loadRelative();
("./midikeys.scd").loadRelative();
)

Most of the coding I did for Looptober was getting the finicky logic of recording the notes right, especially for those Synths which need to be released.
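
The core of the recording logic is pairing up noteOn and noteOff events. Schematically (all the names here are my guesses, not the real midikeys.scd):

(
// Assumes ~notes has been initialised to an empty collection
~held = IdentityDictionary.new;
MIDIdef.noteOn(\recOn, { |vel, num|
    ~held[num] = [~clock.beats, vel]; // remember when and how hard the key went down
});
MIDIdef.noteOff(\recOff, { |vel, num|
    var down = ~held.removeAt(num);
    if (down.notNil) {
        ~notes = ~notes.add((pitch: num, vol: down[1] / 127,
            start: down[0], end: ~clock.beats));
    };
});
)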

The fun part of the sequencer is how I got it to map MIDI key events to sounds. This is done with a data structure called ~insts, which defines a set of instruments mapped to ranges of pads or keys on the LaunchKey. An instrument has three things:

  • the Synth it's going to play
  • a range of keys it will be played from
  • an sclang function which gets evaluated when you press a key, and returns a note data structure as defined above

Although it's an object-oriented language, sclang's syntax for defining a function is nice and concise: for example, the following adds its two arguments:

{ | x, y | x + y }

Here's an example of an ~insts which binds some drums to the pads and an FM synth to the keyboard:

(
~insts = [
    [ \kick,  4,  { |i| [
        \out, ~filterb,
        \hi, i * 1000 + 100,
        \lo, 60, \noise, 0.7,
        \rel, 0.5,
        \pan, 0.4.rand - 0.2 
        ] } ],
    [ \snare, 8,  { |i| [
        \out, ~filterb, 
        \hi, (i - 4) * 1000 + 1000,
        \cfreq, 13000,
        \rel, 0.1,
        \pan, (i - 5.5) * 0.2 
        ]  }  ],
    [ \hihat, 12, { |i| [
        \out, ~filterb,
        \filter, (i - 8) * 2000 + 2000,
        \rel, 0.1,
        \pan, -0.1
        ] } ],
    [ \fm_basic, 41, { |i| [
        \freq, ~scale.degreeToFreq(i - 12, 110, 0),
        \rel, 2,
        \atk, 1.5,
        \pan, (i - 16.5) * 0.1,
        \mRatio, 1 + (~knobs[6].getSynchronous * 10),
        \cRatio, 1.6,
        \out, ~filterb
    ] } ]
];
)

The i parameter to each function is the MIDI note value from the controller: in this version of the code, the velocity (i.e. volume) gets taken care of by the sequencer. The final version is a bit more functionally pure and has the velocity passed to the function as a second parameter.

Note that the FM synth is getting its \mRatio parameter from one of the controller knobs. Around the middle of October I decided that I wanted this code to be even more functional, and that the ~insts functions should be run every time a note is played by the sequencer, rather than just once when I pressed the key. I got it working this way - you can see an example in looptober 18 - and it enabled me to do fun things like modulate the pitch of a drone with a sequencer pattern, as in looptober 19. But I ended up changing it back, for two reasons.

First, the ~insts structure got too verbose when every function had to contain a full Synth call. Second, and more importantly, the first version, by capturing the parameters for a note when it was played, had the nice side effect of freezing the settings of any control knobs at that moment and playing those values back when the sequence repeated. By contrast, the second version played notes back with the control-knob settings at playback time, rather than capture time.
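
The difference in miniature: reading a knob while building the parameter Array freezes its value, while wrapping the read in a function defers it until the function is evaluated at playback:

~capturedArgs = [\freq, 440 * (1 + ~knobs[2].getSynchronous)];     // knob value frozen now
~deferredArgs = { [\freq, 440 * (1 + ~knobs[2].getSynchronous)] }; // knob value read later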

Both of these are musically useful - it's the difference between recording a knob tweaking and making that performance part of the loop, and having a sequence play back and then tweak the effects over the top - and a future version of the code should allow me to choose either approach, especially because the eval-on-playback style allows notes to do anything they want to other running Synths.

I appreciate that what would usually be a subtlety of coding style - where and when a closure gets evaluated - is reflected so clearly in how all this works as a musical instrument.

Another nice and unintended consequence of the overall design is that once I got confident with it, I could build up tracks by redefining the ~insts data and recording new patterns over the top of existing ones. Once notes are recorded with the sequencer, they don't depend on the contents of ~insts. This is as close to live-coding with SC as I've ever got, and is much more intuitive than building Patterns.

To-do

I wrote a simple function to write the contents of ~notes out to a file, so that patterns could be saved and then reloaded, but left it in a somewhat broken state when I added output buffers as a parameter in the ~insts functions, so that needs work.

There's a very basic function called metronome which populates the ~notes list with a regular beat, but it would be possible to take this much further and build up sequences and patterns programmatically.
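
In outline, and reusing the note shape guessed at earlier, it could look like this (a sketch, not the metronome in the repo):

(
~metronome = { |synth = \hihat, beats = 4|
    beats.do { |i|
        ~notes = ~notes.add((synth: synth, pitch: 60, vol: 0.5,
            start: i, end: i + 0.1, label: \metro, params: []));
    };
};
)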

The next feature I want to add is beat quantisation, which TempoClock should make fairly simple.
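
The snapping itself should be tiny - something like this - with the fiddly part being where to apply it in the recording logic:

~quantise = { |beat, grid = 0.25| beat.round(grid) }; // snap a beat time to the nearest grid point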

Highlights

Here are my favourite tracks:

Day 9: a sequencer bug

Day 12: fun with resonators

Day 20: something to do with Outer Space

Day 22: melancholy fanfare

Day 23: a Mastodon listener asked for a seven-minute version of this one

Day 27: the most successful guitar loop track

Day 30: trying for that last track on a Flying Lotus album feel

Day 31: weird pinball