For the WLM art project that I am working on in cooperation with Michael Johansson, I want to use the output of the shadow machine as a musical instrument. In concrete terms, the idea is to have the shadow images projected in the center of the tableau transformed into sounds in real time.

Test setup of the wandering landscape machine (as of September 2022). The white circle in the middle (“tableau”) serves as a canvas onto which the shadow images are cast. These shadow images need to be converted into sound(s) in real time.

For this purpose, the ancient technology MIDI is used: a protocol that can control synthesizers, electric pianos and groove boxes. A MIDI message looks something like this (in reality it is A LOT more complicated, but it is not necessary to dig into that here):

NoteOn 1 48 106

If this command were sent to an electric piano, it would play a “C” at medium velocity. NoteOn means that a key is pressed. The 1 stands for the channel (you could daisy-chain several MIDI instruments together, and each instrument “listens” only to its assigned channel), the 48 selects the note itself, and 106 is the velocity (range: 0 to 127). If the key is to be released, a corresponding command is needed:

NoteOff 1 48 106
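On the wire, by the way, each of these human-readable messages is just three bytes: a status byte (NoteOn or NoteOff combined with the zero-based channel number) followed by pitch and velocity. A quick sketch in plain Java, using the values from the example above:

```java
public class MidiBytes {
    public static void main(String[] args) {
        int channel  = 0;   // channels are 0-15 on the wire, usually displayed as 1-16
        int pitch    = 48;  // the "C" from the example
        int velocity = 106;

        int noteOnStatus  = 0x90 | channel; // 0x90-0x9F = NoteOn on channels 1-16
        int noteOffStatus = 0x80 | channel; // 0x80-0x8F = NoteOff on channels 1-16

        System.out.printf("NoteOn:  %02X %02X %02X%n", noteOnStatus, pitch, velocity);
        System.out.printf("NoteOff: %02X %02X %02X%n", noteOffStatus, pitch, velocity);
    }
}
```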

If this NoteOff command is not sent, the note triggered by the NoteOn will sound forever. So each NoteOn demands a corresponding NoteOff at some point. It all seems a bit antiquated, but also pretty straightforward. But how do we convert the shadow images into MIDI control signals? For this I wrote a Processing script, which is terribly hacky but serves its purpose. (You can find my spaghetti code in the addendum of this post, so scroll down if you want to skip the explanations.) As a proof of concept this script does the following:

  • It reads the color values of the pixel at the very center of the image.
  • It determines the average color value along an imaginary line that runs from the center of the camera image to the top (this function imitates the needle of a “record player”).

This means that if the shadow image changes because the configuration of the wandering landscape machine is altered, the pixel values change as well.
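The “record player” averaging boils down to sampling one pixel per row along the vertical half-line and taking the mean of one color channel. A minimal plain-Java sketch of that idea, with a hypothetical samplePixel() standing in for Processing's get():

```java
public class LineAverage {
    // Hypothetical stand-in for get(x, y): returns the red channel (0-255)
    // of the pixel at row y on the center column. Dummy data for illustration.
    static int samplePixel(int y) {
        return (y % 2 == 0) ? 100 : 200;
    }

    // Average the red channel from the top (y = 0) down to the image center.
    static int averageTopHalf(int height) {
        int rows = height / 2;
        int sum = 0;
        for (int y = 0; y < rows; y++) {
            sum += samplePixel(y);
        }
        return sum / rows;
    }

    public static void main(String[] args) {
        // 240 rows alternating between 100 and 200 average out to 150
        System.out.println(averageTopHalf(480)); // 150
    }
}
```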

For camera access, Processing's video library ( is used. It also reliably finds the Elgato Streamcam, which I want to use for this purpose. By using the get() command with x and y coordinates (color c2 = get(cam.width/2, cam.height/2);) it is fairly easy to extract the color value in the middle of the live video. An array and a for loop take care of the “record player” requirement, i.e. averaging the pixel color values along an imaginary line from the center to the top.

I experimented with a small GUI for centering the camera on the shadow tableau and for displaying where the color values are read in each frame. But ultimately I scrapped this experiment due to performance issues. (I will need to look into this in the future.)

Left window: the GUI for centering the camera on the shadow image. Right window: Printout of all MIDI-commands that are sent out in real time.

Problem 1: Java uses a really weird system for color values – e.g. when the variable c2 is printed to the console, I get values that look something like this (yes, that’s a minus in front of each value):


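The minus sign comes from Java packing a color into one signed 32-bit int in ARGB order: full opacity sets the topmost alpha bits, which flips the sign bit, so the int prints as negative. A small plain-Java demonstration (the bit shifts are essentially what Processing's red(), green() and blue() do internally):

```java
public class ArgbDemo {
    public static void main(String[] args) {
        int c = 0xFF00FF00;            // opaque pure green, stored as a signed int
        System.out.println(c);         // -16711936

        int red   = (c >> 16) & 0xFF;  // 0
        int green = (c >> 8)  & 0xFF;  // 255
        int blue  =  c        & 0xFF;  // 0
        System.out.println(red + " " + green + " " + blue); // 0 255 0
    }
}
```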
For better usability I output the colors as individual channel values in the range from 0 to 255. This can easily be done with the following functions:

float c2red = red(c2); // red values
float c2green = green(c2); // green values
float c2blue = blue(c2); // blue values

Works fine. However, since the MIDI protocol demands values from 0 to 127 for each parameter, while RGB values range from 0 to 255, these numbers still need to be adjusted. This is done using the map() function:

float c2redMapped = map(c2red, 0, 255, 0, 127);

Naturally this produces odd decimal values. Since these cannot be used in MIDI commands, they still have to be converted to integers:

int c2redMIDI = int(c2redMapped);
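For reference, map() is just a linear rescale; the equivalent arithmetic in plain Java makes it easy to check the extremes (0 stays 0 and 255 lands exactly on 127):

```java
public class MapDemo {
    // Equivalent of Processing's map(value, start1, stop1, start2, stop2)
    static float map(float value, float start1, float stop1,
                     float start2, float stop2) {
        return start2 + (stop2 - start2) * ((value - start1) / (stop1 - start1));
    }

    public static void main(String[] args) {
        System.out.println((int) map(0,   0, 255, 0, 127)); // 0
        System.out.println((int) map(255, 0, 255, 0, 127)); // 127
        System.out.println((int) map(128, 0, 255, 0, 127)); // 63 (truncated from 63.74...)
    }
}
```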

Now it is possible to use these values. To enable Processing to send MIDI to the outside world, I use the library themidibus. First you have to tell themidibus the name of the hardware interface you want to use:

myBus = new MidiBus(this, "$Name", "$Name");

Personally I use a “Swissonic USB MIDI Interface” that I bought very cheaply some years ago.

The old USB 1.1 MIDI interface that is used in this prototype. MIDI uses a 5 pin DIN connection while the synthesizer (Arturia Microfreak) only features a 3.5mm jack. Therefore an adapter is needed.

Specifically, we need to find out what to use instead of $Name. To do this, we let the console print all detected MIDI devices (themidibus provides the call MidiBus.list() for exactly this):


🤖 Attention: MIDI is quite an ancient protocol. Accordingly, the devices are sometimes named cryptically (and in my case with umlauts that Processing does not recognize). I strongly recommend copying and pasting from the console output here. For example, my device was only recognized if there were two (!) spaces after the name (where normally a German umlaut would have been). So instead of USB-MIDI-Gerät I have to write USB-MIDI-Ger .

myBus = new MidiBus(this, "USB-MIDI-Ger  ", "USB-MIDI-Ger  ");

Once all this is done we can use

myBus.sendNoteOn($Channel, $Pitch, $Velocity);
myBus.sendNoteOff($Channel, $Pitch, $Velocity);

to send MIDI commands from the laptop to the USB interface. In doing so, we need to replace $Channel, $Pitch and $Velocity with the mapped color values rounded to integers, e.g. like this:

myBus.sendNoteOn(1, c2redMIDI, c2blueMIDI);

I did this as a test setup with the Arturia Microfreak as the receiver of these MIDI commands. That means a camera is effectively in control of this synthesizer (and produces horrible sounds at the moment). From now on, my job will be to tinker with the code in order to coax somewhat “pleasant” sounds out of this setup.

First proof of concept: Arturia Microfreak triggered by the MacBook’s webcam via MIDI in realtime.
The “somewhat” final setup
The whole setup as an early prototype

  • The MIDIbus library: Link
  • MIDI monitor, a nice Mac app that lists all MIDI commands that are sent/received in real time: Link
  • The Arturia Microfreak, the small hardware synthesizer that is used in this experiment: Link (Link to the German manual)
  • VCV Rack, a software “Eurorack Synthesizer” that was used before the Microfreak arrived: Link
  • MIDI Kompendium: Link (German)
  • 🐱 Very high recommendation: The chapter about MIDI in the Introduction to Computer Music from Indiana University: Link

Stuff I want to look into in the future

  • a MIDI synthesizer/playback tool made in Processing 3/4: Link


Below you'll find the whole Processing script as of today (February 13th, 2023). This script is 100% work in progress and still in heavy development (i.e. I tinker with it until something falls apart). Use it at your own risk.

import*; // camera lib
import themidibus.*; // MIDI lib

MidiBus myBus; // The MidiBus
Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480, 5);
  cam.start();
  // device name copied from the console output (note the two trailing spaces)
  myBus = new MidiBus(this, "USB-MIDI-Ger  ", "USB-MIDI-Ger  ");
}

void draw() {
  if (cam.available()) {;
  }
  image(cam, 0, 0);

  // get color at center of image
  // todo: could be replaced with Record[cam.height/2] etc.
  color c2 = get(cam.width/2, cam.height/2);

  // get color in line from middle to top
  // "record-player-approach"
  // initialize array
  int[] Record = new int[cam.height/2];
  for (int i = 0; i < cam.height/2; i++) {
    color c1 = get(cam.width/2, i);
    // extract only red color channel
    float redArray = red(c1);
    // map color value range (0-255) to MIDI parameter range (0-127)
    float redMapped = map(redArray, 0, 255, 0, 127);
    // cast float into int
    int redRecord = int(redMapped);
    // write current value into array
    Record[i] = redRecord;
  }

  // calculate average of array
  float average = 0;
  for (int j = 0; j < Record.length; j++) {
    average += Record[j];
  }
  average /= (float) Record.length;
  int average2 = int(average);

  // draw rectangle filled with the current array average as a gray value
  fill(average2);
  rect(20, 20, 40, 40, 20);

  // convert the strange Java colors to something more readable
  float red = red(c2);
  float green = green(c2);
  float blue = blue(c2);
  // map each color range (0-255) to MIDI parameter range & convert to int
  int MIDIred = int(map(red, 0, 255, 0, 127));
  int MIDIgreen = int(map(green, 0, 255, 0, 127));
  int MIDIblue = int(map(blue, 0, 255, 0, 127));

  int channel = 2; // themidibus channels are 0-based, so this sends on MIDI channel 3
  myBus.sendNoteOn(channel, MIDIred, MIDIblue);
  delay(100); // give the note some audible length before releasing it
  myBus.sendNoteOff(channel, MIDIred, MIDIblue);
}

// busy-wait replacement for delay(), which cannot be used inside draw()
void delay(int time) {
  int current = millis();
  while (millis() < current + time) Thread.yield();
}