Racer Session #114 | netcat | April 15, 2012

Communication is a part of life. People communicate. With inventions like the internet, cell phones, Facebook, etc., communication is more implicit and more pervasive than ever before: we are always communicating with one another, and we are often unaware of it. For example, while you may not know it, your computer is almost continuously chatting away with all the other computers it is connected to by radio signal or by wire. In most instances, communication between computers is implicit in the use of a computer. The phenomenon is hardly limited to computers: when we meet someone new, we learn a great deal about them even without exchanging very many words. The communication is implicit and continuous, and we may not know it is happening.

Implicit communication is a part of improvised music too. We are communicating with one another on stage, but as with communication in life and between computers, it happens implicitly, and we may be unaware of it. Awareness of this communication between musicians matters, though: by focusing on communication between performers during the session, we can vastly improve our composed pieces and free improvisations. Join us Sunday to think about the role of communication in music.

Our piece: netcat

Throughout the piece, we will be hyper-focused on listening to what is happening, and reacting to it accordingly. We have combined computer-generated communications with the human ability to feel, plan, and react. Our primary goal in performing this piece is to highlight the difficulty and importance of communication between performers.

To do so, we are using 8 computer “performers” in our set. Each computer generates network traffic. Custom software we’ve written translates the generated network traffic into musical sounds that the audience can hear. Some of the network traffic in our piece is carefully prepared, and some will be completely spontaneous and out of control, much like any piece of music that involves improvisation.

On the human side of things, we will control the 8 computers we set up, and will generate network traffic improvisationally by manipulating computer hardware and software. We will also play drums and guitar, reacting to the communications generated in real-time.

Guidelines for free improvisations

We’d like to impose some constraints on the improvisations tonight, to focus on the idea of communication:

1. Be aware of what’s going on around you, and what is being communicated.

You can choose to react or not, but you have to be aware before you can make the choice.

2. Limit group size to 4.

The more participants in an improv, the harder it becomes to focus and listen. We expect smaller groups will make it easier to focus on listening and communicating.

3. No discussion/planning before each improv.

Starting with preconceived notions of what an improv should be may lead players to focus on the “plan”, rather than staying aware of their surroundings.


Implementation Details

The rest of our post details the technical aspects of what we have built.  Non-geeks can stop here.  For the rest of you, we describe the hardware setup, the software we wrote, and how to configure the software we didn’t write, if you want to replicate all or part of what we did.

The computers
We each have a machine that is running the custom software we wrote. In addition, there are 6 other machines generating and sending traffic to David’s computer over a wired local area network.

The software
All source code is available on GitHub at: https://github.com/blucia0a/Network-Music

Our software is split into 5 parts:

1) Virtual MIDI keyboards
We found out that we could easily create multiple virtual MIDI keyboards without writing a single line of extra code. To do that, we used OS X’s “Audio MIDI Setup” program.

Inside that application, under Window > MIDI Window, there is an option for a thing called “IAC Driver”:

Screenshot: http://cl.ly/0H3Z143U2y1W1d2v2y3s

Once you bring that device online, you can then add as many ports as you like (we created 6), which will each correspond to one virtual MIDI keyboard:

Screenshot: http://cl.ly/2w2r3y3Z2r1T2X062D3O

Once that’s configured, you should have 6 virtual keyboards that you can map in your audio rack (Rax, MainStage, Reason, Ableton, etc.):

Screenshot: http://cl.ly/280X1y2T2G1H392O293w

2) tshark
We use the tshark (v 1.6.5) program to intercept all traffic that comes over our computer’s network card. tshark is part of the Wireshark suite, and can be installed via Homebrew on OS X, or probably as part of a Wireshark binary installer.

https://github.com/blucia0a/Network-Music/blob/master/get_traffic.sh
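
The capture command we actually run is in get_traffic.sh (linked above). As a rough sketch of the idea only, here is what a small Perl wrapper around tshark might look like; the interface name ("en0") and the list of fields are illustrative placeholders, not necessarily the options the script uses:

#!/usr/bin/perl
# Sketch: read live packet summaries from tshark, one line per packet.
# The interface name and the -e field list are placeholders; see
# get_traffic.sh for the options we actually use.
use strict;
use warnings;

open(my $cap, '-|',
     'tshark', '-i', 'en0', '-l',                     # capture on en0, line-buffered
     '-T', 'fields',                                  # print only the selected fields
     '-e', 'ip.src', '-e', 'tcp.dstport', '-e', 'frame.len')
  or die "couldn't start tshark: $!";

while (my $line = <$cap>) {
    chomp $line;
    my ($src_ip, $dst_port, $len) = split /\t/, $line;
    next unless $src_ip;                              # skip frames with no IP layer
    print "packet from $src_ip to port ", ($dst_port || '?'), ", $len bytes\n";
}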

3) Sift.pl

Sift.pl is a Perl script we wrote that takes tshark output and converts it into MIDI messages destined for one of our 6 virtual MIDI keyboards. It translates network packets into (keyboardNumber, note, duration, velocity) tuples and outputs them in real time. Sift uses properties of each packet to decide which MIDI command to emit. Each host (identified by IP address) is assigned a different keyboard number. The set of notes each host may play, and the duration of a host’s notes, can be defined in a configuration file: Sift selects randomly from the specified set of notes and picks a duration within the specified range. The velocity is a function of the TCP port of the network traffic.

https://github.com/blucia0a/Network-Music/blob/master/Sift.pl
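
To make that mapping concrete, here is a stripped-down sketch of the translation step. The host table, note sets, duration ranges, and the port-to-velocity formula below are made-up examples for illustration, not the configuration format or exact logic of Sift.pl:

#!/usr/bin/perl
# Illustrative sketch of the packet -> (keyboard, note, duration, velocity)
# translation; the real logic and config file format live in Sift.pl.
use strict;
use warnings;

# One virtual keyboard per host, plus the notes/durations it may use.
# These values are invented for the example.
my %hosts = (
    '192.168.1.10' => { keyboard => 1, notes => [60, 63, 67], dur => [250_000, 1_000_000] },
    '192.168.1.11' => { keyboard => 2, notes => [48, 52, 55], dur => [500_000, 2_000_000] },
);

sub packet_to_tuple {
    my ($src_ip, $tcp_port) = @_;
    my $h = $hosts{$src_ip} or return;                        # unknown hosts are ignored

    my $keyboard = $h->{keyboard};
    my $note     = $h->{notes}[ int rand @{ $h->{notes} } ];  # random note from the host's set
    my ($lo, $hi) = @{ $h->{dur} };
    my $duration = $lo + int rand($hi - $lo);                 # microseconds, within the range
    my $velocity = 1 + ($tcp_port % 127);                     # one possible port -> velocity mapping

    return ($keyboard, $note, $duration, $velocity);
}

# Example: a packet from 192.168.1.10 to TCP port 80
my @tuple = packet_to_tuple('192.168.1.10', 80);
print "@tuple\n";    # e.g. "1 67 733412 81"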

4) Streamy

Streamy is a multi-threaded C program that listens for output from Sift.pl, and plays our virtual MIDI keyboards in real time.

When it receives a message like:

1 60 1000000 127

it will play middle C (60) on keyboard 1 for a duration of 1,000,000 microseconds (1 second), at a MIDI velocity of 127 (MIDI velocity ranges from 1 to 127).

https://github.com/blucia0a/Network-Music/blob/master/Streamy.c
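
Under the hood, “playing” a key is just a pair of standard MIDI channel-voice messages: a note-on when the key goes down and a note-off once the duration has elapsed. The Perl below only illustrates that message layout (channel 0 assumed); Streamy itself is C, and how it actually delivers the bytes to the chosen IAC port is in Streamy.c:

#!/usr/bin/perl
# Sketch of how a "keyboard note duration velocity" line maps onto MIDI.
# This only shows the standard MIDI byte layout; Streamy.c has the real
# delivery code.
use strict;
use warnings;
use Time::HiRes qw(usleep);

my $line = "1 60 1000000 127";              # middle C on keyboard 1 for one second
my ($keyboard, $note, $duration, $velocity) = split ' ', $line;

# Standard MIDI channel-voice messages (channel 0 assumed):
my @note_on  = (0x90, $note, $velocity);    # status 0x90 = note-on, channel 0
my @note_off = (0x80, $note, 0);            # status 0x80 = note-off, channel 0

printf "keyboard %d <- note-on  %02X %02X %02X\n", $keyboard, @note_on;
usleep($duration);                          # hold the note for $duration microseconds
printf "keyboard %d <- note-off %02X %02X %02X\n", $keyboard, @note_off;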

5) Rax

Rax is a program that allows us to wire up MIDI keyboards to software instrument patches. Apple’s MainStage does the same thing, as do Ableton Live, Logic Pro, and others.

Screenshot: http://cl.ly/1H2k361W3L2o43101B11

We basically just map our 6 virtual MIDI keyboards to software MIDI patches on our machine, and whenever Streamy “presses” a key on our virtual keyboards, Rax receives the MIDI message and plays a note.  Thanks, Rax!

http://www.audiofile-engineering.com/rax/