Running Expressions

Running Expressions is a real-time performance composition using bio-feedback and remote controllers. Written primarily in Kyma and Max/MSP, the piece captures live physiological data to create and control music within an 8-channel audio and video projection environment. The musical performance narrates a distance run and the psychological and emotional impacts of the running experience.

+ Download Documentation .pdf and the performance software (Max/MSP/Jitter, OSCulator, and Processing) files. (.zip, 11.5 MB)

+ Download Kyma performance audio files. (.zip, 45.3 MB)

+ Download Thesis documentation separately. (.pdf, 11.2 MB)

Play! Sequence

Play! Sequence is a multimedia installation for iPod Touch, USB camera, and VGA video display, built with the TouchOSC, Max/MSP/Jitter, and Isadora software applications. By creating a multitouch sequencer that controls the playback of audio and video masks, Play! Sequence enables the user to interact simultaneously with the space’s sonic and visual environment.

The iPod Touch provides a familiar vocabulary for the user and for the tactile interactions. The user can create, edit, and delete three synchronous sequences of sixteen steps, changing the evolution and complexity of the piece over time.

Each of the three sequences represents a sonic timbre and a color mask that mirror the user’s actions. For each sonic timbre, the user has control over pitch, rhythm, and amplitude. The color masks follow the sounds across the screen, repeating from the left at the start of each loop. The masks help visualize the user’s tactile and sonic experience by revealing the user inside the space, and each mask represents one of the three elements of the RGB color model.
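The three-lane, sixteen-step model described above can be sketched in code. This is a minimal illustration under stated assumptions; the class and field names (`Sequencer`, `Step`, `pitch`, `amplitude`) are hypothetical stand-ins, not the actual Max/MSP patch or its message names.

```python
# Illustrative sketch of three synchronous 16-step sequences that loop,
# with per-step pitch and amplitude. Names are assumptions for clarity.
from dataclasses import dataclass
from typing import List, Optional

STEPS = 16
SEQUENCES = 3  # one lane per RGB color-mask channel

@dataclass
class Step:
    pitch: int        # e.g. a MIDI note number
    amplitude: float  # 0.0 - 1.0

class Sequence:
    """One looping 16-step lane; empty steps are rests."""
    def __init__(self) -> None:
        self.steps: List[Optional[Step]] = [None] * STEPS

    def create(self, index: int, pitch: int, amplitude: float = 0.8) -> None:
        self.steps[index % STEPS] = Step(pitch, amplitude)

    def delete(self, index: int) -> None:
        self.steps[index % STEPS] = None

class Sequencer:
    """Three lanes advanced by a shared clock, as in the installation."""
    def __init__(self) -> None:
        self.lanes = [Sequence() for _ in range(SEQUENCES)]
        self.position = 0

    def tick(self) -> List[Optional[Step]]:
        """Advance one step; return the active step (or rest) per lane."""
        active = [lane.steps[self.position] for lane in self.lanes]
        self.position = (self.position + 1) % STEPS  # repeat from the left
        return active
```

Editing a lane while the clock runs changes what the next loop plays, which is how the piece's evolution and complexity shift over time.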

Play! Sequence operates within the framework of natural human interaction, playing off of our curiosity and our engagement with objects that we can creatively control. The user manipulates and interacts with the sounds and visuals in real time, driven by the immediate feedback that the system provides.

Kinect-Via- Interface Series

Kinect-Via- is a Max/MSP interface series for composers wanting to route and map user-tracking data from the Xbox Kinect. The series complements four different OpenNI applications: OSCeleton, Synapse, Processing’s simple-openni library, and Delicode’s NIMate. All Max/MSP interfaces communicate using OSC (Open Sound Control) messages and are performance-ready, meaning that all routing and system options may be changed in real time. The Kinect-Via- interfaces offer a tangible solution for anyone wishing to explore user tracking with the Kinect for creative applications. The series currently has over 1,000 downloads worldwide. Note: tested with Max 5 and OS X 10.6.8.
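The routing idea behind the interfaces can be sketched outside Max/MSP. The following is a hedged illustration only: it assumes OSCeleton-style messages of the form `("/joint", joint_name, user_id, x, y, z)` with roughly normalized coordinates, and the `JointRouter` class and its callbacks are hypothetical, not part of the actual patches.

```python
# Illustrative sketch: route per-joint skeleton data to swappable mapping
# callbacks, the way the Kinect-Via- patches route OSC in real time.

def scale(value, lo, hi, out_lo=0.0, out_hi=1.0):
    """Linearly map value from [lo, hi] to [out_lo, out_hi], clamped."""
    t = (value - lo) / (hi - lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

class JointRouter:
    """Dispatch joint messages to user-supplied mapping functions."""
    def __init__(self):
        self.mappings = {}  # joint name -> callback(x, y, z)

    def map_joint(self, joint, callback):
        # Mappings can be replaced mid-performance, mirroring the
        # "performance-ready" routing described above.
        self.mappings[joint] = callback

    def handle(self, address, joint, user_id, x, y, z):
        if address == "/joint" and joint in self.mappings:
            self.mappings[joint](x, y, z)

# Example: map right-hand height to a 0-127 controller value.
router = JointRouter()
values = []
router.map_joint("r_hand",
                 lambda x, y, z: values.append(round(scale(y, 0.0, 1.0, 0, 127))))
router.handle("/joint", "r_hand", 1, 0.4, 0.25, 1.8)
```

In the real interfaces this dispatch and scaling happens inside Max/MSP objects; the sketch only shows the shape of the data flow.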

White paper (.pdf)

Kinect-Via-OSCeleton. (.zip)
OSCeleton application

Kinect-Via-Synapse. (.zip)
Synapse application

Kinect-Via-Processing. (.zip)
Processing library

Kinect-Via-NIMate. (.zip)
NIMate application

Projects utilizing Kinect-Via-

Human Chimes. Human Chimes is an interactive public installation in which participating users become triggered sounds that interact with all other users inside the space. The Kinect mapping uses Kinect-Via-OSCeleton.

The Beat. The Kinect user’s hand and head movements are mapped to filters, and at times hand gestures actuate sound. The Kinect mapping uses Kinect-Via-Synapse. “The Beat” is a composition by Nathan Asman.

Juggling Music (Arthur Wagenaar). Playing music by juggling with glow balls: a demonstration of this new self-made musical instrument, controlled by juggling and also known in Dutch as ‘De Kleurwerper’.

The Goddess Re:Membered

Commissioned for the 2011 Fringe Festival, The Goddess Re:membered is a site-specific work and multimedia response to The Goddess, a classic Chinese silent film from 1934. The interactive installation uses video projection, an IR camera, and the Max/MSP and Isadora software applications. As users interact inside the space, clues to distant memories are revealed through triggered color, sound, and video masks.

[Coded] In Passing

In Passing articulates the journey of an interactive dialogue between performers, in which the germinal communicative motive becomes entangled as the conversation evolves. The clarity and complexity of the performers’ conversation take form through video projection and live performance. The unfolding progression of this fluctuating relationship seeks to draw attention to interpersonal relationships.

with: James Bean, Emily McPherson, Mark Knippel, Mike Stephen, and Katherine Spinella