simpleKinect Application

simpleKinect is an interface application for sending data from the Microsoft Kinect to any OSC-enabled application. The application aims to improve upon similar software by exposing more OpenNI features and offering more user control.

The interface was built with Processing, using the controlP5, oscP5, and simple-openni libraries. Because I used open-source tools, and because the nature of the project is to stimulate creativity, simpleKinect is free to use.

simpleKinect Features

  • Auto-calibration.
  • Specify OSC output IP and Port in real time.
  • Send the CoM (center of mass) coordinates of all users inside the space, regardless of skeleton calibration.
  • Send skeleton data (single user), on a joint-by-joint basis, as specified by the user.
  • Manually switch between users for skeleton tracking.
  • Individually select between three joint modes (world, screen, and body) for sending data.
  • Individually set the OSC output address for any joint.
  • Save/load application settings.
  • Send distances between joints, in millimeters. (On by default.)
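To give a rough picture of what travels over the wire when a joint is enabled, here is a minimal pure-Python encoder for the OSC 1.0 binary format (NUL-padded strings, a type-tag string, big-endian float32 arguments). The address pattern /user/1/head and the coordinate values are hypothetical illustrations, not simpleKinect's actual defaults.

```python
import struct

def osc_string(s: str) -> bytes:
    """NUL-terminate a string and pad to a multiple of 4 bytes, per OSC 1.0."""
    b = s.encode("ascii") + b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *values: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    type_tags = "," + "f" * len(values)          # e.g. ",fff" for x, y, z
    payload = b"".join(struct.pack(">f", v) for v in values)
    return osc_string(address) + osc_string(type_tags) + payload

# Hypothetical joint message: x, y, z in millimeters (world mode).
packet = osc_message("/user/1/head", 120.0, 850.5, 2200.0)
```

In practice a receiving application (Max/MSP, Kyma, etc.) decodes such packets from UDP; this sketch only shows the message layout itself.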



simpleKinect FAQ page

Note: This app was exported as a Mac version. If you are a PC user and are interested in a beta PC version, you may visit the FAQ page to download it.

simpleKinect is by Jon Bellona

Kinect-Via- Interface Series

Kinect-Via- is a Max/MSP interface series for composers wanting to route and map user-tracking data from the XBox Kinect.


White paper (.pdf)

  • Kinect-Via-OSCeleton (.zip), for the OSCeleton application
  • Kinect-Via-Synapse (.zip), for the Synapse application
  • Kinect-Via-Processing (.zip), for the Processing simple-openni library
  • Kinect-Via-NIMate (.zip), for the NIMate application


NON-MAX/MSP USERS: The standalone application of Kinect-Via-Synapse may be downloaded here.

NON-MAX/MSP USERS (BLENDER): For those working with Blender (and who don't use Max), I created a standalone version of Kinect-Via-Synapse that sends individual OSC messages per joint. This type of messaging works with Blender's AddOSC plugin (as of 10/2016). This standalone version, 1.2.2, may be downloaded here.




What is the Kinect-Via- interface series?

Kinect-Via- is a Max/MSP interface series for composers wanting to route and map user-tracking data from the XBox Kinect. The interface series complements four different OpenNI applications, namely OSCeleton, Synapse, Processing's simple-openni library, and Delicode's NIMate. All Max/MSP interfaces communicate using OSC (Open Sound Control) messages and are performance-ready, meaning that all routing and system options may be changed in real time. The Kinect-Via- interfaces offer a tangible solution for anyone wishing to explore user tracking with the Kinect for creative applications. Please read the documentation in the .zip files to learn more. Note: the latest version has been tested with Max 5.1 and OS X 10.6.8.
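The core routing idea described above (incoming OSC address patterns matched to destinations that can be remapped while a performance is running) can be sketched in a few lines. This is an illustrative Python sketch only, not code from the Kinect-Via- patches, and the address names are made up.

```python
from typing import Callable, Dict, List

class OscRouter:
    """Toy OSC-style router: maps address patterns to handlers."""

    def __init__(self) -> None:
        self.routes: Dict[str, Callable[[List[float]], None]] = {}

    def map(self, address: str, handler: Callable[[List[float]], None]) -> None:
        # Remapping is just reassignment, so routes can change in real time.
        self.routes[address] = handler

    def dispatch(self, address: str, args: List[float]) -> None:
        handler = self.routes.get(address)
        if handler is not None:
            handler(args)

received = []
router = OscRouter()
router.map("/head", lambda args: received.append(("filter", args)))
router.dispatch("/head", [0.1, 0.2, 0.3])

# Live remap mid-"performance": the same joint now drives a different target.
router.map("/head", lambda args: received.append(("amplitude", args)))
router.dispatch("/head", [0.4, 0.5, 0.6])
```

In Max/MSP the same pattern is typically built from objects like [udpreceive] and [route]; the sketch only shows the mapping logic.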

Kinect-Via- FAQ page

You can also Shout! if you have questions: jpbellona [ ] yahoo [ ] com

Kinect-Via- is by Jon Bellona


Since I get questions about models and hardware, I have provided links to related gear below, noting what I have tested or used in projects and what I have not.

The first links are to the original XBox 360 Kinect and the tripod stand I've used in my software and projects. The third item is a cheaper refurbished XBox 360 Kinect; it should be the 1414 model, but I have not ordered one myself. The last is the newer XBox One Kinect, which I have not tested with the software above. Please let me know if it works!




Projects using simpleKinect


Casting: Kinect, Kyma (2013)


Casting is a real-time composition for a single performer using the Microsoft Kinect and Kyma. The piece embodies both the programmatic and the magical use of the term. By giving form to gesture that conjures sound and visual elements, a body's movement becomes intertwined with the visceral. The performer's body 'throws' and controls sound, enabling the viewer to perceive sound as transfigured by motion. In this way, music becomes defined by the human mold of the performer and listener.


AV@AR (from Manfred Borsch) (2014)

The interactive and audiovisual installation av@ar puts the relationship between people and their medial reflection at the center of the experience. In this closed circuit installation, the control, dependencies, and aesthetic reference levels can be explored and controlled interactively with the entire body. A catalogue of emotions serves as audiovisual communication material and leads to a reflection about the systematic use of feelings. The dance with the medial mirror opens a space of experience in the tense atmosphere between the non-digital and the digital ego - the avatar.


Steamfields (Bodyscape excerpt) (2014)

Mary Mainsbridge plays the Telechord, an instrument she developed using simpleKinect, Max/MSP, and Modalys. This is part of her PhD research into movement-based interactive music performance. The video is an excerpt of her performance at the Sydney Fringe Festival, in Sept. 2014. Robbie Mudrazija is on drums. Learn more about Mary and her work by visiting deprogram.net


On the Fragmentation of Memory n.5 (from Aisha Pagnes) (2016)

Aisha Pagnes's installation aims to explore the transference, translation, and malleability of autobiographical emotive memory. This single-viewer experience enables an interactive encounter with the echoic memory of others. Her work asks the question, "Can the translation of autobiographical memories into sound effectively evoke the associated emotions within others?" Learn more about Aisha and her work by visiting http://cargocollective.com/aishapagnes/On-the-Fragmentation-of-Memory-n-5



Projects using the Kinect-Via- interface series


Human Chimes: Kinect, Processing, Max/MSP/Jitter, Kyma and video projector (2011)

Human Chimes is an interactive public installation. Participating users become triggered sounds that interact with all other participating users inside the space. By dynamically tracking users' locations in real time, the piece maps participants as sounds that pan around the space according to the participants' positions. The Kinect mapping uses Kinect-Via-OSCeleton.
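As a rough idea of the kind of mapping such a piece performs, here is a hedged sketch that converts a tracked x position (in millimeters, the unit OpenNI uses for world coordinates) into a stereo pan value in [0, 1]. The room bounds are invented for illustration and are not taken from Human Chimes.

```python
def x_to_pan(x_mm: float, left_mm: float = -2000.0, right_mm: float = 2000.0) -> float:
    """Map a user's x coordinate to a pan position: 0.0 = left, 1.0 = right.

    The 4-meter-wide room (left_mm..right_mm) is a made-up example range;
    positions outside it are clamped to the nearest edge.
    """
    pan = (x_mm - left_mm) / (right_mm - left_mm)
    return min(1.0, max(0.0, pan))
```

A real installation would feed values like this to a panner in Kyma or Max once per tracking frame; multi-speaker panning would use an angle or vector instead of a single scalar.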


The Beat: Kinect, Max/MSP/Jitter, Ableton Live, Synthesizer (2012)


The Kinect user's hand and head movements are mapped to filters, and at times hand gestures actuate sound. The Kinect mapping uses Kinect-Via-Synapse. "The Beat" is a composition by Nathan Asman.


Juggling Music (from Arthur Wagenaar) (2012)

Playing music by juggling with glowballs! A demonstration of this new self-made musical instrument, controlled by juggling. Also known (in Dutch) as 'De Kleurwerper'.


Life in 3D Installation (from Alex Galler) (2013)

Alex Galler and his team use Kinect-Via-Synapse to transform users' movements into pan and audio controls within a multi-channel environment.


Watch Alex explain how his team's system works.