The aim of introducing interactivity in this piece was to create a live performative environment in which movement can affect both the sound and the image. To accomplish this in a meaningful way, something beyond a one-to-one mapping between sensor readings and effect parameters had to be created: data does not become information until it is analyzed and transformed. To that end, different pattern and gesture recognition algorithms are applied to the values coming from the sensors in order to extract the key features of the choreography. This approach allows the performer to establish a set of movements that forms a new language within the scope of a particular performance.

Max was used to process the sensor data. The show was set up in VDMX with several custom Quartz Composer effects that distort the real-time camera image. Two openFrameworks applications were also created: a particle system based on flocking behaviour and a drawing system.
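The text does not name the specific recognition algorithms that were applied to the sensor values, so the following is only a minimal sketch of one common approach to this kind of problem: comparing an incoming window of sensor values against stored gesture templates with dynamic time warping. All names here (`GestureTemplate`, `dtwDistance`, `recognise`, the sample data) are hypothetical and are not taken from the piece itself.

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>
#include <limits>
#include <string>
#include <vector>

// One recorded gesture: a label plus a time series of normalised sensor samples.
struct GestureTemplate {
    std::string label;
    std::vector<float> samples;
};

// Classic dynamic time warping distance between two 1-D series.
float dtwDistance(const std::vector<float>& a, const std::vector<float>& b) {
    const float INF = std::numeric_limits<float>::infinity();
    std::vector<std::vector<float>> d(a.size() + 1, std::vector<float>(b.size() + 1, INF));
    d[0][0] = 0.0f;
    for (size_t i = 1; i <= a.size(); ++i) {
        for (size_t j = 1; j <= b.size(); ++j) {
            float cost = std::fabs(a[i - 1] - b[j - 1]);
            d[i][j] = cost + std::min({ d[i - 1][j], d[i][j - 1], d[i - 1][j - 1] });
        }
    }
    return d[a.size()][b.size()];
}

// Return the label of the template closest to the live window of sensor values.
std::string recognise(const std::vector<float>& window,
                      const std::vector<GestureTemplate>& templates) {
    std::string best = "none";
    float bestDist = std::numeric_limits<float>::infinity();
    for (const auto& t : templates) {
        float dist = dtwDistance(window, t.samples);
        if (dist < bestDist) { bestDist = dist; best = t.label; }
    }
    return best;
}

int main() {
    std::vector<GestureTemplate> templates = {
        { "raise_arm", { 0.1f, 0.3f, 0.6f, 0.9f } },
        { "lower_arm", { 0.9f, 0.6f, 0.3f, 0.1f } },
    };
    std::vector<float> live = { 0.15f, 0.35f, 0.55f, 0.85f };  // e.g. one accelerometer axis
    std::cout << "recognised gesture: " << recognise(live, templates) << "\n";
}
```

Whatever the actual algorithms were, the important point is the same: the recogniser outputs discrete, named events or higher-level features rather than raw sensor values, and it is these that drive the sound and image.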

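The section does not say how the values processed in Max reached VDMX and the openFrameworks applications. OSC is a common transport in this kind of setup, so the sketch below only illustrates that possibility; the port number and address are assumptions, not details from the piece.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// Hypothetical receiver: listens for features that Max extracts from the sensors
// and exposes them to the visual code (particle system, drawing system).
class SensorInput {
public:
    ofxOscReceiver receiver;
    float energy = 0.0f;                 // e.g. overall movement energy computed in Max

    void setup() {
        receiver.setup(9000);            // assumed listening port
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            if (m.getAddress() == "/gesture/energy") {   // assumed OSC address
                energy = m.getArgAsFloat(0);
            }
        }
    }
};
```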

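The flocking particle system is only described in a single sentence, so the sketch below is an assumption about its general structure rather than the actual application: a standard boids update (cohesion, alignment, separation) inside a minimal openFrameworks app. The weights and neighbourhood radii are illustrative.

```cpp
#include "ofMain.h"

// One agent of the flock.
struct Boid {
    glm::vec2 pos, vel;
};

class ofApp : public ofBaseApp {
public:
    std::vector<Boid> boids;

    void setup() override {
        ofBackground(0);
        for (int i = 0; i < 200; ++i) {
            boids.push_back({ { ofRandomWidth(), ofRandomHeight() },
                              { ofRandom(-1, 1), ofRandom(-1, 1) } });
        }
    }

    void update() override {
        for (auto& b : boids) {
            glm::vec2 cohesion(0.0f), alignment(0.0f), separation(0.0f);
            int neighbours = 0;
            for (const auto& other : boids) {
                float d = glm::distance(b.pos, other.pos);
                if (&other == &b || d > 80.0f) continue;          // only nearby boids count
                cohesion  += other.pos;
                alignment += other.vel;
                if (d > 0.0f && d < 25.0f) separation += (b.pos - other.pos) / d;
                ++neighbours;
            }
            if (neighbours > 0) {
                cohesion  = (cohesion / (float)neighbours - b.pos) * 0.005f;   // steer to centre
                alignment = (alignment / (float)neighbours - b.vel) * 0.05f;   // match velocity
                b.vel += cohesion + alignment + separation * 0.05f;            // avoid crowding
            }
            b.vel = glm::normalize(b.vel) * 2.0f;                  // keep a constant speed
            b.pos += b.vel;
            b.pos.x = fmodf(b.pos.x + ofGetWidth(),  ofGetWidth());   // wrap at the edges
            b.pos.y = fmodf(b.pos.y + ofGetHeight(), ofGetHeight());
        }
    }

    void draw() override {
        ofSetColor(255);
        for (const auto& b : boids) ofDrawCircle(b.pos.x, b.pos.y, 2);
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

In the actual piece the recognised gestures would modulate parameters of a system like this (and of the drawing system) instead of leaving them fixed, which is what distinguishes the mapping from a simple one-to-one sensor-to-parameter connection.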

