The Interpreter is an interactive installation that invites visitors to explore visualizations and sonifications of motion data from a pre-recorded dance sequence.
This installation is based on “Breakdown”, an audiovisual dance performance presented at the “Ears, Eyes and Feet” concert in the B. Iden Payne Theater, UT Austin, in May 2014. During rehearsals, the dancer’s movement sequences were captured with an Xbox Kinect camera and saved to a database. Each movement sequence is represented as points in space and time derived from the dancer’s silhouette and skeleton. The interactive system continuously reads this motion data to generate real-time graphic movement visualizations: paths traced by displacements in space, the silhouette’s shape morphing over time, and geometric patterns built from points and lines connected to virtual body joints.
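As a rough illustration of how such recorded motion data might be structured and replayed, here is a minimal sketch in Python. The field names and the frame layout are assumptions for illustration; the actual database schema used in the installation is not described here.

```python
from dataclasses import dataclass

# Hypothetical record for one captured Kinect frame: a timestamp,
# 2D silhouette outline points, and named 3D skeleton joints.
@dataclass
class MotionFrame:
    t: float                                          # timestamp (seconds)
    silhouette: list[tuple[float, float]]             # 2D outline points
    skeleton: dict[str, tuple[float, float, float]]   # joint name -> 3D position

def frames_between(frames, t0, t1):
    """Return the recorded frames inside a playback window [t0, t1)."""
    return [f for f in frames if t0 <= f.t < t1]

frames = [
    MotionFrame(0.0, [(0.1, 0.2)], {"hand_left": (0.0, 1.0, 2.0)}),
    MotionFrame(0.5, [(0.2, 0.3)], {"hand_left": (0.1, 1.1, 2.0)}),
]
print(len(frames_between(frames, 0.0, 0.4)))  # 1
```

A real-time visualizer would repeatedly call something like `frames_between` with a sliding window to drive the graphics loop.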
There are four different visualization modes:
0. Dancer silhouette;
1. Circle at the center of the body;
2. Horizontal lines connecting body extremities;
3. Body polygon.
Through a touch interface, the audience chooses the mode according to the number of fingers on the screen; moving the fingers vertically and horizontally changes the perspective and viewpoint around the 3D structure.
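The touch mappings described above could be sketched as follows. This is a plausible reconstruction, not the installation’s actual code: the mode names, the finger-count-to-mode rule, and the drag sensitivity are assumptions.

```python
# Hypothetical mapping from touch input to visualization mode and
# camera angles: finger count selects the mode (clamped to 0-3),
# and finger movement orbits the viewpoint around the 3D structure.
MODES = ["dancer silhouette", "center circle", "horizontal lines", "body polygon"]

def select_mode(num_fingers):
    # Clamp to the available modes; the real gesture logic may differ.
    return MODES[max(0, min(num_fingers, len(MODES) - 1))]

def orbit_camera(yaw, pitch, dx, dy, sensitivity=0.25):
    # Horizontal drag rotates around the vertical axis;
    # vertical drag tilts the viewpoint.
    return yaw + dx * sensitivity, pitch + dy * sensitivity

print(select_mode(2))          # horizontal lines
print(orbit_camera(0.0, 0.0, 40.0, -20.0))  # (10.0, -5.0)
```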
The background sound is generated by a granular synthesizer controlled by the dancer’s movements. The granulator re-processes the bass and emits discrete grains. The quantity of motion of the performer’s hands is correlated with grain density, and the area of the bounding box around the performer is correlated with the center frequency of a low-pass filter. Depending on the mode, a different soundtrack (taken from the Breakdown performance) is played over the background.
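The two motion-to-sound mappings can be sketched as simple linear scalings. The ranges below (grains per second, cutoff frequencies, normalization constants) are illustrative assumptions; only the correlations themselves come from the description above.

```python
# Hypothetical control mappings for the granular synthesizer:
# hand quantity of motion -> grain density,
# performer bounding-box area -> low-pass filter center frequency.
def lerp(lo, hi, x):
    """Linearly interpolate between lo and hi, clamping x to [0, 1]."""
    x = max(0.0, min(1.0, x))
    return lo + (hi - lo) * x

def grain_density(hand_motion, max_motion=1.0):
    # Grains per second, scaled from normalized hand motion.
    return lerp(2.0, 50.0, hand_motion / max_motion)

def lowpass_cutoff(bbox_area, max_area=4.0):
    # Center frequency (Hz) of the low-pass filter.
    return lerp(200.0, 4000.0, bbox_area / max_area)

print(round(grain_density(0.5)))   # 26
print(round(lowpass_cutoff(2.0)))  # 2100
```

In practice these values would be sent each frame to the synthesis engine as control-rate parameters.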
“The Interpreter” was exhibited at INTER-FACE: International Conference on Live Interfaces 2014 (Lisbon). Read the extended abstract here.
By: Rodrigo Carvalho (Visuals, Interactive System) | Yago de Quay (Music, Interactive System) | Shen Jun (Dance)