VISIOPHONE

SuperShape3D

CODE, REPORT


Inspired by and built on the code from Daniel Shiffman's Coding Rainbow tutorial on superformulas, and the work of Reza Ali.

I made an app in Processing to generate and control supershapes in real time, which I used to perform live visuals at Boom Festival 2016 (during Krumelur's set at the Alchemy Circle stage).
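The shapes are built from the 2D superformula evaluated twice, once per spherical angle, following the tutorial. A minimal sketch of that core calculation (the parameter values passed to supershapePoint() here are arbitrary examples; in the app they come from the GUI):

```java
// Superformula radius r(theta) for shape parameters m, n1, n2, n3
// (with a = b = 1).
float supershape(float theta, float m, float n1, float n2, float n3) {
  float t1 = pow(abs(cos(m * theta / 4.0)), n2);
  float t2 = pow(abs(sin(m * theta / 4.0)), n3);
  return pow(t1 + t2, -1.0 / n1);
}

// A 3D supershape point: two superformula evaluations, one per angle,
// mapped onto spherical coordinates. lat runs from -PI/2 to PI/2,
// lon from -PI to PI, r is the overall radius.
PVector supershapePoint(float lat, float lon, float r) {
  float r1 = supershape(lon, 7, 0.2, 1.7, 1.7); // longitude shape
  float r2 = supershape(lat, 7, 0.2, 1.7, 1.7); // latitude shape
  float x = r * r1 * cos(lon) * r2 * cos(lat);
  float y = r * r1 * sin(lon) * r2 * cos(lat);
  float z = r * r2 * sin(lat);
  return new PVector(x, y, z);
}
```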

Find the code here on my GitHub.

It is still an unfinished job, with a big to-do list. But it is stable and fun. It sends its output through Syphon so you can mix it with other visual content, FX and software. Audio amplitude drives the color cycles. I sent the Syphon signal to VDMX and mixed it with Quartz Composer compositions.
On the SaveShape tab there is an array to store shapes, which will then loop autonomously while the shapeShift(int x) function is active.
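As a rough illustration of the Syphon and audio side (not the app's actual code, just the two library calls wired together; the server name "SuperShape3D" is arbitrary):

```java
import codeanticode.syphon.*;
import processing.sound.*;

SyphonServer server;
AudioIn in;
Amplitude amp;

void setup() {
  size(800, 600, P3D);          // Syphon needs the P2D/P3D renderer
  colorMode(HSB, 360, 100, 100);
  server = new SyphonServer(this, "SuperShape3D");
  in = new AudioIn(this, 0);
  in.start();
  amp = new Amplitude(this);
  amp.input(in);
}

void draw() {
  float level = amp.analyze();  // input amplitude, 0..1
  background(0);
  translate(width / 2, height / 2);
  noStroke();
  // Audio amplitude pushes the hue cycle forward.
  fill((frameCount + level * 360) % 360, 80, 90);
  sphere(100);                  // stand-in for the supershape mesh
  server.sendScreen();          // publish the frame over Syphon
}
```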

Works on Processing 3. It needs the Syphon, PeasyCam, processing.sound and controlP5 libraries.

Explore it, Improve it, Share it and have Fun.

// Some pictures from the Alchemy Stage with the SuperShapes being projected.

// Some supershape stills.

// SuperShapes control GUI, with parameters to modify the shapes and the button to trigger the autonomous shape-shifter function.


REPORT FROM THE CHOREOGRAPHIC CODING LAB, MOTIONBANK. FRANKFURT 2013

CODE, EXPERIMENTAL, REPORT


Last November (25th-29th) I took part in the Choreographic Coding Lab, part of the MotionBank project: an exploratory laboratory focused on translating aspects of dance and choreography into digital forms.

Below are the two projects/prototypes I was working on during that week.

1// LINES

I started by digging into the motion data from the performance by Jonathan Burrows and Matteo Fargion (more info about the performance here). The performance was recorded on video, and the skeleton-tracking data was also recorded in a database.
In the performance we watch a gestural dialogue between the two performers. Inspired by “William Forsythe: Improvisation Technologies”, I decided to join different body joints with lines, to extract graphical patterns from their gestures over time.

Using as a starting point the data parser made by Florian Jenett in Processing, I ran a series of experiments connecting different joints with lines (hand to hand, head to both hands, …) and watching their motion over time.
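The drawing step itself is simple. A hypothetical sketch of the idea (in the real sketches the joint positions come from the parsed skeleton database; here they are faked with noise() so the sketch runs standalone):

```java
PVector head = new PVector();
PVector leftHand = new PVector();
PVector rightHand = new PVector();

void setup() {
  size(600, 600);
  background(255);
}

void draw() {
  float t = frameCount * 0.01;
  // Stand-in motion data; replace with real joint positions per frame.
  head.set(width * noise(t), height * noise(t + 10));
  leftHand.set(width * noise(t + 20), height * noise(t + 30));
  rightHand.set(width * noise(t + 40), height * noise(t + 50));

  // No background() call here, so the lines accumulate into a
  // graphical trace of the gesture dialogue over time.
  stroke(0, 30);
  line(leftHand.x, leftHand.y, rightHand.x, rightHand.y); // hand to hand
  line(head.x, head.y, leftHand.x, leftHand.y);           // head to left hand
  line(head.x, head.y, rightHand.x, rightHand.y);         // head to right hand
}
```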

In the end I got some interesting visual patterns: a geometric graphical dialogue, a visual abstraction of the original performance.
Below are two posters showing different geometric sequences. Check out more here.

 

/ Below a video render with several different visualizations:

The Processing files used can be downloaded from my GitHub account here.
*NOTE: They are based on the initial Processing files by Florian Jenett. They use a library (de.bezier.guido.*) that has not been updated to Processing 2.0, so run them in Processing 1.5.

 

2// GRAVITATIONAL FIELD

Some participants of the lab built an interactive dance space that was continuously broadcasting motion data (live and recorded). I wanted to use that live data, so I started working on an interactive visualisation.

I wanted to build not just a visualisation of the movement, but an interactive system that receives info from the dancers and gives them something back to interact and play with, and in this way influences their movements. Something on the borderline between a visual tool and a gaming experience.

So I used the toxiclibs physics library to develop a gravitational particle system with a central force in the middle of the screen that attracts or repels the particles, plus three additional forces to interact with: one for the mouse (for debugging purposes) and two more, F1 and F2 (one for each hand).
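A minimal sketch of that setup, assuming toxiclibs' VerletPhysics2D and AttractionBehavior classes (only the central force and the mouse force are shown; the real app adds the two hand forces and a GUI):

```java
import toxi.geom.*;
import toxi.physics2d.*;
import toxi.physics2d.behaviors.*;

VerletPhysics2D physics;
Vec2D mousePos; // attractor position, mutated every frame

void setup() {
  size(800, 600);
  physics = new VerletPhysics2D();

  // Central force: positive strength attracts, negative repels.
  Vec2D center = new Vec2D(width / 2, height / 2);
  physics.addBehavior(new AttractionBehavior(center, width, 0.3));

  // One interactive force, driven here by the mouse. The lab version had
  // two more of these (F1, F2), one per hand, fed by the motion data.
  mousePos = new Vec2D(0, 0);
  physics.addBehavior(new AttractionBehavior(mousePos, 200, -1.2));

  // Seed the particle system.
  for (int i = 0; i < 400; i++) {
    physics.addParticle(new VerletParticle2D(random(width), random(height)));
  }
}

void draw() {
  background(0);
  mousePos.set(mouseX, mouseY); // the behavior holds a reference to this
  physics.update();
  stroke(255);
  for (VerletParticle2D p : physics.particles) {
    point(p.x, p.y);
  }
}
```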


/ Below some still pictures from the gravitational system. Check more pictures in my Flickr gallery here.

During the development of this system I used the motion data that was being broadcast.
Unfortunately I didn't have time to test it properly with a performer directly interacting with the visualisation, but I intend to do it in the near future.

/ Below some pictures from the lab with the application receiving live motion data

/ Below a screen capture from the system receiving live motion data:

 

You can download the Processing files here. There is a GUI where you can control the strength of each force (attract/repel). I created two forces to interact with, imagining they would be controlled by the performer's hands (position data comes in via OSC messages).
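Receiving those hand positions looks roughly like this with the oscP5 library (the port number and the /hand/1 and /hand/2 address patterns are made up for the example; the lab's actual message format may have differed):

```java
import oscP5.*;
import netP5.*;

OscP5 osc;
PVector f1 = new PVector(); // attractor position for force F1 (one hand)
PVector f2 = new PVector(); // attractor position for force F2 (other hand)

void setup() {
  size(800, 600);
  osc = new OscP5(this, 12000); // listen for OSC on port 12000
}

// oscP5 calls this for every incoming OSC message.
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/hand/1")) {
    f1.set(msg.get(0).floatValue(), msg.get(1).floatValue());
  } else if (msg.checkAddrPattern("/hand/2")) {
    f2.set(msg.get(0).floatValue(), msg.get(1).floatValue());
  }
}

void draw() {
  background(0);
  // In the real sketch these positions drive the F1/F2 attraction forces.
  ellipse(f1.x, f1.y, 20, 20);
  ellipse(f2.x, f2.y, 20, 20);
}
```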

 

For more information and updates about the Choreographic Coding Lab, keep track of the NODE Facebook page and the MotionBank site, Facebook and Twitter, where more documentation and project reports from other participants will be posted in the future.


Boris Archives #002

CODE, EXPERIMENTAL


“Kepler Terrain” is a Quartz Composer patch made for BorisChimp504 A/V live performances.
It illustrates a vision from Boris's spaceship flying over the surface of Kepler-22b.
It was made using the Rutt/Etra plug-in by Vade.

“Rutt/Etra” was an analogue video synthesiser created by Steve Rutt and Bill Etra in the 70s for real-time manipulation of the video signal.
This synthesiser allowed the user to play with the video sync-line oscillators, control offset and “z-displace”.
It was used by several video artists for visual creation. One of the most kick-ass works is “Scan Processor Studies” by Woody Vasulka and Brian O'Reilly.

For the “Kepler Terrain” patch I used as a source a Perlin noise texture that is constantly moving to simulate the mountains. But you can use any image source you like (check the image input).

Then I tweaked the rotation and point of view (check the 3dTransform block) and fed audio information into the Z-Extrude parameter, to make the mountains go up and down in reaction to the sound.
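The patch itself is node-based, but the same idea is easy to sketch in Processing: scan a scrolling Perlin noise field line by line and displace each scanline in z by its brightness, scaled by an audio level. This is only an approximation of the Rutt/Etra-style z-displacement, with a fake audio signal standing in for the real input:

```java
float zOff = 0; // scroll offset for the noise field

void setup() {
  size(800, 600, P3D);
  stroke(255);
  noFill();
}

void draw() {
  background(0);
  translate(width / 2, height / 2);
  rotateX(PI / 3); // tilt the terrain, like the 3dTransform block
  translate(-width / 2, -height / 2);

  float audioLevel = 0.5 + 0.5 * sin(frameCount * 0.05); // fake audio input

  // Draw the source as horizontal scanlines, displaced in z by brightness.
  for (int y = 0; y < height; y += 10) {
    beginShape();
    for (int x = 0; x < width; x += 10) {
      float bright = noise(x * 0.01, y * 0.01 + zOff); // noise "pixel"
      vertex(x, y, bright * 150 * audioLevel);         // z-extrude
    }
    endShape();
  }
  zOff += 0.02; // scroll the noise to simulate flying over the terrain
}
```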

You can download the “Kepler Terrain” patch here: https://github.com/visiophone/quartz.
Inside you will find notes with all the information. You will need the Rutt/Etra plug-in.
You can see it working above, in Boris's “Mission to Kepler” teaser:


Boris Archives #001

CODE, EXPERIMENTAL


Space Creature is a Quartz Composer patch made for BorisChimp504 live A/V. It's built from several replications of GL Spline. You can download it here: github.com/visiophone/quartz.

The patch is ready to use in VDMX (that's how I use it in live performances), so I map all the parameters there (some reacting to audio analysis and others controlled by hand with a MIDI keyboard). If you prefer, you can do that inside Quartz Composer itself (with the “Audio Input” patch). You will need the Kineme GL plugins.

Space Creatures can be seen in action in this video from last May at the “Imaginarius” festival.


LEAP Motion 003

CODE, EXPERIMENTAL


Exploration of three gestural audiovisual instruments for Leap Motion.

Made in Processing, using the LeapMotionP5 library (github.com/mrzl/LeapMotionP5) by onformative and the Minim sound library.

github.com/visiophone/Leap-AudioVisualInstrument
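For context, fingertip tracking with that library looks roughly like this. The getFingerList() and getTip() calls are from the library's basic example as I remember it; check the repo's examples for the exact API:

```java
import com.onformative.leap.LeapMotionP5;
import com.leapmotion.leap.Finger;

LeapMotionP5 leap;

void setup() {
  size(800, 600);
  leap = new LeapMotionP5(this);
}

void draw() {
  background(0);
  fill(255);
  noStroke();
  // Draw a dot on every tracked fingertip; in the instruments these
  // positions drive both the visuals and the Minim-based sound.
  for (Finger finger : leap.getFingerList()) {
    PVector tip = leap.getTip(finger);
    ellipse(tip.x, tip.y, 12, 12);
  }
}
```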
