VISIOPHONE

Boris Chimp 504’s Audiovisual Explorations

EXPERIMENTAL, REPORT


Recent visual explorations for a future Audiovisual Album.
All graphics were generated in real time with Quartz Composer.

Keep track of Boris Chimp 504’s space adventures at:
www.borischimp504.com | www.facebook.com/borischimp504 | www.twitter.com/borischimp504

boris chimp 504 _ Visual explorations 2015


REPORT FROM THE CHOREOGRAPHIC CODING LAB, MOTIONBANK, FRANKFURT 2013

CODE, EXPERIMENTAL, REPORT


Last November (25th-29th) I took part in the Choreographic Coding Lab of the MotionBank project: an exploratory laboratory focused on translating aspects of dance and choreography into digital forms.

Below are the two prototypes I worked on during that week.

1// LINES

I started by digging into the motion data from the performance of Jonathan Burrows and Matteo Fargion (more info about the performance here). The performance was recorded on video, and the skeleton-tracking data was also recorded in a database.
The performance is a gestural dialogue between the two performers. Inspired by “William Forsythe: Improvisation Technologies”, I decided to use lines joining different body joints to extract graphical patterns from their gestures over time.

Using the data parser made by Florian Jenett in Processing as a starting point, I ran a series of experiments connecting different joints with lines (hand to hand, head to both hands, …) and watching their motion over time.

In the end I got some interesting visual patterns: a geometric graphical dialogue, a visual abstraction of the original performance.
Below are two posters showing different geometric sequences. Check out more here.
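
To make the line idea concrete, here is a minimal Processing sketch of the approach (my own reduction, not the lab code): the joint positions are faked with noise so it runs standalone, where the real experiments read them from the recorded skeleton data via Florian Jenett's parser.

// Hypothetical joint data: one PVector per joint, updated every frame.
// In the real experiments these come from the recorded skeleton database.
PVector head, leftHand, rightHand;

void setup() {
  size(800, 600);
  stroke(0, 30);
  background(255);
}

void draw() {
  // fake some motion with noise so the sketch runs standalone
  float t = frameCount * 0.01;
  head      = new PVector(300 + 200 * noise(t, 0), 100 + 80 * noise(t, 1));
  leftHand  = new PVector(150 + 250 * noise(t, 2), 250 + 200 * noise(t, 3));
  rightHand = new PVector(400 + 250 * noise(t, 4), 250 + 200 * noise(t, 5));

  // connect the joints with lines and never clear the background,
  // so the gesture accumulates into a geometric trace over time
  line(leftHand.x, leftHand.y, rightHand.x, rightHand.y);
  line(head.x, head.y, leftHand.x, leftHand.y);
  line(head.x, head.y, rightHand.x, rightHand.y);
}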

 

/ Below a video render with several different visualizations:

The Processing files can be downloaded from my GitHub account here.
*NOTE: They are based on the initial Processing files by Florian Jenett. They use a library (de.bezier.guido.*) that has not been updated to Processing 2.0, so run them in Processing 1.5.

 

2// GRAVITATIONAL FIELD

Some participants of the lab built an interactive dance space that continuously broadcast motion data (live and recorded). I wanted to use that live data, so I started working on an interactive visualisation.

I wanted to build not just a visualisation of the movement, but an interactive system that receives data from the dancer and gives something back to interact and play with, influencing his movements in turn. Something on the borderline between a visual tool and a gaming experience.

So I used the toxiclibs physics library to develop a gravitational particle system with a central force in the middle of the screen that attracts or repels the particles, plus three additional forces to interact with: one for the mouse (for debug purposes) and two more, F1 and F2 (one for each hand).
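
To give a rough idea of the setup (a sketch of mine, not the exact lab code), the system boils down to something like this in Processing with toxiclibs; here the two hand forces are reduced to a single mouse-driven one:

import toxi.geom.*;
import toxi.physics2d.*;
import toxi.physics2d.behaviors.*;

VerletPhysics2D physics;
Vec2D mousePos;   // stands in for one of the hand forces (F1/F2)

void setup() {
  size(800, 600);
  physics = new VerletPhysics2D();
  // central force: positive strength attracts, negative repels
  physics.addBehavior(new AttractionBehavior(new Vec2D(width / 2, height / 2), width, 0.3f));
  // extra force driven by the mouse (the debug stand-in for a hand)
  mousePos = new Vec2D(width / 2, height / 2);
  physics.addBehavior(new AttractionBehavior(mousePos, 250, -1.5f));
  for (int i = 0; i < 400; i++) {
    physics.addParticle(new VerletParticle2D(random(width), random(height)));
  }
}

void draw() {
  background(0);
  mousePos.set(mouseX, mouseY);   // the behavior holds this reference, so the force follows the mouse
  physics.update();
  stroke(255);
  for (VerletParticle2D p : physics.particles) {
    point(p.x, p.y);
  }
}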


/ Below some still pictures from the gravitational system. Check out more pictures in my Flickr gallery here.

During the development of this system I used the motion data that was being broadcast.
Unfortunately I didn’t have time to test it properly with a performer directly interacting with the visualisation, but I intend to do so in the near future.

/ Below some pictures from the lab with the application receiving live motion data

/ Below a screen capture from the system receiving live motion data:

 

You can download the Processing files here. They include a GUI where you can control the strength of each force (attract/repel). I created two forces to interact with, imagining they would be controlled by the performer’s hands (position data comes in via OSC messages).
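
The OSC side can be sketched with the oscP5 library along these lines; the address patterns and port below are placeholders of mine, since the actual messages depend on the lab's broadcast format:

import oscP5.*;
import netP5.*;

OscP5 osc;
float f1x, f1y, f2x, f2y;   // positions of the two hand-driven forces

void setup() {
  size(800, 600);
  // listen on port 12000 (hypothetical; use whatever port the broadcast uses)
  osc = new OscP5(this, 12000);
}

void oscEvent(OscMessage msg) {
  // hypothetical address patterns for the two hands
  if (msg.checkAddrPattern("/hand/left")) {
    f1x = msg.get(0).floatValue();
    f1y = msg.get(1).floatValue();
  } else if (msg.checkAddrPattern("/hand/right")) {
    f2x = msg.get(0).floatValue();
    f2y = msg.get(1).floatValue();
  }
}

void draw() {
  background(0);
  // in the full system, F1 and F2 would move the particle-system attractors here
  fill(255, 0, 0);
  ellipse(f1x, f1y, 20, 20);
  fill(0, 0, 255);
  ellipse(f2x, f2y, 20, 20);
}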

 

For more information and updates about the Choreographic Coding Lab, keep track of the NODE Facebook page and the MotionBank site, Facebook and Twitter, where more documentation and project reports from other participants will be posted in the future.


Boris Archives #002

CODE, EXPERIMENTAL


“Kepler Terrain” is a Quartz Composer patch made for BorisChimp504 A/V live performances.
It illustrates the view from Boris’s spaceship flying over Kepler 22b’s surface.
It was made using the Rutt/Etra plug-in by Vade.

“Rutt/Etra” was an analogue video synthesiser created by Steve Rutt and Bill Etra in the ’70s for real-time manipulation of the video signal.
This synthesiser allowed the user to play with the video sync-line oscillators and to control offset and “z-displace”.
It was used by several video artists for visual creation. One of the most kick-ass examples is “Scan Processor Studies” by Woody Vasulka and Brian O’Reilly.

For the “Kepler Terrain” patch I used as source a Perlin-noise texture that is constantly moving, in a way that simulates mountains. But you can use any image source you like (check the image input).

Then I changed the rotation and point of view a bit (check the 3D Transformation block) and fed audio information into the Z-Extrude parameter, to make the mountains move up and down in reaction to the sound.
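
The patch itself is node-based, but the core idea translates to a few lines of Processing (my approximation, not a port of the patch): horizontal scan lines displaced by a moving Perlin-noise field, with the displacement amount driven by the audio input level via Minim.

import ddf.minim.*;

Minim minim;
AudioInput in;
float t = 0;

void setup() {
  size(800, 600);
  minim = new Minim(this);
  in = minim.getLineIn();   // audio source; the patch uses QC's audio input instead
  stroke(0, 255, 180);
  noFill();
}

void draw() {
  background(0);
  // the input level plays the role of the Z-Extrude parameter
  float level = in.mix.level() * 400;
  // scan the frame as horizontal lines, Rutt/Etra style,
  // displacing each vertex by a moving Perlin-noise "terrain"
  for (int y = 0; y < height; y += 8) {
    beginShape();
    for (int x = 0; x <= width; x += 10) {
      float n = noise(x * 0.01, y * 0.01, t);
      vertex(x, y - n * level);
    }
    endShape();
  }
  t += 0.02;   // scroll the noise over time so the mountains drift
}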

You can download the “Kepler Terrain” patch here: https://github.com/visiophone/quartz.
Inside you will find notes with all the information. You will need the Rutt/Etra plug-in.
You can see it working in Boris’s “Mission to Kepler” teaser above.


Boris Archives #001

CODE, EXPERIMENTAL


Space Creature is a Quartz Composer patch made for BorisChimp504 live A/V. It’s built from several replications of the GL Spline patch. You can download it here: github.com/visiophone/quartz.

The patch is ready to use in VDMX (that’s how I use it in live performances), so I map all the parameters there (some reacting to audio analysis, others controlled by hand with a MIDI keyboard). If you prefer, you can do that inside Quartz itself (with the “Audio Input” patch). You will need the Kineme GL plugins.

Space Creature can be seen in action in this video from last May at the “Imaginarius” festival.


Controlling Video with Face Movements – FaceOSC + VDMX

EXPERIMENTAL


Testing FaceOSC to control and manipulate video loops in VDMX.

I have some ideas for an audiovisual performance where an actor/performer/singer would manipulate sounds and visuals with his face while singing, reciting a text or just moving his head.

So I made a quick test-drive of Kyle’s FaceOSC to check how it could work.

Face UP/DOWN controls the video speedRate
Jaw controls HUE
Face FRONT-BACK controls the MSA-BadTV effect
Hiding/showing the face triggers a new video loop
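
For reference, the same FaceOSC data can be read directly in Processing with oscP5. A minimal sketch, assuming FaceOSC’s default port 8338 and its standard /found, /gesture/jaw and /pose/position messages, reproducing the jaw-to-HUE mapping from the list above:

import oscP5.*;

OscP5 osc;
int found = 0;
float jaw = 0;
float posX, posY;

void setup() {
  size(640, 480);
  colorMode(HSB, 360, 100, 100);
  osc = new OscP5(this, 8338);   // FaceOSC's default output port
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/found")) {
    found = msg.get(0).intValue();     // 1 while a face is tracked
  } else if (msg.checkAddrPattern("/gesture/jaw")) {
    jaw = msg.get(0).floatValue();     // jaw openness
  } else if (msg.checkAddrPattern("/pose/position")) {
    posX = msg.get(0).floatValue();
    posY = msg.get(1).floatValue();
  }
}

void draw() {
  if (found == 1) {
    // jaw drives hue; the 20-30 range is a rough guess, tune to taste
    float hue = constrain(map(jaw, 20, 30, 0, 360), 0, 360);
    background(hue, 80, 80);
    fill(0, 0, 100);
    ellipse(posX, posY, 40, 40);   // head position
  } else {
    background(0);   // losing the face would trigger the next loop in VDMX
  }
}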

FaceOSC by Kyle McDonald https://vimeo.com/26098366
MSA-BadTv effect by Memo http://vdmx.memo.tv/qcfx/memo_bad_tv
get VDMX here: http://vidvox.net/


LeapMotion 003

CODE, EXPERIMENTAL


Exploration of three gestural audiovisual instruments for Leap Motion.

Made in Processing, using the LeapMotionP5 library for Processing (github.com/mrzl/LeapMotionP5) by onformative and the Minim sound library.
Source: github.com/visiophone/Leap-AudioVisualInstrument


LeapMotion 002

EXPERIMENTAL


Second experiment with LeapMotion device. [leapmotion.com]

A quick draft of an audiovisual instrument.
The position of the fingers controls the visuals and triggers MIDI notes.

Made in Processing, using the LeapMotionP5 library for Processing (http://www.github.com/mrzl/LeapMotionP5) by onformative and the MidiBus library (http://www.smallbutdigital.com/themidibus.php) by Sparky.

*Processing source can be found here: http://www.visiophone-lab.com/data/_exp/leap007_sound.zip
(don’t forget to add the libraries)
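
For anyone who wants to rebuild it rather than read the source, here is a stripped-down sketch of the same idea, using the two libraries above with a hypothetical height-to-pitch mapping (a sketch of mine, not the actual instrument):

import com.onformative.leap.LeapMotionP5;
import com.leapmotion.leap.Finger;
import themidibus.*;

LeapMotionP5 leap;
MidiBus midi;
int lastPitch = -1;

void setup() {
  size(800, 600);
  leap = new LeapMotionP5(this);
  midi = new MidiBus(this, -1, 0);   // no MIDI input, first available output
}

void draw() {
  background(0);
  fill(255);
  // draw every tracked finger tip
  for (Finger finger : leap.getFingerList()) {
    PVector tip = leap.getTip(finger);
    ellipse(tip.x, tip.y, 20, 20);
  }
  // map the first finger's height to a pitch and retrigger on change (hypothetical mapping)
  if (!leap.getFingerList().isEmpty()) {
    PVector tip = leap.getTip(leap.getFingerList().get(0));
    int pitch = (int) map(tip.y, 0, height, 84, 36);
    if (pitch != lastPitch) {
      if (lastPitch >= 0) midi.sendNoteOff(0, lastPitch, 0);
      midi.sendNoteOn(0, pitch, 100);
      lastPitch = pitch;
    }
  }
}

public void stop() {
  leap.stop();   // LeapMotionP5 asks to be stopped on exit
}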


LeapMotion 001

EXPERIMENTAL


Yesterday I received my Leap Motion dev-board device.
Leap Motion tracks your hands’ and fingers’ movements and gestures so you can use them to interact with the computer.
More info here: https://www.leapmotion.com

The first thing I did was try it in Processing with the LeapMotionP5 library.
http://www.github.com/mrzl/LeapMotionP5
It worked fine, and it comes with some examples (finger tracking, gesture tracking).
Then I made a quick test, drawing some circles with my fingers.
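
A minimal sketch along those lines, built on the library’s finger-tracking example:

import com.onformative.leap.LeapMotionP5;
import com.leapmotion.leap.Finger;

LeapMotionP5 leap;

void setup() {
  size(800, 600);
  leap = new LeapMotionP5(this);
  background(255);
  noStroke();
  fill(0, 40);
}

void draw() {
  // draw a translucent circle at each finger tip; not clearing
  // the background leaves the drawing on screen
  for (Finger finger : leap.getFingerList()) {
    PVector tip = leap.getTip(finger);
    ellipse(tip.x, tip.y, 15, 15);
  }
}

public void stop() {
  leap.stop();
}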
