
Inspired by and built on the code from Daniel Shiffman’s Coding Rainbow tutorial on superformulas, and the work of Reza Ali.

I made an app in Processing to generate and control supershapes in real time, which I used to perform live visuals at Boom Festival 2016 (during Krumelur’s set at the Alchemy Circle Stage).

Find the code here on my GitHub.

It is still unfinished, with a big to-do list, but it is stable and fun. It sends its output through Syphon so you can mix it with other visual content, FX, and software. Audio amplitude drives the color cycles. I sent the Syphon signal to VDMX and mixed it with Quartz Composer compositions.
It has (in the SaveShape tab) an array to store shapes, which then loop autonomously while the shapeShift(int x) function is active.

Works on Processing 3. Requires the Syphon, PeasyCam, processingSound, and ControlP5 Processing libraries.
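For reference, the heart of any supershape is Gielis’s superformula, which gives a radius for every angle. The sketch below is not the app’s code (that lives on GitHub); it is just a minimal, self-contained Java version of the formula and of how one closed 2D outline can be sampled from it. All names are illustrative.

```java
// Minimal sketch of the Gielis superformula behind a supershape.
// radius() returns r(phi) for angle phi and parameters m, n1, n2, n3
// (the scale parameters a and b are fixed to 1 here for simplicity).
public class Superformula {
    static double radius(double phi, double m, double n1, double n2, double n3) {
        double t1 = Math.pow(Math.abs(Math.cos(m * phi / 4.0)), n2);
        double t2 = Math.pow(Math.abs(Math.sin(m * phi / 4.0)), n3);
        return Math.pow(t1 + t2, -1.0 / n1);
    }

    // Sample one closed outline as x/y points, as a sketch would each frame.
    static double[][] outline(int steps, double m, double n1, double n2, double n3) {
        double[][] pts = new double[steps][2];
        for (int i = 0; i < steps; i++) {
            double phi = 2 * Math.PI * i / steps;
            double r = radius(phi, m, n1, n2, n3);
            pts[i][0] = r * Math.cos(phi);
            pts[i][1] = r * Math.sin(phi);
        }
        return pts;
    }

    public static void main(String[] args) {
        // With m = 4 and n1 = n2 = n3 = 2 the formula degenerates to the
        // unit circle, so r(phi) is 1 for every angle.
        System.out.println(radius(0.7, 4, 2, 2, 2)); // ~1.0
    }
}
```

Animating the shape is then just a matter of interpolating m, n1, n2, n3 over time before each redraw, which is essentially what a shape-shifting loop does.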

Explore it, improve it, share it, and have fun.



// Some pictures from the Alchemy Circle Stage with the supershapes being projected.


// Some supershape stills.

// The supershape control GUI, with parameters to modify the shapes and the button to trigger the autonomous shape-shifter function.


Again this year I spent three months at UT Austin as a research visitor.
Below is a list of projects I was involved in while there.


Interactive visuals for a dance performance. Presented in March 2015 at the Oscar G. Brockett Theatre, UT Austin, Austin, TX. More details here.


Visuals for a dance performance for a Scion car reveal event. March 31, New York.
Collaboration with Joao Beira, Quixotic, Erox Biox and O2CreativeSolutions.



[3] OMNIMOTION @SXSW Interactive 2015, by Joao Data.
I helped João with some computer vision and programming to make the installation interactive, so that the faces follow visitors around the park.


[4] TALKS, sharing my work at:
“EYEON Life Performers Meeting” and the “Design BFA class” (UT Austin, Prof. Colin Frazer);
“Protos X SxSW2015” and “CMTalkSeries” (San Antonio).



Live reactive visuals for a dance performance. Collaboration with Eli Fieldsteel, Billie Rose Secular, and Ladonna Matchett.

Boris Chimp 504’s Audiovisual Explorations



Recent visual explorations for a future Audiovisual Album.
All graphics were generated in real time with Quartz Composer.

Keep track of Boris Chimp 504’s space adventures online.

boris chimp 504 _ Visual explorations 2015



Between January and April 2014 I spent four months at UT Austin as a research visitor. One of my main goals was the opportunity to collaborate on multidisciplinary projects with local students in the Theatre and Dance department, which I could later use as case studies for my research.

Below is a list of projects I was involved in:


Interactive audiovisual dance performance presented at the Ears Eyes and Feet event in the B. Iden Payne Theater, May 2014, UT Austin, Texas. More info at the project page.

[Rodrigo Carvalho: Interactive Visuals, Yago de Quay: Dance, Voice, and Music Composition, Sunny Shen: Dance and Choreography, Po-Yang Sung: Lighting]

[Final moment of the performance, where virtual particles on the screen are magically transformed into physical ones]


“Warning: A Wearable Electronic Dress Prototype” is the result of a series of explorations in possible interactive/reactive technologies for a stage performance costume. More info at the project page.

[Concept and prototype development: Kristen Weller and Rodrigo Carvalho]


Exploration of sound visualizations at “Stallion”, a 328 Megapixel Tiled Display System at TACC (Texas Advanced Computing Center). More info at the project page.


// Apart from the university-related projects, I was introduced to some of the local audiovisual scene, and I had the opportunity to perform some live visual sessions.


Live visuals for 4 live rock bands at the Alamo Cinema, with PROTOS FESTIVAL. [Wreckmeister Harmonies, B L A C K I E, Indian Jewelry, and Spray Paint!!!!]

[Video from the event]

[Teaser with Indian Jewelry’s music]
[6] ST37
Live visuals for ST37, a legendary experimental/space/rock/psychedelic band from Austin.


Some extracts from my live visuals for the BEYOND THE GATE event.

Visuals for 4 live rock bands at the Alamo Cinema / Austin / Texas. 16.03.2014.

“Expanded cinema, experimental event, skull shattering sensory overload; call it what you want, but this is an evening not to be missed. We here at the Alamo have teamed up with Strange Victory Touring as well as our friends Protos Festival, Atomic Picnic, and Portugal’s visual artist Rodrigo Carvalho to entangle you in transformative aural/visual resonance. Drawing heavily on occult imagery, psychedelia, and the dark corners of the peripheral, we invite you to join us for one night only with our guests, Wreckmeister Harmonies, B L A C K I E, Indian Jewelry, and Spray Paint!!!!”


Between October and December I took part in an artistic residency at the 1º Avenida building in Porto, Portugal.
I had a studio there, and I used that time and space to continue my work and research on interactive visuals and on the relations between sound, visuals, and movement. Below is a list of some highlights and outcomes from those three months.


Sound and vision explorations aimed at creating audiovisual content for the Boris Chimp 504 performance. Audio-reactive visuals made in Quartz Composer. More pictures at the Flickr gallery.

(Some stills from the generated visuals)

(Video Teaser)


Video animations mapped onto hand drawings of Porto’s landscapes. Check the project page for details. Made with Jose Cardoso.


Exploration of an interactive system between movement and visuals; a prototype for a possible future project. The body is tracked with a Kinect 3D camera and the visuals are generated with Quartz Composer. Work in progress.


I organized a small informal conference focused on the relations between sound, visuals, and movement in real-time and interactive audiovisuals. I invited three artists: João Beira, Diogo Tudela, and Filipe Lopes.
Each of them (and I) gave a lecture related to their work.

(Some pictures from the lectures)


Interactive audiovisual installation: a visualisation of a group of particles that live inside an audio-reactive simulated physics system.
The system is composed of several gravitational fields that react to sound. The attract/repel force of each gravitational field is related to the sound frequencies (analysed in real time). These forces move the particles around the system, creating a visual relation between the sound and the particles’ motion. More info at the project page.
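The core idea can be sketched roughly like this (illustrative Java, not the installation’s actual code; the threshold/gain mapping is my assumption): each gravitational field takes its attract/repel strength from one analysed frequency band and exerts a distance-weighted force on every particle.

```java
// Illustrative sketch of one audio-reactive gravitational field:
// a band amplitude sets a signed strength, which then pulls (positive)
// or pushes (negative) any particle with an inverse-square falloff.
public class SoundField {
    double x, y;      // field position
    double strength;  // > 0 attracts, < 0 repels

    SoundField(double x, double y) { this.x = x; this.y = y; }

    // Map a band amplitude (0..1) to a signed strength around a threshold:
    // quiet bands repel slightly, loud bands attract strongly.
    void update(double bandAmplitude, double threshold, double gain) {
        strength = (bandAmplitude - threshold) * gain;
    }

    // Force exerted on a particle at (px, py).
    double[] forceOn(double px, double py) {
        double dx = x - px, dy = y - py;
        double d = Math.sqrt(dx * dx + dy * dy);
        if (d < 1e-6) return new double[] {0, 0};   // avoid division by zero
        double f = strength / (d * d);              // inverse-square falloff
        return new double[] {f * dx / d, f * dy / d};
    }

    public static void main(String[] args) {
        SoundField field = new SoundField(0, 0);
        field.update(1.0, 0.0, 2.0);                // loud band -> strength 2
        double[] force = field.forceOn(1, 0);
        System.out.println(force[0] + ", " + force[1]); // -2.0, 0.0 (pulled toward the field)
    }
}
```

With several such fields, each wired to a different band of the real-time analysis, the summed forces give each particle its sound-driven motion.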


A physical interface controls a series of LED lights placed over hand drawings of Porto’s landscapes on the wall of the exhibition room. Pieces of cardboard layered on top hide the LEDs and provoke a play of light and shadow over the drawings. Made with Jose Cardoso.


On January 4th I performed, together with Miguel Neto, our A/V live act BORIS CHIMP 504: an audiovisual real-time performance that emphasizes audio synthesis and graphical languages with a futuristic sci-fi aesthetic. It is a real-time interactive/reactive system between the audio and the image, between the man and the machine.

(Picture from the performance)

(Video Teaser)


“2V-P is a live visual performance tool designed by Ali M. Demirel and engineered by Pascal H. Lesport, as the outcome of their artistic collaboration.

2V-P aims to manifest the idea of a minimalist visual concept and an interactive performance technique which Demirel has developed through his live show experience.”



In the last couple of days I have been test-driving “2V-P”, a new tool for live performance, which is basically a two-channel Quartz Composer mixer.

Its only sources are Quartz Composer patches, which you can manipulate in real time: all the parameters of each patch are accessible from the 2V-P interface. The strongest idea behind 2V-P is its minimal, “keep it simple” logic, focusing its functionality on getting the best out of the live performance experience.

Despite the “keep it simple” mentality, the fact that it uses Quartz Composer patches and can send and receive OSC/MIDI means that you can do almost anything with it: work with images, sound, live feeds, text, video, generative graphics, and so on, and connect it to any software or interface that speaks OSC/MIDI. Useful for VJing, audio-reactive live sets, interactive dance performances, installations, and much more.


The interface detail above shows one of the channels: at the top, a preview window showing that channel’s content, and at the bottom, two sliders controlling the parameters of the visual patch.

Check out 2V-P and a basic tutorial here.
If you have never used Quartz Composer before and want a getting-started guide, check the compilation of fundamental links here.
While I was testing it with some audio-reactive QC patches and listening to some techno, I got enthusiastic, screen-grabbed some moments, and later edited them into a video. Watch it below:

[Track: Plastikman – Gak (sheet one, 1992)]


Last November (25th–29th) I took part in the Choreographic Coding Lab, from the MotionBank project: an exploratory laboratory focused on translating aspects of dance and choreography into digital forms.

Below are the two projects/prototypes I worked on during that week.


I started by digging into the motion data from the performance by Jonathan Burrows and Matteo Fargion (more info about the performance here). The performance was recorded on video, and the skeleton tracking data was also recorded in a database.
In the performance we watch a gestural dialogue between the two performers. Inspired by “William Forsythe: Improvisation Technologies”, I decided to use lines joining different body joints to extract graphical patterns from their gestures over time.

Using the data parser made by Florian Jenett in Processing as a starting point, I ran a series of experiments connecting different joints with lines (hand to hand, head to both hands, …) and watching their motion over time.
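The core of these experiments can be sketched as follows (plain Java rather than the actual Processing sketch; the joint indices and pairings are hypothetical): for every frame, each chosen joint pair becomes a line segment, and accumulating those segments over many frames is what reveals the geometric patterns.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the joint-connection idea: given one frame of tracked joint
// positions, emit a segment for every chosen joint pair. Drawing these
// without clearing the screen accumulates the pattern over time.
public class JointLines {
    // joints[i] = {x, y}; pairs[k] = {jointIndexA, jointIndexB}.
    // Each returned segment is {x1, y1, x2, y2}.
    static List<double[]> segments(double[][] joints, int[][] pairs) {
        List<double[]> out = new ArrayList<>();
        for (int[] p : pairs) {
            double[] a = joints[p[0]], b = joints[p[1]];
            out.add(new double[] {a[0], a[1], b[0], b[1]});
        }
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical joint indices: 0 = head, 1 = left hand, 2 = right hand.
        double[][] frame = {{0, 0}, {-1, 2}, {1, 2}};
        int[][] pairs = {{1, 2}, {0, 1}, {0, 2}};   // hand-hand, head-hands
        System.out.println(segments(frame, pairs).size()); // 3 segments
    }
}
```

Swapping the pair list (hand to hand, head to both hands, and so on) is all it takes to produce a different geometric sequence from the same recorded data.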

In the end I got some interesting visual patterns: a geometric graphical dialogue, a visual abstraction of the original performance.
Below are two posters showing different geometric sequences. Check out more here.


/ Below a video render with several different visualizations:

The Processing files used can be downloaded from my GitHub account here.
*NOTE: they are based on the initial Processing files by Florian Jenett and use a library (de.bezier.guido.*) that has not been updated for Processing 2.0, so run them in Processing 1.5.



Some participants of the lab built an interactive dance space that was continuously broadcasting motion data (live and recorded). I wanted to use that live data, so I started working on an interactive visualisation.

I wanted not just a visualisation of the movement, but an interactive system that receives info from the dancer and gives something back to interact and play with, and in this way influences their movements; something on the borderline between a visual tool and a gaming experience.

So I used the toxiclibs physics library to develop a gravitational particle system with a central force in the middle of the screen that attracts or repels the particles, plus three additional forces to interact with: one for the mouse (for debugging purposes) and two more, F1 and F2 (one for each hand).
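In rough terms the force layout looks like the sketch below (plain Java, not the actual toxiclibs-based sketch; in toxiclibs each force point would instead be an attraction behavior attached to the physics world): the net force on a particle is the sum of the central force and the hand forces, with the sign of each strength choosing attract or repel.

```java
// Illustrative sketch of the force layout: a central force plus hand
// forces F1/F2, each attracting (strength > 0) or repelling (strength < 0).
public class GravitySystem {
    // Net force on a particle at (px, py) from all active force points.
    // forces[i] = {x, y, strength}.
    static double[] netForce(double px, double py, double[][] forces) {
        double fx = 0, fy = 0;
        for (double[] f : forces) {
            double dx = f[0] - px, dy = f[1] - py;
            double d = Math.sqrt(dx * dx + dy * dy);
            if (d < 1e-6) continue;               // skip coincident points
            double mag = f[2] / (d * d);          // inverse-square falloff
            fx += mag * dx / d;
            fy += mag * dy / d;
        }
        return new double[] {fx, fy};
    }

    public static void main(String[] args) {
        // Central attractor at the screen centre, plus two repelling hand
        // forces whose positions would arrive via OSC in the real sketch.
        double[][] forces = {
            {0, 0, 4},     // central force, attracting
            {2, 0, -1},    // F1 (one hand), repelling
            {-2, 0, -1},   // F2 (other hand), repelling
        };
        double[] f = netForce(1, 0, forces);
        System.out.println(f[0] + ", " + f[1]);
    }
}
```

Updating the F1/F2 positions from the tracked hand coordinates every frame is what turns this into an instrument the dancer can actually play.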

/ Below some still pictures from the gravitational system. Check more pictures in my Flickr gallery here.

During the development of this system I used the motion data that was being broadcast.
Unfortunately I didn’t have time to test it properly with a performer directly interacting with the visualisation, but I intend to do so in the near future.

/ Below some pictures from the lab with the application receiving live motion data:

/ Below a screen capture from the system receiving live motion data:


You can download the Processing files here. The app has a GUI where you can control the strength of each force (attract/repel). I created two forces to interact with, imagining that they would be controlled by the performer’s hands (position data comes in via OSC messages).


For more information and updates about the Choreographic Coding Lab, keep track of the NODE FB page and the MotionBank site, FB, and Twitter, where more documentation and project reports from other participants will be posted in the future.


Selected highlights from ACM Multimedia 2013 / Barcelona



[1] “MultiSensory Mixed Reality with smell and taste”, Adrian Cheok.

(this video is not from ACMM13, but it is a similar talk)


[2] “Tracking-based Interaction for Object Creation in Mobile Augmented Reality”, Wolfgang Hürst, Joris Dekker.

Paper at ACMM13 proceedings, teaser video


[3] “Facilitating Fashion Camouflage Art”, Ranran Feng, Balakrishnan Prabhakaran

Paper at ACMM13 proceedings

[4] “Gesture-based Control of Physical Modeling Sound Synthesis: a Mapping-by-Demonstration Approach”, Jules Françoise, Norbert Schnell, Frédéric Bevilacqua

Paper at ACMM13 proceedings


[5] “Context-Aware Gesture Recognition in Classical Music Conducting”, Álvaro Sarasúa

Paper at ACMM13 proceedings


Boris arquives #003



At Boris Chimp 504 we have been working on an A/V album to be released soon. We have been spending some time building a narrative for the album and exploring new sounds and visions. Below you can see some pictures from these recent explorations. Find more at the Flickr gallery and FB page.
