A data visualisation that turns guitar licks and the guitarist's dynamics into light constellations, in real time.

Mesh Experience



interactive guitar visualisation

Our primary aim was to create a visualisation that makes a musical performance legible. We also wanted to show how live shows could benefit from visual effects that are interactive and generated by the musicians themselves.

All visualisations are the result of real-time sound processing and movement through space. No post-production.


enhancing the live performance

Mesh Experience is an audio visualisation that responds to guitar performances in real time. Matching the musical range of a guitar, the visualisation consists of 48 points, each representing a tone and identified by a position and a colour. Together the points form a circle that surrounds the performer, note for note.
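To illustrate how such a note-to-point mapping could work, here is a minimal Python sketch. The actual mapping lives in our vvvv patch; the even angular spacing and the hue sweep below are simplified assumptions, not the original formulas.

```python
import colorsys
import math

NUM_POINTS = 48  # one point per tone in the guitar's range

def point_for_tone(index, radius=1.0):
    """Place tone `index` (0..47) on a circle and give it a colour.

    Illustrative sketch: the 48 tones are spread evenly around the
    circle, and the hue follows the same sweep around the colour wheel.
    """
    angle = 2 * math.pi * index / NUM_POINTS
    x, y = radius * math.cos(angle), radius * math.sin(angle)
    hue = index / NUM_POINTS  # 0.0 .. 1.0 around the colour wheel
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (x, y), (r, g, b)
```

Each tone thus gets a fixed home position and colour, which the later animation steps displace and fade.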

When a note is played, the respective point reacts, growing and moving away from the musician according to the note’s velocity. Depending on the tone’s volume and intensity, the point either remains at its highest amplitude, leaves a mark and disconnects from the circle, or simply reverts to its origin. Note bends are expressed by a slight twist of the circle.
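The velocity-driven reaction can be sketched as a simple rule: MIDI note-on velocity (0–127) scales the displacement, and loud notes detach while quiet ones revert. The threshold value below is an assumption for illustration, not the tuning of the original patch.

```python
def react_to_note(velocity, detach_threshold=100):
    """Sketch of a point's reaction to a note-on event.

    Velocity (0-127) scales how far the point grows away from the
    musician; notes at or above `detach_threshold` leave a mark and
    disconnect from the circle, quieter ones ease back to their origin.
    """
    amplitude = velocity / 127.0  # normalised displacement
    if velocity >= detach_threshold:
        behaviour = "detach"      # leaves a mark, disconnects
    else:
        behaviour = "revert"      # returns to its home position
    return amplitude, behaviour
```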

As the guitarist moves, the visualisation follows their steps, connecting tonally similar nearby tones on the fly with lines of light. The result is a floor filled with stars that are constantly shifting, connecting, coming to life and fading away.
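A hedged sketch of the connection rule: draw a line between two active tones when they are both spatially close and a small interval apart. The distance and interval thresholds are illustrative assumptions, not values from the original patch.

```python
import math

def connections(active, max_dist=0.6, max_interval=4):
    """Connect pairs of active tones that are spatially close and
    tonally similar (a small interval in semitones).

    `active` is a list of (midi_note, (x, y)) pairs.
    """
    lines = []
    for i, (note_a, pos_a) in enumerate(active):
        for note_b, pos_b in active[i + 1:]:
            dist = math.hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1])
            if dist <= max_dist and abs(note_a - note_b) <= max_interval:
                lines.append((pos_a, pos_b))
    return lines
```

Run over the currently sounding notes each frame, this yields the shifting web of lines between stars.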

Since Mesh Experience was programmed entirely in vvvv (a graphical programming language), its parameters can be changed and adjusted in real time, during the live performance or beforehand during setup.

the process behind mesh experience

the creative journey

The project started as a self-initiated piece, supervised and supported by our university. We joined an audio-visual course in communication design, but set our project apart from the curriculum with quite a bit of freedom, because we intended to design an interactive piece.


As amateur musicians ourselves, we felt a strong need to combine several human senses to create a more holistic experience. We also wanted to make the events on stage more transparent and immediate for the audience by visualising the connections between the musicians and the music being played.


For the first part of the project, we focused solely on the visual representation of the riffs played. Then our school’s media lab was cool enough to buy us a MIDI controller, which allowed us to convert the frequencies of the guitar into MIDI signals.
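The hardware did this conversion for us, but the underlying relation is the standard equal-temperament formula (A4 = 440 Hz = MIDI note 69), sketched here:

```python
import math

def freq_to_midi(freq_hz):
    """Convert a detected pitch in Hz to the nearest MIDI note number,
    using the standard reference A4 = 440 Hz = note 69."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))
```

For example, the low E string of a guitar (about 82.41 Hz) maps to MIDI note 40.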


For several weeks we kept a playful, explorative attitude at the heart of our approach. We transformed rough sketches into prototypes that guided our next steps, allowing us to experience the core of the idea, test, evaluate and learn. Throughout, we used vvvv to turn MIDI signals into visual graphics.

Once we found the most expressive visual representation, we turned to the staging phase. Projecting on a wall never felt right, so we put the guitarist in the centre of the projection, creating a system that keeps up with the performer.

We gathered our equipment: two projectors, each with a custom rig, an infrared camera for tracking, a powerful computer, our MIDI controller and, of course, the guitar gear.


For more information, FastCo.Design ran a feature article on our project.


Wow! You made it this far!
I’m afraid this case study is just about over. But I’ve prepared a few more for you so there’s no need to panic (yet).


Helping a furniture producer design a user-friendly online platform to affirm its digital voice & presence and support the work processes of its teams.


Exploring the power of motion in space: a geometric audio-reactive diptych visualisation.