Real-time audiovisuals powered by a skateboard


Enboard is an audiovisual experiment, a meeting point of our admiration for art, music, technology and skateboarding: elements that have met in the past and that collide once again in a project that only seeks to explore, create and experiment.

The idea started as an academic project that grew bigger, being selected as a Music Hack Day winner during Sonar Festival 2015. There we joined forces with #MusicBricks, who gave us a helping hand, including lending us their R-IoT technology, the sensor on which Enboard is mainly based.

The experiment is a shot at generating a live audiovisual performance based on, and fed by, the act of skateboarding. Every sound is registered using a contact microphone attached to the board's underside, and the graphics are controlled by the sensor's data, which detects jumps, spins and other movements. Video is captured in real time, using a cellphone to stream images.

We believe skateboarding, like most human activities, has unique expressive capabilities, mediated by the body, sound and social dynamics. Three key elements are the basis of this project: sound, image and what happens between the skateboarder and the cameraman.


A tribute to skateboarding


We always wanted to use skateboarding's own sounds. Every place inhabited by skateboarders, a plaza, a street, a park, has its own particular soundscape. Riding over pavement, the rhythm of every crack, rusty wheels and bearings, loose trucks, every bit counts. Jumping, falling, grinding, power sliding, every little noise adds up, inspiring us to include it and use it as the basis for a soundscape that speaks of skateboarding.


For the graphics, we use real-time video. Video is a big part of skateboarding: anyone capable of riding and filming is as respected as those being filmed. This permanent presence of the camera in skateboarding culture is what we wanted to highlight. Using a live feed from a cellphone camera, the system receives the stream and, with the data we receive from the board, distorts the image; it's a bit like VJing in real time, but controlled by a skateboarder and their tool. The use of a cellphone gave us freedom of mobility: no cables meant a more natural dynamic, similar to the one present in the streets, giving total freedom to film any movement from any point of view. Close, far, moving or standing still, the videographer is transformed into an active ingredient of our performance.


R-IoT sensor, using Max/MSP

The R-IoT sensor transmits data over Wi-Fi with very little delay. It comes with a Max patch containing a series of objects that pre-process the information, generating an easy-to-use, good-quality feed of gyroscope, speed and other movement behaviors. Among the objects we use are shake, kick and energy. We also use raw gyro data, plus objects of our own that translate sensor data into skateboarding behaviors mapped to our audiovisual assembly.
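The bundled Max patch does all of this decoding for us, but to illustrate the kind of per-frame data the sensor streams, here is a minimal Python sketch that unpacks one raw packet into named axis readings. The nine-float little-endian packet layout is purely our assumption for illustration; the actual R-IoT device has its own message format, which the Max patch handles.

```python
# Hedged sketch: the packet layout below is hypothetical, not the actual
# R-IoT wire format. It only illustrates turning a raw Wi-Fi/UDP payload
# into named sensor readings.
import struct

PACKET_FMT = "<9f"  # accel x/y/z, gyro x/y/z, mag x/y/z (assumed layout)


def decode_packet(data: bytes) -> dict:
    """Unpack one raw sensor packet into named axis readings."""
    vals = struct.unpack(PACKET_FMT, data)
    return {"accel": vals[0:3], "gyro": vals[3:6], "mag": vals[6:9]}
```

In practice something like this would sit inside a small UDP receive loop (e.g. `socket.recvfrom`), feeding each incoming packet to `decode_packet`.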


The kick object detects sudden changes in the skateboard's energy, for example when the sensor is moving and suddenly stops. It can be calibrated to be sensitive to certain changes. In our case, we used it to detect when the skateboarder jumps, which is fairly easy: when this happens, a particular kind of kick is launched. This kick is used as a bang that exponentially increases the amount of blur in the scene.
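The same idea can be sketched outside Max as a small detector: a "bang" fires whenever the energy drops by more than a calibratable threshold, and each accumulated bang grows the blur exponentially. The threshold semantics and the blur mapping are our assumptions, not the internals of the Max kick object.

```python
# Hedged sketch of kick-style jump detection; not the Max object itself.

class KickDetector:
    """Fires a 'bang' when movement energy drops sharply between frames,
    e.g. at the moment the board leaves the ground."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold  # calibrates sensitivity
        self.prev_energy = 0.0

    def update(self, energy):
        """Feed one energy reading; return True when a kick is detected."""
        bang = (self.prev_energy - energy) > self.threshold
        self.prev_energy = energy
        return bang


def blur_amount(base, bang_count):
    """Blur grows exponentially with the number of accumulated bangs."""
    return base * (2 ** bang_count)
```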


Shake is a measure similar to, but not quite the same as, an average over the x, y and z axes. It basically represents how much the board is moving (or better yet, shaking), and is used to control parameters that are fed by constantly fluctuating values, for example the amount of delay we inject into the graphics.
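A shake-like measure can be sketched as a rolling mean of per-sample axis magnitudes, mapped linearly onto a delay time. The window size and the linear mapping are simplifications we chose for illustration, not the Max object's actual formula.

```python
# Hedged sketch of a shake-style measure; the real Max object's formula
# may differ.
from collections import deque


class Shake:
    """How much the board is moving (shaking), smoothed over a window."""

    def __init__(self, window=8):
        self.samples = deque(maxlen=window)

    def update(self, x, y, z):
        """Feed one (x, y, z) reading; return the smoothed shake value."""
        self.samples.append((abs(x) + abs(y) + abs(z)) / 3.0)
        return sum(self.samples) / len(self.samples)


def delay_ms(shake_value, max_delay=200.0):
    """Map a shake value (assumed 0..1) to a graphics delay in ms."""
    return min(max(shake_value, 0.0), 1.0) * max_delay
```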

Likewise, the energy readings, which are also divided per axis, are used to measure the force with which the board moves in certain directions. This is particularly useful for flipping the board and translating it into audiovisual representations. For example, a kickflip (a full flip along the z-axis) will cause a delay in one of the color channels.
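One way to sketch the kickflip case is to integrate the gyro's rotation rate on the flip axis over time and fire a bang once a full turn accumulates. The axis choice, the degrees-per-second units and the slightly-under-360° threshold are our assumptions for illustration, not the project's exact mapping.

```python
# Hedged sketch of flip detection by integrating a gyro rate over time.

class FlipDetector:
    """Accumulates rotation; a full turn along the flip axis counts as a
    kickflip and fires a bang (e.g. to delay one color channel)."""

    def __init__(self, full_turn=330.0):
        self.full_turn = full_turn  # a little slack under 360 degrees
        self.angle = 0.0

    def update(self, gyro_dps, dt):
        """Feed one gyro rate sample (deg/s) and its timestep (s)."""
        self.angle += gyro_dps * dt
        if abs(self.angle) >= self.full_turn:
            self.angle = 0.0  # reset for the next trick
            return True  # bang
        return False
```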


The Manual object is an adaptation of the gyro values, using one of the axes to determine whether the board is on two wheels or not, and whether they are the front or back ones. It works as a bang, and can also be measured over time, which allows it to be used as a trigger or as a more complex influx of data.
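The behavior described above can be sketched as follows: one tilt axis (pitch, assumed here in degrees) tells whether the board is riding on its front or back wheels, and accumulating the time spent tilted gives both a trigger and a continuous value. The threshold and axis naming are our assumptions, not the Max object's internals.

```python
# Hedged sketch of manual detection from a single tilt axis.

class ManualDetector:
    """Reports 'nose', 'tail' or 'flat', plus how long the manual has
    lasted, so it serves as both a trigger and a continuous value."""

    def __init__(self, threshold=10.0):
        self.threshold = threshold  # degrees of tilt before it counts
        self.duration = 0.0

    def update(self, pitch_deg, dt):
        """Feed one tilt reading and its timestep; return (state, seconds)."""
        if pitch_deg > self.threshold:
            state = "nose"
        elif pitch_deg < -self.threshold:
            state = "tail"
        else:
            state = "flat"
        self.duration = self.duration + dt if state != "flat" else 0.0
        return state, self.duration
```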


Juan Felipe Gómez //Creative coder

Interactive Media Designer, frontend developer and creative coder. Music, art and technology fanatic, devoted to anything that manages to unite all three.

Mudcircles //Experimentation project

Independent artistic project that, since 2011, has sought to explore emotions and oscillating instants of our essence, memory and the human condition, as a resource for photography, video & short films, motion graphics, illustration and collage.