Olhos music fest _branding
This project concerns the creation of an interactive brand for Olhos Music Fest — a music festival in Portugal. Using sound analysis, computer vision and image-rendering techniques, the designs react to sound and automatically adapt to different people.
Introduction
Thanks to the increasing power of modern computers, humankind is entering a golden age of artificial intelligence. As a consequence, computer-sensing techniques have been improving rapidly. Like other fields, graphic design is starting to benefit from these advances, for example by evolving towards interactive designs, which may be more eye-catching and therefore more effective at communicating with people.
This project concerns the creation of an interactive graphic design identity for a music festival held in the Portuguese village of Olhos da Fervença — Olhos Music Fest (Eyes Music Fest).
The main concept behind the branding was to convey the energy and feeling of “what it is like to participate in the festival” — an extroverted mood, with people dancing freely.
Figure 1: A visual explanation of the concept behind the logotype of Olhos Music Fest.
Firstly, given the importance of communicating the festival’s name, a logotype was created (see Figure 1). However, branding and graphic identity can be pushed beyond creating a logo.
Therefore, the next task was understanding how to translate the concept (“what it is like to participate in the festival”) into graphics — in other words, how to visualise the experience. We started by defining the elementary units that compose the festival — music, space and people — and used these as starting points to develop the brand. In the same way that Neue Design Studio used information-visualisation techniques to create the Norwegian Meteorological Institute’s logo [1], which reacts to weather statistics, we set out to use information visualisation to create not only graphics but, primarily, animations.
Visualising music:
The most reliable and direct variables one can extract from sound analysis are the amplitude and frequency of the audio signal. Of these two, amplitude may be the easier for humans to perceive, so it was the one we chose. However, because a brand must be unique, and because the goal was to translate the brand’s concept into graphics, the solution should not be a trivial sound visualiser. Instead, a dancing artefact that reacts to the sound’s amplitude was developed — the “dancing free” part of the concept. The artefact mimics a boneless arm (a chain) capable of spinning around itself indefinitely.
Visualising space:
Visualising space may seem redundant unless an overlay is added, offering a new perspective or further information.
In that sense, using a camera and image-rendering techniques, the visual environment was simplified into four ranges of brightness represented by geometric symbols, creating a non-realistic way of visualising the surrounding space. This low-definition perspective may be interpreted as a psychoactive experience, conveying the “extroverted mood” part of the concept. This layer was used as a background pattern.
Visualising people:
Because the festival’s name means “Eyes Music Fest”, the brand could not be complete without making eyes the main graphic motif. However, to visualise people, only their eyes are used (one may still recognise a friend, while he or she remains anonymous to the crowd). The eyes were used as the main unit to build the dancing arms, which move over the space visualisation. Changing the inputs consequently changes the outputs. Thus, by applying these visualisation techniques to the brand’s graphics, it was possible to create an interactive and personalisable brand.
How it is made
The dancing eyes (music and people visualisation):
Development started with Processing’s OpenCV library to detect eyes in images/frames. However, this algorithm is likely to mis-detect other objects as eyes. For that reason, our solution consisted of first detecting faces and then looking for eyes inside the detected regions of interest.
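A minimal sketch of this two-pass strategy is shown below, assuming Greg Borenstein’s “OpenCV for Processing” library; the image file name and the drawing details are illustrative, not taken from the project’s actual code.

```java
import gab.opencv.*;
import java.awt.Rectangle;
import java.util.ArrayList;

PImage photo;
ArrayList<PVector> eyeCenters = new ArrayList<PVector>();

void setup() {
  size(640, 480);
  photo = loadImage("band.jpg");  // placeholder file name
  photo.resize(width, height);

  // Pass 1: detect faces in the whole image.
  OpenCV faceCV = new OpenCV(this, photo);
  faceCV.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  Rectangle[] faces = faceCV.detect();

  // Pass 2: look for eyes only inside each detected face region,
  // which filters out most false positives elsewhere in the image.
  for (Rectangle face : faces) {
    PImage roi = photo.get(face.x, face.y, face.width, face.height);
    OpenCV eyeCV = new OpenCV(this, roi);
    eyeCV.loadCascade(OpenCV.CASCADE_EYE);
    for (Rectangle eye : eyeCV.detect()) {
      // Store each eye centre in full-image coordinates.
      eyeCenters.add(new PVector(face.x + eye.x + eye.width / 2.0,
                                 face.y + eye.y + eye.height / 2.0));
    }
  }
}

void draw() {
  image(photo, 0, 0);
  noFill();
  stroke(255, 0, 0);
  for (PVector c : eyeCenters) ellipse(c.x, c.y, 20, 20);
}
```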
Figure 2: Eye-chain creation on a photo of Dalla Marta — a band playing at the festival.
Once proper eye detection was achieved, a chain (array) of eyes is created for each eye found. Each eye in the chain is placed relative to the one before it (its parent), with a fixed linear distance in between (see Figure 2).
After that, each eye was set to move autonomously and constantly around its respective parent, at a linear speed (a constant angle increment), backwards or forwards with 50% probability. Also, with a set probability, each element may change its moving direction (backwards or forwards) at any moment.
Now that the eye-chains were able to move, they needed to move in rhythm — to “dance”. Thus, the currently detected sound amplitude is added to the spinning velocity (“moving angle”) of each eye. As a result, the tips of the chains move faster than their origins, and each eye moves faster as the song’s amplitude increases (see the Videos below). Thanks to the constant-motion approach described above, the eye-chains never stop “dancing”, even in silence (they only slow down, waiting to become energetic again).
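The sketch below illustrates this behaviour for a single chain, assuming Processing’s Sound library for the amplitude reading; the constants (number of links, link distance, base speed, flip probability) are illustrative guesses rather than the project’s actual values.

```java
import processing.sound.*;

AudioIn mic;
Amplitude amp;

int   nLinks    = 12;     // number of eyes in the chain
float linkLen   = 25;     // distance between each eye and its parent
float baseSpeed = 0.01;   // constant autonomous angle increment
float flipProb  = 0.005;  // per-frame chance of reversing direction

float[] angles = new float[nLinks];
int[]   dirs   = new int[nLinks];

void setup() {
  size(640, 480);
  for (int i = 0; i < nLinks; i++) {
    angles[i] = random(TWO_PI);
    dirs[i]   = random(1) < 0.5 ? 1 : -1;  // 50% backwards or forwards
  }
  mic = new AudioIn(this, 0);
  mic.start();
  amp = new Amplitude(this);
  amp.input(mic);
}

void draw() {
  background(0);
  float level = amp.analyze();  // current sound amplitude, roughly 0..1

  float x = width / 2.0, y = height / 2.0;  // chain origin (a detected eye)
  for (int i = 0; i < nLinks; i++) {
    if (random(1) < flipProb) dirs[i] = -dirs[i];
    // Constant motion plus an amplitude boost: the louder the music,
    // the faster the spin; in silence the chain only slows down.
    angles[i] += dirs[i] * (baseSpeed + level * 0.3);

    float nx = x + cos(angles[i]) * linkLen;
    float ny = y + sin(angles[i]) * linkLen;
    noStroke();
    fill(255);
    ellipse(nx, ny, 16, 16);  // each link stands in for an "eye"
    x = nx;
    y = ny;  // this eye becomes the parent of the next one
  }
}
```

Because each link spins around a parent that is itself moving, the angular increments accumulate along the chain, which is why the tips end up moving faster than the origins.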
The environment patterns (space visualisation):
The goal of the space visualisation was to convey some information about the objects in front of the camera (for instance, identifying darker and lighter zones, and even some silhouettes). Yet these graphics should be abstract enough to allow open interpretations, so the viewer may drift away from the real, objective world. In that sense, the image filter was built from geometric symbols.
Figure 3: Low-resolution patterns created using the pixel brightnesses in the picture of a person. These are typically used as background patterns.
Figure 4: Eye-chain over its background pattern.
Firstly, the brightness of each pixel in an equally spaced grid was sampled. Then the brightness spectrum was divided into four ranges, each associated with a symbol (or the absence of one) whose density reflects the respective brightness level (see Figure 3). The images generated by this module may be used as backgrounds for the eye-chains generated before (see Figure 4).
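A possible version of this filter is sketched below; the grid step and the particular symbol drawn for each brightness range are assumptions for illustration, as is the input file name.

```java
PImage src;

void setup() {
  size(640, 480);
  src = loadImage("scene.jpg");  // placeholder file name
  src.resize(width, height);
  noLoop();
}

void draw() {
  background(255);
  int step = 16;  // equally spaced sampling grid
  stroke(0);
  fill(0);
  for (int y = 0; y < src.height; y += step) {
    for (int x = 0; x < src.width; x += step) {
      float b = brightness(src.get(x, y));  // 0..255
      // Four ranges of the brightness spectrum, from the darkest
      // (densest symbol) to the lightest (no symbol at all).
      if (b < 64) {
        rect(x, y, step - 2, step - 2);                       // filled square
      } else if (b < 128) {
        ellipse(x + step/2, y + step/2, step - 4, step - 4);  // circle
      } else if (b < 192) {
        line(x, y + step/2, x + step, y + step/2);            // thin line
      }
      // b >= 192: leave the cell empty
    }
  }
}
```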
Because the system can work in real time, it has great potential to be used live in VJ performances, interactive posters or other kinds of physical installation at the festival venue.
Applications
The generated artefacts may be exported and then used to compose static or animated designs such as videos, posters or merchandising (post-produced). They may also be used in real-time applications such as interactive posters, VJ performances or photo spots at the festival venue.
Regarding post-produced designs, two main artefacts were developed — an animated one and a non-animated one.
The non-animated artefact was a printed poster showing the artists playing at the event, alongside their respective day and time. The festival’s logotype was placed at the top and complemented with the generated graphics (see Figures 5, 6 and 7).
Figure 5: Mockup of the festival’s poster.
Figure 6: Closeup of the generated images on the printed festival poster.
Figure 7: Closeup of typographic details on the printed festival poster.
The animated element was a Facebook banner in which, as in the poster, the artists were showcased. However, in this digital medium it was possible to use the full potential of the sound, space and people visualisations. The eyes of the band members were picked and made to dance to the rhythm of a song (by Whales — one of the bands). Because the banner used static images (photos of the artists), the background was generated from a video of someone dancing (see the Video “Olhos music fest — Facebook banner” at the top of this post).
Inspired by works such as “Oto Nové Swiss Poster” [2], “Camera Postura” [3] and “EV NT interactive poster wall concept” [4], the proposed real-time application was an interactive poster. The Video below shows a digital version of the poster in which the static image was replaced by the output of the real-time algorithm, using a webcam as input. This way, the poster may customise itself to any person or group of people looking at it (see the Video “Olhos music fest poster” at the top of this post).
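The real-time version essentially swaps the static photo for a live camera feed. A skeleton of that capture-detect-draw loop might look like the sketch below, assuming Processing’s Video library together with the OpenCV library used earlier; the branding-specific drawing steps are summarised as comments rather than reproduced in full.

```java
import processing.video.*;
import gab.opencv.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  opencv = new OpenCV(this, width, height);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
}

void draw() {
  if (cam.available()) cam.read();
  opencv.loadImage(cam);

  // 1) Space: redraw the four-range symbol pattern from the live frame.
  // 2) People: detect faces (and then eyes) in the current frame.
  Rectangle[] faces = opencv.detect();
  // 3) Music: add the current amplitude to each chain's spin and draw
  //    an eye-chain starting at every detected eye.

  // Placeholder output: the raw frame with detected faces outlined.
  image(cam, 0, 0);
  noFill();
  stroke(0, 255, 0);
  for (Rectangle f : faces) rect(f.x, f.y, f.width, f.height);
}
```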
As mentioned before, the algorithm may be used to create the most varied applications, such as VJ performances, by using a camera on stage pointed at the musicians or the audience (see the Video below).
References
[1] Armin, “Where the Cold Wind Blows”, 29 September 2010. [Online]. Available: www.underconsideration.com/brandnew/archives/where_the_cold_wind_blows.php
[2] Studio Feixen and J. Giger, “Oto Nové Swiss Poster”, 9 March 2017. [Online]. Available: www.otonoveswiss.co.uk. [Accessed 15/03/2018].
[3] LUST, “Camera Postura”, 2014. [Online]. Available: https://lust.nl/#projects-5939. [Accessed 15/03/2018].
[4] P. Eskild, S. Barth and A. Wassum, “EV NT interactive poster wall concept”, 2012. [Online]. Available: https://vimeo.com/43598671
Publications
- D. Lopes, P. Martins, and P. Machado, “Olhos Music Fest _Branding,” in 2018 22nd International Conference Information Visualisation (IV), 2018, pp. 510-511.
@inproceedings{lopes2018olhos,
author = {Daniel Lopes and Pedro Martins and Penousal Machado},
booktitle = {2018 22nd International Conference Information Visualisation (IV)},
title = {Olhos Music Fest _Branding},
year = {2018},
pages = {510-511},
doi = {10.1109/iV.2018.00094},
ISSN = {1550-6037},
month = {July}}
External links
Olhos music fest on ResearchGate
Olhos music fest on Behance
Date
01/07/2018