Generation of Aesthetic Emotions Guided by Perceptual Features
We present an experimental prototype that studies the use of aesthetics-related features from the visual and auditory domains to express a set of 13 emotions. In the visual domain, features are assigned a probability of occurring according to their perceptual relevance, whereas in the auditory domain emotions are categorized beforehand. The result is a series of digital abstract faces expressing particular emotional states.
Keywords: Aesthetics, Emotions, Abstract Faces, Generative Design, Perceptual Features
Background
It is known that several visual aspects may influence the induction of emotions. For instance, brighter colors have been linked to positive emotions, whereas darker colors have been linked to negative ones [4]. Other associations have been studied by Cavanaugh [1]. Understanding how to evoke a certain emotion through sound [3] and image [4] is crucial for the development of design artefacts based on non-verbal communication. Nevertheless, this issue is still largely unexplored by the design sciences. We argue that expression and communication between these seemingly distinct domains become easier and more comprehensible through the development of a perceptually relevant aesthetic language.
How it works
Each emotion was given a corresponding visual expression, generated in the following way:

Figure 1 (A) The process of building a perceptual alphabet; (B) how it is represented; and (C) how probabilities decide facial elements or specific attributes. P1: the general probability of a certain feature being associated with an emotion; P2: the probability of choosing among two or more features of the same type.
- Visual features were assigned weights (probabilities of occurring) according to the emotion in question. (see Fig. 1)
- The visual alphabet was composed of high-level features (density, texture, complexity), low-level features (shape, size, color), and manipulations (motion, repetition, symmetry).
- Inspired by the modularity and nature of Chernoff faces [2] (see Fig. 2), we generated several digital faces with properties guided by the visual alphabet described above.
Figure 2 Example of Chernoff faces. Image rights belong to [2].
- Music was subject to a prior emotion categorization based on the character of each piece.
- Faces were then generated to express a certain emotion. For example, for Anger the following visual features are considered perceptually relevant: red, black, sharp shape, triangle, big size, upward movement, wavy line, high density, and medium line length. (see Fig. 3; a minimal code sketch of this probabilistic selection follows the figure caption)
Figure 3 Some of the facial expressions generated. (A) Initial face (neutral); (B) Anger; (C) Happiness; (D) Fear; (E) Excitement; (F) Sadness; (G) Calm; (H) Dignity; (I) Expectation.
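This probabilistic selection can be thought of as two-stage sampling: P1 filters which features are expressed at all, and P2 resolves conflicts between features of the same type. Below is a minimal sketch in Python; the weight values and names (ANGER_P1, ANGER_P2, sample_features) are illustrative assumptions loosely based on the Anger example above, not the actual values or code of the prototype.

```python
import random

# Hypothetical P1 weights for "Anger": the chance of each feature being
# expressed at all, loosely following the perceptually relevant features
# listed above (values are illustrative, not the prototype's).
ANGER_P1 = {
    "color:red": 0.9, "color:black": 0.7,
    "shape:sharp": 0.8, "shape:triangle": 0.8,
    "size:big": 0.7, "motion:up": 0.6,
    "line:wavy": 0.5, "density:high": 0.7,
    "line_length:medium": 0.5,
}

# Hypothetical P2 weights: when several features of the same type pass the
# P1 step, these relative weights decide which single one is drawn.
ANGER_P2 = {
    "color": {"color:red": 0.6, "color:black": 0.4},
    "shape": {"shape:sharp": 0.5, "shape:triangle": 0.5},
}

def sample_features(p1, p2):
    """Sample one feature set for a face expressing a given emotion."""
    # Stage 1 (P1): each feature is expressed with its own probability.
    active = [f for f, p in p1.items() if random.random() < p]
    result = []
    for ftype, weights in p2.items():
        candidates = [f for f in active if f in weights]
        if candidates:
            # Stage 2 (P2): keep exactly one feature of this type,
            # chosen according to its relative weight.
            pick = random.choices(
                candidates, weights=[weights[c] for c in candidates])[0]
            result.append(pick)
        active = [f for f in active if f not in weights]
    return result + active  # features without a P2 group pass through

print(sample_features(ANGER_P1, ANGER_P2))
```

Running the sampler several times yields slightly different feature sets for the same emotion, which is how distinct but recognizably "angry" faces can be generated.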
Although significant bibliographic research and experimentation was carried out in this work, it must be subject to continuous updates in the future. We believe that Interactive Evolutionary Computation (IEC) is a promising way to evaluate the relevance of specific features for a specific emotion.
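As a rough illustration of how IEC could be applied here, the sketch below evolves per-emotion weight vectors using human ratings of the rendered faces as fitness. The representation, operators, and the rate_face stand-in are assumptions made for illustration only; they are not part of the published prototype.

```python
import random

def rate_face(weights):
    """Stand-in for the interactive step: in a real IEC setup a person
    would rate how well the rendered face expresses the target emotion."""
    return random.random()

def evolve(population, generations=10, mutation=0.1):
    """Minimal interactive evolutionary loop over feature-weight vectors."""
    for _ in range(generations):
        # Rank candidates by the (human-provided) fitness; keep the best half.
        ranked = sorted(population, key=rate_face, reverse=True)
        parents = ranked[: max(1, len(ranked) // 2)]
        # Produce children by perturbing parent weights, clamped to [0, 1].
        children = [
            [min(1.0, max(0.0, w + random.uniform(-mutation, mutation)))
             for w in parent]
            for parent in parents
        ]
        population = parents + children
    return population

# Example: evolve 8 random weight vectors of 9 features each.
initial = [[random.random() for _ in range(9)] for _ in range(8)]
best = evolve(initial)[0]
```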
References
[1] Cavanaugh, L. A., MacInnis, D. J., & Weiss, A. M. (2016). Perceptual dimensions differentiate emotions. Cognition and Emotion, 30(8), 1430–1445.
[2] Chernoff, H. (1973). The use of faces to represent points in k-dimensional space graphically. Journal of the American Statistical Association, 68(342), 361–368. http://doi.org/10.1080/01621459.1973.10482434
[3] Juslin, P. N. (2013). From everyday emotions to aesthetic emotions: Towards a unified theory of musical emotions. Physics of Life Reviews. http://doi.org/10.1016/j.plrev.2013.05.008
[4] Lindborg, P., & Friberg, A. K. (2015). Colour association with music is mediated by emotion: evidence from an experiment using a CIE Lab interface and interviews. PloS One, 10(12), e0144013.
Publication
A. Rodrigues, A. Cardoso, and P. Machado, “Generation of Aesthetic Emotions guided by Perceptual Features,” in Proceedings of the Ninth International Conference on Computational Creativity, Salamanca, Spain, June 25-29, 2018, 2018, p. 306.
- Bibtex
@inproceedings{rodrigues2018aesthetic,
author = {Ana Rodrigues and Am{\'{\i}}lcar Cardoso and Penousal Machado},
editor = {Fran{\c{c}}ois Pachet and
Anna Jordanous and
Carlos Le{\'{o}}n},
title = {Generation of Aesthetic Emotions guided by Perceptual Features},
booktitle = {Proceedings of the Ninth International Conference on Computational
Creativity, Salamanca, Spain, June 25-29, 2018},
pages = {306},
publisher = {Association for Computational Creativity {(ACC)}},
year = {2018},
url = {http://computationalcreativity.net/iccc2018/sites/default/files/papers/ICCC\_2018\_paper\_71.pdf},
timestamp = {Thu, 12 Mar 2020 11:34:15 +0100},
biburl = {https://dblp.org/rec/conf/icccrea/CardosoRM18.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}