Facing Interaction was the research project that arose from my second internship at Microsoft Research, where I collaborated with Dr. Qin Cai on several prototypes of interactive artworks. In this project we explored the question of how we can communicate sensations and emotional states, in non-verbal ways, to others, to ourselves, to objects, or even to places.
Under the assumption that facial gestures can offer a window into a person's emotions, the prototypes aimed to help us reflect on the poetics of non-verbal communication.
This is a very simple prototype of a musical instrument controlled by the user's face, built to explore how facial gestures can be translated into musical ones.
Using the metaphor of singing, the user triggers sounds (MIDI messages sent to Ableton Live) by opening their mouth.
The rotation of the head selects which note, within the A minor pentatonic scale, is played, thus freeing the hands of the performer.
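The mapping described above can be sketched roughly as follows. This is a minimal illustration, not the original implementation: the yaw range, mouth-open threshold, note list, and all function names are assumptions.

```python
# Hypothetical sketch: map head yaw and mouth opening to MIDI note events.
# All thresholds, ranges, and names are illustrative assumptions.

A_MINOR_PENTATONIC = [57, 60, 62, 64, 67]  # A3, C4, D4, E4, G4 as MIDI note numbers

def select_note(yaw_degrees, scale=A_MINOR_PENTATONIC, yaw_range=(-45.0, 45.0)):
    """Map head yaw (in degrees) to one note of the scale."""
    lo, hi = yaw_range
    # Clamp yaw to the tracked range, then normalise to [0, 1].
    t = (min(max(yaw_degrees, lo), hi) - lo) / (hi - lo)
    index = min(int(t * len(scale)), len(scale) - 1)
    return scale[index]

def note_event(yaw_degrees, mouth_open_ratio, threshold=0.3):
    """Return a note-on tuple when the mouth opens past the threshold."""
    if mouth_open_ratio < threshold:
        return None  # mouth closed: no sound
    return ("note_on", select_note(yaw_degrees), 100)  # (type, note, velocity)
```

In practice the resulting tuple would be sent to Ableton Live as an actual MIDI message through whatever MIDI library the host environment provides.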
Look at me!
This prototype explored the projection of emotional states onto objects by embedding a simple behaviour in a plastic cup.
By tracking the user's head, we can simulate that the object wants to be looked at and reacts negatively to a lack of attention.
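One simple way to express such a behaviour, sketched here under assumed thresholds and names (none of this is the original code), is to treat the object as "attended to" when the tracked head roughly faces it, and let an internal mood variable rise with attention and decay without it:

```python
# Hypothetical sketch of the cup's behaviour: its "mood" rises while it is
# looked at and decays when ignored. Tolerances and rates are assumptions.

def is_looked_at(head_yaw, head_pitch, tolerance=15.0):
    """Treat the head as facing the object when yaw and pitch are near zero."""
    return abs(head_yaw) <= tolerance and abs(head_pitch) <= tolerance

def update_mood(mood, looked_at, gain=0.1, decay=0.05):
    """Raise mood (0..1) while attended; let it fall toward 0 when ignored."""
    if looked_at:
        return min(1.0, mood + gain)
    return max(0.0, mood - decay)
```

The mood value could then drive whatever "negative reaction" the object performs, such as a sound or a movement, once it drops below some level.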
Several other prototypes were created, including a vest with vibrating motors that indicates where the user is being looked at (by vibrating as a function of the estimated gaze) and a virtual choir with eight channels of facially controlled sound.
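For the vest, the core mapping could be as simple as the following sketch, which assumes a hypothetical ring of evenly spaced motors around the torso (the motor count and angle convention are assumptions, not the actual hardware layout):

```python
# Hypothetical sketch for the vest: map the estimated direction from which the
# wearer is being looked at to one of N vibration motors around the torso.

NUM_MOTORS = 8  # assumed motor count, evenly spaced around the vest

def motor_for_angle(angle_degrees, num_motors=NUM_MOTORS):
    """Pick the motor closest to the incoming gaze direction (0 deg = front)."""
    sector = 360.0 / num_motors
    # Shift by half a sector so each motor owns the arc centred on it.
    return int(((angle_degrees % 360.0) + sector / 2) // sector) % num_motors
```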
My piece Walrus 2016, presented at ISEA 2016, also started as one of these prototypes.