Facing Interaction

Facing Interaction was the research project that arose during my Fellowship at Microsoft Research, where I collaborated with Dr. Qin Cai on the creation of several prototypes of interactive artworks. In this project we explored how we can communicate sensations and emotional states, in non-verbal ways, to others, to ourselves, to objects, or even to places.

Under the assumption that facial gestures can constitute a window into somebody's emotions, the prototypes aimed to help us reflect on the poetics of non-verbal communication.

Facial Pentatonic

A very simple prototype of a musical instrument controlled by the user's face, built to explore how facial gestures can be translated into musical ones.

Using the metaphor of singing, the user triggers sounds (MIDI messages sent to Ableton Live) by opening his or her mouth.

The rotation of the head selects which note (within the A minor pentatonic scale) is played, freeing the performer's hands.
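
The mapping is simple enough to sketch. The snippet below is a minimal illustration rather than the original code: it assumes a face tracker that reports head yaw (in degrees) and mouth openness (0 to 1) once per frame, and uses the mido library as a stand-in for the MIDI link to Ableton Live; names such as on_face_update are hypothetical.

```python
# Minimal sketch of the mouth-to-MIDI mapping, assuming a face tracker that
# reports head yaw (degrees) and mouth openness (0..1) each frame. The actual
# prototype sent MIDI to Ableton Live; here `mido` stands in for that link.
import mido

# One octave of the A minor pentatonic scale (A, C, D, E, G) as MIDI notes.
PENTATONIC = [57, 60, 62, 64, 67]

port = mido.open_output()          # default MIDI output port
current_note = None                # note currently sounding, if any

def note_from_yaw(yaw_deg, span=45.0):
    """Map head yaw in [-span, +span] degrees to a note of the scale."""
    t = max(0.0, min(1.0, (yaw_deg + span) / (2 * span)))
    return PENTATONIC[min(int(t * len(PENTATONIC)), len(PENTATONIC) - 1)]

def on_face_update(yaw_deg, mouth_openness, threshold=0.3):
    """Call once per tracker frame: open mouth -> note on, close -> note off."""
    global current_note
    if mouth_openness > threshold and current_note is None:
        current_note = note_from_yaw(yaw_deg)
        port.send(mido.Message('note_on', note=current_note, velocity=100))
    elif mouth_openness <= threshold and current_note is not None:
        port.send(mido.Message('note_off', note=current_note))
        current_note = None
```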


Look at me!

This prototype explored the projection of emotional states onto objects by embedding a simple behaviour in a plastic cup.

By tracking the user's head, we simulate that the object wants to be looked at and reacts negatively to a lack of attention.
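
One way to express this behaviour is as a small attention model that fills up while the cup is being watched and drains while it is ignored. The sketch below is illustrative only: it assumes the head tracker provides the angle between the user's facing direction and the cup, and the class name, thresholds, and reactions are invented for the example.

```python
# Minimal sketch of the cup's "wants attention" behaviour, assuming the head
# tracker gives the angle (degrees) between the user's facing direction and
# the object. Names and thresholds are illustrative, not the original code.
class NeedyObject:
    def __init__(self, looked_at_angle=15.0, decay=0.5, recovery=1.5):
        self.attention = 1.0              # 1.0 = fully content, 0.0 = ignored
        self.looked_at_angle = looked_at_angle
        self.decay = decay                # attention lost per second when ignored
        self.recovery = recovery          # attention regained per second when watched

    def update(self, angle_to_object_deg, dt):
        """Call once per frame with the current angle and elapsed time."""
        if angle_to_object_deg < self.looked_at_angle:
            self.attention = min(1.0, self.attention + self.recovery * dt)
        else:
            self.attention = max(0.0, self.attention - self.decay * dt)
        return self.reaction()

    def reaction(self):
        if self.attention > 0.6:
            return "content"              # e.g. sit still
        elif self.attention > 0.2:
            return "restless"             # e.g. small movements or sounds
        return "upset"                    # e.g. protest until looked at again
```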


Several other prototypes were created, including a vest with vibrating motors that indicates where the wearer is being looked at (by vibrating as a function of the estimated gaze) and a virtual choir with eight channels of facially controlled sound.
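
For the vest, the core of the idea is a mapping from the estimated gaze point on the wearer's body to the intensity of each motor. The sketch below shows one plausible version under assumed conditions: motor positions and the gaze point share a normalized torso coordinate system, and the layout and falloff value are made up for illustration.

```python
# Minimal sketch of mapping an estimated gaze point on the wearer's body to
# per-motor vibration intensities, assuming motor positions and the gaze
# point are given in the same normalized 2D torso coordinates.
import math

# Hypothetical motor layout on the vest: (x, y) in normalized torso coordinates.
MOTORS = [(-0.5, 0.5), (0.5, 0.5), (-0.5, -0.5), (0.5, -0.5), (0.0, 0.0)]

def motor_intensities(gaze_point, falloff=0.6):
    """Return one intensity in [0, 1] per motor, strongest nearest the gaze."""
    gx, gy = gaze_point
    intensities = []
    for mx, my in MOTORS:
        d = math.hypot(gx - mx, gy - my)
        intensities.append(max(0.0, 1.0 - d / falloff))
    return intensities
```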

My piece Walrus 2016, presented at ISEA 2016, also started as one of the prototypes.  

Some of the prototypes were presented at Microsoft TechFest 2013 and were also discussed in the paper Facing Interaction, presented at ISEA 2014 (proceedings).

A gaze-controlled vibrating vest. The vest would vibrate to indicate where it is being looked at.

These projects are described in this talk I gave while at Microsoft Research: https://www.microsoft.com/en-us/research/video/facing-interaction/

The virtual choir.


Gaze-controlled vibrating elbows.